Single Image Revolutionizes Camera Quantum Efficiency Determination

A paper published in the Journal of Imaging has proposed a novel approach for accurate camera quantum efficiency (QE) determination from a single image.

Difference between ground truth (GT) and recovered (Rec) quantum efficiency curves for 10 nm Omega Optical filters. Image Credit: https://www.mdpi.com/2313-433X/10/7/169

QEC Estimation Challenges

Determining a camera's spectral sensitivity, or quantum efficiency (QE), is essential for addressing various image acquisition issues. These include color correction for comparing images taken with different cameras and under varying lighting conditions, as well as accurately reconstructing an object's true color when it is captured through an absorbing medium such as water.

Directly measuring the QE curve (QEC) is tedious and complex, requiring specialized equipment, and many camera manufacturers do not make spectral characteristics publicly available. This has driven the development of indirect QEC estimation methods, but these techniques often prove unreliable because they are highly sensitive to noise in the input data.

As a result, obtaining a reliable and efficient sensor QEC estimation remains a significant challenge for small companies and individual photographers who lack access to expensive equipment. This underscores the need for a method that allows for accurate and rapid QEC determination.

The Proposed Approach

In this study, the researcher introduced a novel technique and device for determining the QEC of photo or video cameras using only a single image. The primary aim was to address the instability issues often encountered in QEC determination and to present a method that ensures stable QEC reconstruction even in the presence of noise.

The proposed device features a set of ultra-narrow band-pass interference filters designed to achieve reliable and noise-tolerant QEC recovery. The filters have minimal spectral overlap, which contributes to accurate QEC estimation. The number of wavelengths at which QECs are recovered depends on the number of filters used.
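
To see why minimal spectral overlap matters, consider a simplified response model (an illustrative sketch, not the paper's exact formulation). The signal recorded by color channel c behind filter i is an integral of that channel's QEC against the light the filter transmits:

    R_c,i = ∫ QE_c(λ) T_i(λ) L_i(λ) dλ ≈ QE_c(λ_i) ∫ T_i(λ) L_i(λ) dλ

where T_i is the filter transmittance, L_i the illuminant spectrum, and λ_i the filter's peak wavelength. When each band is only a few nanometers wide, the QEC is effectively constant across it, so QE_c(λ_i) follows from a single division by a per-filter constant rather than from inverting an ill-conditioned system of strongly overlapping spectra.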

For optimal accuracy, filters with a full width at half maximum (FWHM) of 3 nm were employed. These filters, with non-overlapping bands, were illuminated by a broadband light source diffused through a diffusion plate for spatial homogenization, and the light blob transmitted by each filter was captured in a photograph.

The typical filter sizes were 1 inch and 1/2 inch. Since a single light source could not evenly illuminate all forty filters, arranged in an 8 by 5 array measuring approximately 16 by 10 cm, an array of identical light-emitting diodes (LEDs) was used instead, with each LED illuminating its corresponding interference filter.

To minimize ambient light interference, the image was captured in a dark room. The image depicted a cell array of interference filters covering the entire visible spectrum. The average intensity around the brightest pixel in each cell was calculated, with the radius for averaging consistent across all cells and determined by the camera's resolution. Reflectivity vectors were derived from each cell’s peak wavelength and converted into RGB triplets.
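
A minimal sketch of this per-cell measurement step is shown below, assuming the filter cells have already been located in the photograph; the function name, the default averaging radius, and the bounding-box convention are illustrative rather than taken from the paper:

    import numpy as np

    def cell_rgb_triplet(image, cell_box, radius=5):
        """Average the RGB values in a disc around the brightest pixel of one filter cell.

        image    : H x W x 3 array of linear RGB values
        cell_box : (row0, row1, col0, col1) bounding box of the cell within the image
        radius   : averaging radius in pixels (in the paper it is tied to the camera's resolution)
        """
        r0, r1, c0, c1 = cell_box
        cell = image[r0:r1, c0:c1, :].astype(float)

        # Locate the brightest pixel using the summed channel intensity.
        brightness = cell.sum(axis=2)
        peak_row, peak_col = np.unravel_index(np.argmax(brightness), brightness.shape)

        # Build a circular mask of the chosen radius around that pixel.
        rows, cols = np.ogrid[:cell.shape[0], :cell.shape[1]]
        mask = (rows - peak_row) ** 2 + (cols - peak_col) ** 2 <= radius ** 2

        # Return one averaged RGB triplet for the cell.
        return cell[mask].mean(axis=0)

    # Usage (hypothetical): one triplet per filter cell of the 8 by 5 array.
    # triplets = [cell_rgb_triplet(img, box) for box in cell_boxes]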

For implementation, Thorlabs, Andover Corporation, Omega Optical, and Spectrogon were identified as sources of comprehensive narrow band-pass interference filter sets. Simulations demonstrated that QECs recovered using 10 nm FWHM Omega Optical filters had a substantial standard deviation of error of approximately 5 % (σ = 0.051192), consistent with the large condition number of 1237.71 for that filter set.

In contrast, 3 nm FWHM filters from Andover Corporation, which have nearly non-overlapping spectra, resulted in a standard deviation of error of σ = 0.00204. This indicates minimal difference between recovered and true QECs. Notably, the simulations showed that even with noisy RGB triplet measurements, the recovered QECs had lower noise. For example, the standard deviation of error for QECs did not exceed 5 % when the RGB triplet error was 15 %.
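
The link between the condition number and these error figures can be illustrated with a small simulation. This is a sketch under simplified assumptions, using Gaussian filter shapes, a synthetic single-channel QEC, and an arbitrary 15 % noise level; it does not reproduce the paper's filter data or exact numbers, only the trend that recovery error grows with the condition number of the filter-overlap matrix:

    import numpy as np

    rng = np.random.default_rng(0)
    wavelengths = np.arange(400.0, 701.0, 10.0)   # recovery grid in nm (31 points)
    centers = wavelengths.copy()                  # one filter centered on each grid point

    def filter_matrix(fwhm):
        """Rows are Gaussian band-pass transmittance curves sampled on the grid."""
        sigma = fwhm / 2.355                      # convert FWHM to Gaussian sigma
        return np.exp(-0.5 * ((wavelengths - centers[:, None]) / sigma) ** 2)

    # A synthetic single-channel QEC, used only for this demonstration.
    qe_true = np.exp(-0.5 * ((wavelengths - 550.0) / 60.0) ** 2)

    for fwhm in (3.0, 10.0, 30.0):
        A = filter_matrix(fwhm)
        clean = A @ qe_true                                              # noiseless responses
        noisy = clean * (1.0 + 0.15 * rng.standard_normal(clean.shape))  # 15 % multiplicative noise
        qe_rec = np.linalg.solve(A, noisy)                               # recover QEC samples
        print(f"FWHM {fwhm:4.1f} nm  cond(A) = {np.linalg.cond(A):10.1f}  "
              f"error std = {np.std(qe_rec - qe_true):.4f}")

The wider the filters, the larger the condition number and the more the same measurement noise is amplified in the recovered curve, which is the behavior the paper avoids by using ultra-narrow, non-overlapping filters.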

Because acquiring a full set of filters is expensive, the proposed method was experimentally validated with a single Thorlabs interference filter with peak transmittance at 532 nm and an FWHM of 3 nm.

Study Significance

The numerical results revealed that the amplification of input data noise was the primary cause of inaccuracies in reconstructing QECs from colored chip images. Noise reduction was effectively achieved by using a non-overlapping set of input data signals from ultra-narrow band filters.

The proposed approach allows for faster QEC estimation than existing methods, requiring only a single photograph. This technique can estimate QECs in just a few seconds from a single image, whereas the traditional "Gold Standard" technique requires 20–30 minutes to measure 36 points on the QECs.

In summary, this study demonstrated that accurate camera QEC determination can be achieved from a single image using ultra-narrow band interference filters, even in the presence of input data noise. The proposed device offers a valuable tool for camera and imaging sensor manufacturers, as well as individual photographers, enabling rapid colorimetric calibration.

Journal Reference

Rzhanov, Y. (2024). Accurate Determination of Camera Quantum Efficiency from a Single Image. Journal of Imaging, 10(7), 169. DOI: 10.3390/jimaging10070169. https://www.mdpi.com/2313-433X/10/7/169



Written by

Samudrapom Dam

Samudrapom Dam is a freelance scientific and business writer based in Kolkata, India. He has been writing articles related to business and scientific topics for more than one and a half years. He has extensive experience in writing about advanced technologies, information technology, machinery, metals and metal products, clean technologies, finance and banking, automotive, household products, and the aerospace industry. He is passionate about the latest developments in advanced technologies, the ways these developments can be implemented in a real-world situation, and how these developments can positively impact common people.

