
The Nature of Light

Page 8:  Sensors

Affordable sensors for recording the intensity of specific light wavelengths could be manufactured at industrial scale, but are not yet being produced commercially. Meanwhile, do-it-yourself (DIY) options are available to hobbyists, which may provide encouragement and ideas for the eventual development of affordable commercial products.


Reverse LED

Light-emitting diodes (LEDs) can receive light as well as emit it, although not as efficiently as they emit it. Affordable sensors that use LEDs to receive light are covered in this book:


David R. Brooks
Bringing the Sun Down to Earth:
Designing Inexpensive Instruments for Monitoring the Atmosphere

Springer 2008

Appendix 7 covers building a transimpedance amplifier to boost the weak signal from a standard LED used in receiving mode instead of emitting mode.
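
As a rough guide (a simplified relation, not reproduced from the book), the output voltage of an ideal transimpedance amplifier is just the photocurrent scaled by the feedback resistor:

  \[ V_{\text{out}} = I_{\text{photo}} \, R_f \]

Here \(I_{\text{photo}}\) is the small photocurrent from the LED (typically nanoamps) and \(R_f\) is the feedback resistor, so feedback resistances in the megaohm-to-gigaohm range are needed to produce signals of useful amplitude; the sign of the output depends on the amplifier configuration.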

Response curves for LEDs as receivers tend to be shifted toward shorter wavelengths compared to their emission curves. For example, a blue LED may detect some UV light.

One problem requiring further research is that the response curves of some LEDs, when used to receive light, can drift with temperature and therefore be unstable.

Devices that use LEDs to receive light take some skill to operate, but then so does more expensive equipment. And some of these devices may occasionally need more expensive equipment for calibration.


Consumer Cameras

Consumer cameras provide another affordable platform to start with. Of interest are cameras that can generate images in a RAW file format (not JPEG or TIFF) for processing on a personal computer:

  RAW file format

The RAW file format allows the in-camera processing associated with the color filter array (CFA) to be bypassed, which in turn allows the CFA to be physically removed from the camera to increase the camera's sensitivity and dynamic range (more photons collected, over a wider range of wavelengths).

We first explain why this is possible, then review some of the work being done in this area.


Bayer Filter

The most widely used color filter array (CFA) is the Bayer filter, named after Bryce Bayer who invented the filter while working at Eastman Kodak in the 1970s.

Bayer filters make digital color photography possible at the consumer level, as we will now review and as is explained in the following document from the Army Research Lab (ARL):

  ARL-TR-5061 (PDF)

While it is possible to develop image sensors that have a separate type of sensor for each of the three primary colors, like the cones in the human eye, it is cheaper to have one type of sensor that accepts light of any color (referred to as broadband) and to filter the light before it reaches the sensors.

And it is also cheaper for all the sensors to simply be a single semiconductor (one sensor divided up into a grid of sensor areas), which is possible because the sensors are all the same. In this technology, what differentiates the spectral response is the filter in front of the sensors instead of the sensors themselves.

The sensor areas (pixels) all have the same spectral response (they can receive all colors of visible light, and even wavelengths beyond the visible range). It is the filter in front of the sensors that determines which colors each pixel sees.

Such a filter is called a color filter array (CFA), and is a checkerboard of filter areas, each square of the checkerboard pattern acting like a separate filter aligned with a pixel on the sensor and determining which wavelengths that pixel will detect.


Figure 8.1:  Bayer filter.

The Bayer filter has more green-accepting squares than the other primary colors, oversampling green as the human eye does. Each pixel records only one of the primary colors, so that spatial resolution is degraded (all three primary color values are not recorded for each pixel).
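
As a small illustration (assuming the common RGGB tile order; the actual tile order varies by camera), the checkerboard layout of a Bayer CFA can be sketched in Python:

  import numpy as np

  # Build a Bayer (RGGB) color filter array layout for an H x W sensor.
  # Each entry names the primary color passed to the pixel beneath it.
  def bayer_mask(height, width):
      mask = np.empty((height, width), dtype="<U1")
      mask[0::2, 0::2] = "R"   # even rows, even columns
      mask[0::2, 1::2] = "G"   # even rows, odd columns
      mask[1::2, 0::2] = "G"   # odd rows, even columns
      mask[1::2, 1::2] = "B"   # odd rows, odd columns
      return mask

  print(bayer_mask(4, 4))
  # Half of the squares are green, one quarter red, one quarter blue.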

The resulting image is a mosaic of primary color values, with only one primary color value recorded per pixel (one-third of the full color information). A process called demosaicing averages together neighboring pixels to estimate all three primary color values for each pixel at the original (full) resolution.

By default, demosaicing is done in the camera to produce a JPEG or TIFF file. However, on some cameras, by selecting to save images in the RAW file format, the actual sensor values are stored as the originally sensed mosaic of a single primary color value per pixel.

Those RAW images, which look blocky before demosaicing because each pixel holds only one primary color value, can then be processed with demosaicing software on a personal computer. That way, different demosaicing algorithms can be tried to see which one works best for a particular image, and algorithms too computationally demanding to run inside a camera can be used on a personal computer.
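
As an illustration of demosaicing on a personal computer, here is a minimal bilinear sketch assuming an RGGB tile order (not an algorithm recommended by the sources here; real demosaicing software uses more sophisticated, edge-aware methods):

  import numpy as np
  from scipy.ndimage import convolve

  # Bilinear demosaic of an RGGB Bayer mosaic.
  # `mosaic` is the 2-D array of raw sensor values; the result is an
  # (H, W, 3) array of estimated R, G, B values per pixel.
  def demosaic_bilinear(mosaic):
      h, w = mosaic.shape
      rows, cols = np.mgrid[0:h, 0:w]
      r_mask = (rows % 2 == 0) & (cols % 2 == 0)
      b_mask = (rows % 2 == 1) & (cols % 2 == 1)
      g_mask = ~(r_mask | b_mask)

      kernel_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
      kernel_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

      out = np.zeros((h, w, 3))
      for ch, (mask, kernel) in enumerate([(r_mask, kernel_rb),
                                           (g_mask, kernel_g),
                                           (b_mask, kernel_rb)]):
          plane = np.where(mask, mosaic.astype(float), 0.0)
          # Averaging same-color neighbors fills in the missing values.
          out[:, :, ch] = convolve(plane, kernel, mode="mirror")
      return out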


Monochrome Photography

A popular way to increase sensitivity and resolution of a consumer digital camera is to physically remove the Bayer filter from above the CMOS image sensor:

  How to Remove Bayer Filter

  Debayering Message Thread

That allows light of any color to reach any pixel, greatly increasing sensitivity: with a Bayer filter in place, each pixel has two of the three primary colors blocked, so a substantial amount of light never reaches the sensor. With the Bayer filter removed, no light is blocked.

Even more light gets through without the Bayer filter, because CMOS image sensors have a response curve that extends slightly beyond visible light, enough to sense some ultraviolet (UV), which a Bayer filter blocks.

Again, we emphasize that this only works with cameras that can save images in the RAW file format instead of JPEG or TIFF. The debayered image does not need to be demosaiced, and cameras that only support JPEG or TIFF will demosaic it anyway, corrupting the image; saving as RAW avoids that step.

The resulting image is monochrome, since any wavelength of light is accepted (CMOS sensors are broadband). DIY astronomers use this type of photography with astronomy filters attached to the camera lens, which provides control over which wavelengths are photographed.
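
One way to get at the undemosaiced sensor values on a personal computer is the rawpy library (Python bindings to LibRaw); the sketch below uses a placeholder filename, and details of the workflow depend on the camera:

  import numpy as np
  import rawpy   # pip install rawpy

  # Read the raw sensor values without demosaicing. On a debayered
  # (monochrome) sensor every pixel is broadband, so the raw array can be
  # treated directly as a grayscale image.
  with rawpy.imread("IMG_0001.CR2") as raw:          # placeholder filename
      mono = raw.raw_image_visible.astype(np.float32)

  print(mono.shape, mono.min(), mono.max())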

For interesting UV photographs of plants, and a UV photograph of human skin with and without suntan lotion, scroll through all of page 94 of the debayering message thread:

  Page 94 of Debayering Message Thread


Microlens Array

Physically removing the Bayer filter from a CMOS image sensor is difficult because the filter is lithographed onto the sensor during manufacturing for the best possible alignment.

And the filter likely also includes a microlens array that will come off with the Bayer filter, reducing imaging sensitivity, though not by as much as is gained by removing the Bayer filter.

Removing the Bayer filter requires grinding, because in most cases the Bayer filter will not simply peel off. Since the microlens array is lithographed onto the Bayer filter, it too will be ground off along with the Bayer filter (or will peel off with it, if peeling is possible).

The microlens array is a grid of bubble shapes that direct light into the photon-capturing wells of the image sensor. In particular, photons that would otherwise bounce off the top of the wall between adjacent wells are redirected into a well by the microlenses instead of being reflected away.

Removal of the CFA (Bayer filter) in these systems therefore also removes the microlenses (since the microlenses are lithographed onto the CFA). Photons that then fall between the capturing wells of the sensor are reflected back out of the sensor instead of being diverted into a well, causing some loss of photons.

Removal of the CFA allows many more photons to reach the sensor than are subsequently reflected back by the tops of the walls between photon-capturing wells, so there is a net gain in sensitivity. Nevertheless, it would be even better if there were a way to keep the microlens array.


Image Correction

Monochrome photography with a debayered camera may require image correction.

First, images usually need to be cropped, because grinding to remove the CFA is usually not performed near the edge of the CMOS image sensor, leaving that part of the sensor covered with the Bayer filter. For many applications, cropping is a small price to pay for the benefits of debayering.

Optionally, depending on how much fidelity is required by the application, additional correction may need to be made for uneven grinding across the image sensor when the Bayer filter was removed.

“The camera used in the current setup is a modified Canon EOS 5D Mark III, where the Bayer filter, the infrared blocking filter, and the antialiasing filter have been removed. The camera exhibited some nonuniformity due to incomplete removal of the Bayer filter, but this is taken into account by a nonuniformity correction of the FPA.”
— Renhorn, et al.
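
A minimal sketch of such a nonuniformity correction in Python, assuming a flat-field frame of a uniformly illuminated target is available (the crop margin is an arbitrary illustrative value, not taken from the sources):

  import numpy as np

  # Crop away the un-ground border that still carries the Bayer filter,
  # then apply a per-pixel flat-field (nonuniformity) correction.
  def correct(image, flat, border=200):
      image = image[border:-border, border:-border].astype(np.float32)
      flat = flat[border:-border, border:-border].astype(np.float32)
      gain = flat.mean() / np.maximum(flat, 1e-6)   # per-pixel gain
      return image * gain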


Hyperspectral Imaging

A hyperspectral image is an image with many color channels, which allows each channel to be narrow. Hyperspectral imaging is possible by taking multiple photographs with different filters.
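
For example, a set of monochrome frames taken through different filters can be stacked into a hyperspectral cube; a minimal sketch in Python (the filter wavelengths shown are illustrative, not from the text):

  import numpy as np

  # Stack per-filter monochrome frames into an (H, W, bands) cube,
  # ordering the bands by filter center wavelength.
  def stack_cube(frames, centers_nm):
      cube = np.stack(frames, axis=-1)
      order = np.argsort(centers_nm)
      return cube[:, :, order], np.asarray(centers_nm)[order]

  frames = [np.random.rand(100, 100) for _ in range(5)]   # synthetic frames
  cube, wavelengths = stack_cube(frames, [400, 550, 500, 450, 600])
  print(cube.shape, wavelengths)   # (100, 100, 5) [400 450 500 550 600]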

Hyperspectral imaging is also possible by taking multiple photographs with a linear variable filter (LVF). An LVF is a wedge that accurately changes which wavelengths are transmitted by tilting the LVF relative to the light rays.
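
As a general characterization of LVFs (not a claim taken from the sources cited here), the passband center varies approximately linearly with position along the filter, which is what lets a scan across the filter sample different wavelengths:

  \[ \lambda_c(x) \approx \lambda_0 + \frac{d\lambda}{dx}\, x \]

where \(x\) is the position along the filter, \(\lambda_0\) is the passband center at one end, and \(d\lambda/dx\) is the spectral gradient specified by the manufacturer.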

An LVF may be tilted relative to a fixed camera between photographs, or an LVF may be affixed to the camera and the camera tilted relative to the scene light rays between photographs. Tilting the camera relative to the light rays could mean rotating the camera on a fixed tripod, translating it on a track, etc.

“For each frame during scanning, the scene will be slightly shifted with a new set of spectral samples associated with each pixel position.”
— Renhorn, et al.

Correlating points across images may be performed using algorithms already in use for satellite image processing.

An image sensor is also referred to as a focal plane array (FPA), and high-quality LVFs are available that mount directly on an FPA. Lower-cost LVFs may also be suitable, depending on the accuracy required.

