Talk about taking a dim view of things. Researchers have obtained ultrasharp images of weakly illuminated objects using a bare minimum of photons, mathematically stitching together information from the single particles of light recorded at each pixel of a solid-state detector.
The achievement is likely to support studies of fragile biological materials, such as the human eye, that could be damaged or destroyed by higher levels of illumination. The development could also have applications for military surveillance, such as in a spy camera that records a scene with a minimum of illumination to elude detection.
To create detailed images using single photons, electrical engineer Ahmed Kirmani of the Massachusetts Institute of Technology in Cambridge and his colleagues developed an algorithm that takes into account correlations between neighboring parts of an illuminated object as well as the physics of low-light measurements. The researchers describe their work online today in Science.
“The amount of information they’ve been able to extract is quite incredible,” comments experimental physicist John Howell of the University of Rochester in New York, who was not part of the study.
“We didn’t invent a new laser or a new detector,” notes Kirmani. Instead, he explains, the team applied a new imaging algorithm that can be used with a standard, off-the-shelf photon detector.
Light from dark
In the team’s setup, low-intensity pulses of visible laser light scan an object of interest. The laser fires a pulse at a given location until a single reflected photon is recorded by a detector; each illuminated location corresponds to a pixel in the final image.
Variations in the time it takes for photons from the laser pulses to be reflected back from the object provide depth information, a standard way of revealing three-dimensional structure. However, the algorithm developed by Kirmani and his colleagues extracts that information using one-hundredth the number of photons required by existing light detection and ranging (LIDAR) techniques, which are commonly used for remote mapping or measuring forest biomass, for instance.
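The round-trip timing behind this kind of depth measurement is simple to sketch: light travels to the object and back, so depth is half the travel time multiplied by the speed of light. The snippet below is only an illustration of that principle, not the team's algorithm, and the 2-nanosecond timing value is invented for the example:

```python
# Depth from photon time of flight: the pulse travels to the object
# and back, so depth = c * t / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Convert a photon's round-trip travel time into depth in metres."""
    return C * t_seconds / 2.0

# A hypothetical round-trip time of 2 ns puts the surface about 30 cm away.
print(round(depth_from_round_trip(2e-9), 3))  # prints 0.3
```

Repeating this measurement across every scanned location yields the depth map that LIDAR systems conventionally build from many photons per pixel.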
“The paper illustrates some remarkable examples of this new computational imaging technique and could point a future direction for a number of single-photon depth imaging approaches,” notes photonics expert Gerald Buller of Heriot-Watt University in Edinburgh, UK, who was not involved in the study.
Because the laser produces light of a single wavelength, the technique yields monochromatic pictures, but to some extent it can distinguish different materials by how readily they reflect the laser's color. On average, darker regions must be hit by a greater number of pulses before a single photon is reflected back.
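One way to see why the pulse count carries reflectivity information: if each pulse independently returns a detected photon with some probability p proportional to the pixel's reflectivity, then the number of pulses fired before the first detection follows a geometric distribution with mean 1/p. The simulation below is an illustrative toy model of that statistic, not the estimator used in the paper, and the probabilities are made up:

```python
import random

def pulses_until_first_photon(p: float, rng: random.Random) -> int:
    """Fire pulses until one reflected photon is detected; return the count."""
    n = 1
    while rng.random() >= p:  # each pulse succeeds with probability p
        n += 1
    return n

rng = random.Random(42)
# A brighter pixel (p = 0.5) versus a darker one (p = 0.05).
for p in (0.5, 0.05):
    trials = [pulses_until_first_photon(p, rng) for _ in range(10_000)]
    mean_n = sum(trials) / len(trials)
    # Inverting the mean pulse count recovers an estimate of p.
    print(f"p={p}: mean pulses ~ {mean_n:.1f}, 1/mean ~ {1 / mean_n:.2f}")
```

In this toy model, the darker pixel needs roughly ten times as many pulses on average, so the recorded pulse count per pixel doubles as a crude reflectivity map.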
To simulate real-world conditions, the researchers used an incandescent lamp that created a level of stray background photons roughly equal in number to those reflected from the laser. To eliminate the noise, the team used various algorithms, which enabled them to produce high-resolution, 3D images using a total of about one million photons. By comparison, an image of similar quality taken with a mobile-phone camera under office lighting conditions would require a few hundred trillion photons, Kirmani calculates.