Imaging technique acquires a color image and depth map from a single monocular camera image

Researchers have developed an imaging technique that can simultaneously acquire a color image and a depth map from a single image taken by a monocular camera. Through the combination of a lens device and image processing, the technique achieves range detection with a precision comparable to that of a stereo camera. The technology will be announced at The 22nd Symposium on Sensing via Image Information (SSII2016), to be held at Pacifico Yokohama starting June 8, 2016.

In recent years, automobiles have been equipped with multiple sensors and cameras that provide front, rear, and side information. This has led to an increase in the sophistication of assisted driving technologies, including driverless operation. The importance of image sensing with cameras has also grown due to their use in remote inspections of infrastructure, such as those conducted by drones and robots. In these applications, in addition to capturing a two-dimensional image, it is also necessary to understand the object's dynamic three-dimensional parameters, such as its shape, movement, and distance from the camera.

Various methods for measuring the distance to an object have been proposed, including stereo cameras, infrared distance sensors, ultrasonic distance sensors, millimeter wave radar, LiDAR (Light Detection and Ranging), and SfM (Structure from Motion). Stereo cameras need a baseline of approximately 30 cm between the two lenses to achieve a high degree of distance accuracy, so miniaturization is inherently difficult. Infrared and ultrasonic sensors measure distance by illuminating the object with an infrared light pattern or probing it with ultrasonic waves, respectively, so measuring objects at distances greater than 10 m is difficult. Millimeter wave radar and LiDAR equipment is costly in addition to being difficult to miniaturize. SfM measures the distance to the object from multiple images taken by a moving camera; however, it is difficult for it to measure the distance to a moving object with a high degree of accuracy.

Conventional distance sensors thus each involve trade-offs, and obtaining a high degree of accuracy in a compact, low-cost package is difficult.

The researchers have developed a proprietary imaging technique that uses a combination of color filters and image processing to obtain both a color image and a high-precision depth map from a single monocular camera image. Attaching a proprietary color aperture filter, consisting of blue and yellow filters, to the lens aperture produces a combination of blur and color shift that depends on the distance to the object. The distance is then estimated for each pixel by analyzing the blur and color deviation within a single photographic image. Because the color filter transmits green light, which contributes most to overall image brightness, deterioration in the quality of the captured image is also suppressed. In tests using a commercial camera, the researchers confirmed that the distance accuracy obtained from a single monocular camera image is comparable to that of a stereo camera with its lenses 35 cm apart. Because the method consists solely of a lens device and image processing, it makes it possible to construct an inexpensive image sensor.
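
To make the principle concrete, the sketch below illustrates one way a per-pixel depth cue could be extracted from such a color shift: with a blue/yellow split aperture, out-of-focus points are displaced in opposite directions in the blue and green channels, so matching the blue channel against the green channel over a range of small shifts yields a signed displacement whose sign and magnitude relate to the distance from the focal plane. This is only a minimal sketch under simple assumptions (block matching with a sum-of-squared-differences cost); the function names, window size, and shift range are invented for illustration and do not represent the researchers' actual processing pipeline.

```python
import numpy as np


def _box_mean(img, half):
    """Mean filter over a (2*half + 1)^2 window using an integral image."""
    k = 2 * half + 1
    padded = np.pad(img, half, mode="edge")
    integral = np.zeros((padded.shape[0] + 1, padded.shape[1] + 1))
    integral[1:, 1:] = padded.cumsum(axis=0).cumsum(axis=1)
    h, w = img.shape
    window_sum = (integral[k:k + h, k:k + w] - integral[:h, k:k + w]
                  - integral[k:k + h, :w] + integral[:h, :w])
    return window_sum / (k * k)


def color_shift_depth_cue(rgb, max_shift=6, window_half=7):
    """Estimate a signed per-pixel shift between the green and blue channels.

    rgb        : float array of shape (H, W, 3) with values in [0, 1]
    max_shift  : largest horizontal displacement (pixels) tested
    window_half: half-size of the local matching window
    Returns an (H, W) array of signed shifts; the sign distinguishes points in
    front of the focal plane from points behind it (hypothetical convention),
    and the magnitude grows with distance from the focal plane.
    """
    green = rgb[..., 1].astype(np.float64)
    blue = rgb[..., 2].astype(np.float64)
    best_shift = np.zeros(green.shape, dtype=np.float64)
    best_cost = np.full(green.shape, np.inf)

    for s in range(-max_shift, max_shift + 1):
        # Displace the blue channel horizontally and measure how well it
        # matches the green channel over a local window (SSD cost).
        # np.roll wraps around at the borders; a real implementation would
        # handle image edges explicitly.
        shifted_blue = np.roll(blue, s, axis=1)
        cost = _box_mean((green - shifted_blue) ** 2, window_half)
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_shift[better] = s

    return best_shift
```

In a real system, the signed shift would be converted to metric distance through lens calibration, and more robust cost aggregation and sub-pixel refinement would be needed; the above only conveys the general idea of recovering depth from a distance-dependent color displacement.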
