Sam Bae and Ronald Korniski
Using a pair of complementary multiband bandpass filters allows a miniature lens system to obtain data for color 3D images in real time.
Human left and right eyes see the world from slightly different angles, and the brain interprets the disparity between the two views as depth. A 3D camera mimics this by arranging two cameras side by side. However, when the two-camera setup is scaled down to fit a confined space smaller than a fingertip, compactness becomes an issue. Some patented solutions create a dual aperture in a single lens and alternately open the apertures to conserve space.1–6 Alternating the apertures is necessary to keep the image fields captured by the two apertures from overlapping, and it maximizes the spatial resolution of the imaging chip: each aperture yields an image field as big as the chip itself. Until now, however, there has been no good mechanism for alternately opening the apertures in real time on this scale. One approach places shutters (mechanical or electro-optical) at the apertures, but their mechanisms are too large to integrate into a small lens system.
Another solution couples a pair of orthogonal polarizers, or a pair of complementary single bandpass filters, with corresponding illuminations that alternately open or block the apertures. Because the effective viewpoint switching occurs outside the lens, this scheme has the advantage of not increasing the lens volume. However, because the polarization direction changes when light reflects off internal surfaces, the polarizers suffer crosstalk. A pair of complementary single bandpass filters eliminates this problem, but each filter has only one passband and so can yield only a monochromatic image.
We adopted a pair of complementary multiband bandpass filters (CMBFs), which have recently become commercially available. By complementary, we mean that none of the wavelengths transmitted by one filter overlaps with any transmitted by the other. Thus, a given wavelength of monochromatic light passes through only one of the two filters. If we illuminate within the bands of a single CMBF, light passes through only one of the two pupils, effectively shuttering the other.7 Each CMBF can transmit red, green, and blue (RGB) multispectral images to yield a color image. In addition, using a pair of filters exploits the ability of the illumination scheme to switch the viewpoints outside the lens system. With a monochromatic camera, six spectral light bands matching the passbands of the filter pair—three rendering an RGB image of one viewpoint and three rendering an RGB image of the other—yield one stereo image (see Figure 1). (A color camera could also be used; for the time being, we used a monochromatic camera to make the color processing easier.) Every pixel of the imager is used for each viewpoint (see Figure 2).
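The band-to-viewpoint bookkeeping described above can be sketched in a few lines. This is a hypothetical Python illustration, not the authors' processing code; the band centers are made up for the example and are not the actual CMBF passbands.

```python
import numpy as np

# Illustrative band centers (nm): real CMBF passbands interleave between
# the two filters, but these particular numbers are hypothetical.
LEFT_BANDS = {"R": 620, "G": 540, "B": 460}   # passbands of filter/pupil 1
RIGHT_BANDS = {"R": 650, "G": 570, "B": 490}  # complementary passbands of filter/pupil 2

def assemble_stereo(frames):
    """Build one RGB image per viewpoint from six monochrome snapshots.

    frames maps band center (nm) -> 2D monochrome frame. Each illumination
    band passes through only one pupil, so each frame belongs entirely to
    one viewpoint and fills one color channel of that viewpoint's image.
    """
    h, w = next(iter(frames.values())).shape
    left = np.zeros((h, w, 3))
    right = np.zeros((h, w, 3))
    for ch, name in enumerate("RGB"):
        left[..., ch] = frames[LEFT_BANDS[name]]
        right[..., ch] = frames[RIGHT_BANDS[name]]
    return left, right

# Six snapshots, one per spectral illumination band:
frames = {nm: np.full((4, 4), nm / 700.0)
          for nm in list(LEFT_BANDS.values()) + list(RIGHT_BANDS.values())}
left, right = assemble_stereo(frames)
print(left.shape, right.shape)  # each viewpoint is a full-resolution RGB image
```

Note that the two viewpoints share no frames: because the passbands are complementary, every snapshot is unambiguously assigned to one pupil.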
To demonstrate our method's size advantages, we built a lens system using commercially available 3mm lens elements. We could not source dual apertures with complementary filters commercially. Consequently, we lithographed the apertures ourselves, cut off-the-shelf 25mm filter disks into 4×2mm rectangles, and joined them at the edges to fit in the lens mount alongside the 3mm lens elements (see Figure 3). Because the CMBF works by optical interference (its passbands shift and distort increasingly as the angle of incidence grows), we assembled the objective lens so that light strikes the filters at normal incidence. We placed all the lens elements in a plastic housing fabricated by rapid prototyping (see Figure 3). Because the plastic was slightly translucent to ambient light, we wrapped the housing in aluminum foil. To project multispectral illumination, we used a tunable filter to shape broadband light into spectral bands falling within the passbands; the transmission curves of these bands are bell-shaped (see Figure 1). We used Matlab software to control the sequence of operation, illuminations, and snapshots, and we viewed the real-time 3D imagery on a 3D display.
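The sequencing the authors drove from Matlab (tune illumination into one band, take a snapshot, repeat for all six bands) can be sketched as follows. The `TunableFilter` and `Camera` classes here are hypothetical stand-ins, not real hardware interfaces, and the band values are illustrative.

```python
class TunableFilter:
    """Stand-in for the tunable filter that spectrally shapes the broadband source."""
    def set_passband(self, nm):
        self.current_nm = nm

class Camera:
    """Stand-in monochrome camera; returns a tag naming the band it captured."""
    def __init__(self, filt):
        self.filt = filt
    def snap(self):
        return f"frame@{self.filt.current_nm}nm"

LEFT_BANDS = [460, 540, 620]   # nm; illustrative passbands of pupil 1
RIGHT_BANDS = [490, 570, 650]  # nm; complementary passbands of pupil 2

def capture_stereo_frame(filt, cam):
    """One stereo frame = six snapshots, one per spectral illumination band.

    Tuning the illumination into a band of one CMBF opens that pupil and
    shutters the other, so no mechanical shutter is needed at the apertures.
    """
    snaps = {}
    for nm in LEFT_BANDS + RIGHT_BANDS:
        filt.set_passband(nm)   # illuminate within one filter's passband
        snaps[nm] = cam.snap()  # only the matching pupil sees this light
    return snaps

filt = TunableFilter()
cam = Camera(filt)
frame = capture_stereo_frame(filt, cam)
print(len(frame))  # 6 snapshots per stereo frame
```

Because the viewpoint switching happens entirely in the illumination path, the loop adds nothing to the lens volume, which is the scheme's central advantage.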
In summary, the CMBFs, coupled with corresponding spectral illuminations, have enabled us to open miniature dual apertures alternately and capture color images from two viewpoints on a scale suitable for an endoscope used in minimally invasive neurological or skull-base surgery. To demonstrate the miniaturization advantage, we assembled a 3mm objective lens accommodating an interference bipartite filter. Though this cost-effective, low-schedule-impact objective of commercial off-the-shelf components was good for proof of concept, it covered only a 50° total field of view, with acceptable imagery at lower spatial frequencies and a slow relative aperture (the ratio of the equivalent focal length to the effective aperture of a lens). In the next phase of development, we plan to custom-fabricate objectives providing wider fields of view, better imagery, and faster relative apertures.
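As a concrete illustration of the relative-aperture figure defined above, a quick calculation can be made; the numbers below are hypothetical and are not the prototype's actual specifications.

```python
def relative_aperture(focal_length_mm, aperture_diameter_mm):
    """f-number: equivalent focal length divided by effective aperture diameter.
    A larger value means a 'slower' lens that gathers less light."""
    return focal_length_mm / aperture_diameter_mm

# Hypothetical numbers for a miniature objective: 3 mm equivalent focal
# length and a 0.5 mm effective pupil diameter.
print(relative_aperture(3.0, 0.5))  # f/6
```

Splitting one small pupil into two even smaller ones is why a dual-aperture miniature objective tends toward slow relative apertures, which motivates the custom-fabricated objectives planned for the next phase.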
Sam Bae joined the engineering staff at JPL in 2000. He has a BS in engineering physics from the University of California, Berkeley, an MS in mechanical engineering from Purdue University, and an additional MS in biomedical engineering from the University of California, Los Angeles.
Ron Korniski joined the engineering staff at JPL in 2008. He has a BS in mathematics and physics from Western Michigan University, Kalamazoo, an MS in optical sciences from the University of Arizona, Tucson, and an MBA from California State University, Pomona. He previously held positions with ITEK, Rockwell International, Optical Research Associates, OPTICS 1, and Science Applications International Corporation. He has been a member of SPIE since the late 1970s.