Optics Express
Editor: Andrew M. Weiner
Vol. 22, Iss. 3 — Feb. 10, 2014
Optical performance test and validation of microcameras in multiscale, gigapixel imagers

Seo Ho Youn, Hui S. Son, Daniel L. Marks, Jeffrey M. Shaw, Paul O. McLaughlin, Steven D. Feller, David J. Brady, and Jungsang Kim  »View Author Affiliations


Optics Express, Vol. 22, Issue 3, pp. 3712-3723 (2014)
http://dx.doi.org/10.1364/OE.22.003712



Abstract

Wide field-of-view gigapixel imaging systems capable of diffraction-limited resolution and video-rate acquisition have a broad range of applications, including sports event broadcasting, security surveillance, astronomical observation, and bioimaging. The complexity of the system integration of such devices demands precision optical components that are fully characterized and qualified before being integrated into the final system. In this work, we present component and assembly level characterizations of microcameras in our first gigapixel camera, the AWARE-2. Based on the results of these measurements, we revised the optical design and assembly procedures to construct the second generation system, the AWARE-2 Retrofit, which shows significant improvement in image quality.

© 2014 Optical Society of America

1. Introduction

AWARE-2 (Advanced Wide-Field-of-View Architectures for Image Reconstruction and Exploitation) [1] is a wide field-of-view (FOV), gigapixel-scale camera utilizing a multiscale optical design [2]. The schematic optical design is shown in Fig. 1(a). A 60-mm glass objective lens [3] images objects at distances between 40 m and infinity onto an intermediate image surface, and sets of small secondary optics (called micro-optics) relay parts of this image to 14-megapixel monochromatic complementary metal-oxide-semiconductor (CMOS) image sensors (Aptina MT9F002). The combination of micro-optics and an image sensor is called a microcamera. The micro-optics correct the residual spherical and chromatic aberrations present in the intermediate image, and provide demagnification so that the FOVs of neighboring microcameras overlap sufficiently to form a continuous high-resolution image. To capture a 120°-by-50° FOV, 98 microcameras are mounted on a precision-machined aluminum geodesic dome [4, 5]. Custom electronics with field-programmable gate arrays (FPGAs) control all the microcamera settings, such as exposure, gain, and focus, and read the images out through an Ethernet network to a personal computer that runs image post-processing tools [6]. After unwarping the optical distortion, balancing the illumination falloff, and stitching all the images, a composite image of 0.96 gigapixels is formed with an effective FOV of 120° by 50° and an angular resolution of 40 μrad.

[1] D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, "Multiscale gigapixel photography," Nature 486, 386–389 (2012).
[2] D. J. Brady and N. Hagen, "Multiscale lens design," Opt. Express 17, 10659–10674 (2009).
[3] D. L. Marks and D. J. Brady, "Gigagon: a monocentric lens design imaging 40 gigapixels," Imaging Systems 2010, paper ITuC2 (OSA, 2010).
[4] H. S. Son, D. L. Marks, J. Hahn, J. Kim, and D. J. Brady, "Design of a spherical focal surface using close-packed relay optics," Opt. Express 19, 16132–16138 (2011).
[5] H. S. Son, A. Johnson, R. A. Stack, J. M. Shaw, P. McLaughlin, D. L. Marks, D. J. Brady, and J. Kim, "Optomechanical design of multiscale gigapixel digital camera," Appl. Opt. 52, 1541–1549 (2013).
[6] D. R. Golish, E. M. Vera, K. J. Kelly, Q. Gong, P. A. Jansen, J. M. Hughes, D. S. Kittle, D. J. Brady, and M. E. Gehm, "Development of a scalable image formation pipeline for multiscale gigapixel photography," Opt. Express 20, 22048–22062 (2012).
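As a rough consistency check, the quoted pixel count follows from the stated FOV and the 40-μrad angular resolution. The sketch below uses a flat-angle approximation and ignores the overlap between neighboring microcameras, so it slightly overestimates the 0.96-gigapixel composite:

```python
import math

# Stated system parameters from the text
fov_h = math.radians(120)   # horizontal FOV
fov_v = math.radians(50)    # vertical FOV
ifov = 40e-6                # angular resolution, 40 urad per pixel

# Flat-angle estimate of the resolvable pixel count
pixels = (fov_h * fov_v) / ifov**2
print(f"{pixels / 1e9:.2f} gigapixels")  # 1.14
```

The estimate of ~1.1 gigapixels is consistent with the 0.96-gigapixel composite once the inter-microcamera overlap is discounted.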

Fig. 1 (a) Optical design principle of the AWARE-2 camera. 3D renderings of (b) the AWARE-2 microcamera and (c) the AWARE-2 Retrofit microcamera discussed in this work. See the main text for the optical design differences between the two microcameras. The internal structures of the barrels are shown for illustration. Baffles that cut off unwanted stray light are effectively implemented by minimizing the lens apertures and the inner diameter of the barrel.

The multiscale optical design, utilizing synchronous, parallel operation of a large number of microcameras, enables instantaneous image acquisition over a wide FOV. At the same time, it poses significant engineering challenges in the optical and mechanical design, as well as in the precision assembly of the structure that integrates all the microcameras and the common objective lens, so that a composite image with the specified optical resolution can be acquired [5]. An adequate strategy for cost-effective construction of a system consisting of a large number of optical elements, like the AWARE cameras, is to (1) first ensure the quality of the optical components and sensors used in the imaging modules (microcameras), (2) assemble and verify the quality of each microcamera, and (3) integrate only fully qualified microcameras into the system and test the system operation [7, 8].

[7] S.-H. Youn, D. L. Marks, P. O. McLaughlin, D. J. Brady, and J. Kim, "Efficient testing methodologies for microcameras in a gigapixel imaging system," Proc. SPIE 8788, 87883B (2013).
[8] D. S. Kittle, D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, "A testbed for wide-field, high-resolution, gigapixel-class cameras," Rev. Sci. Instrum. 84, 053107 (2013).

Although it successfully demonstrated the powerful advantages of multiscale imaging, the first prototype AWARE-2 system did not achieve the image quality targeted in the original design, due to various imperfections in the component manufacturing and microcamera assembly processes. In this work, we present a systematic analysis of the AWARE-2 microcamera [Fig. 1(b)] that provides a detailed understanding of the root causes of the degraded image resolution seen in the system. We introduce an alternate microcamera design called the AWARE-2 Retrofit [Fig. 1(c)], developed to overcome many of the challenges discovered through these component evaluations. The two main differences between the AWARE-2 and AWARE-2 Retrofit systems are (1) the optics, with aspheric plastic elements in AWARE-2 vs. spherical glass elements in the AWARE-2 Retrofit, and (2) the focus mechanism, where the sensor is moved with respect to a static optical assembly in AWARE-2, while a subset of lenses in the optical assembly is moved with respect to a static sensor in the AWARE-2 Retrofit.

Section 2 provides a basic characterization of the CMOS image sensor and verifies its performance independent of the micro-optics used in the microcamera. In Sec. 3, we present the optical qualities of the micro-optical components that negatively impact the optical performance of AWARE-2 microcameras. Section 4 presents the imaging performance of AWARE-2 microcameras and of the improved AWARE-2 Retrofit microcameras, in which all of the issues identified in Sec. 3 are addressed. Section 5 summarizes the conclusions of the paper.

2. Modulation transfer function of image sensors

Monochromatic CMOS image sensors with 1.4-μm pixels are used in the AWARE-2 microcameras, while color CMOS image sensors with Bayer filters are used in the AWARE-2 Retrofit microcameras. The modulation transfer function (MTF) of an imaging system, which represents its spatial bandwidth (and therefore its optical resolution), is the product of the component MTFs (optics and detecting sensor), since the component point spread functions convolve; it may further be affected by the operating conditions (motion or vibration) [9]. As a first step, we experimentally characterize the MTF of the image sensor using the conventional slanted-edge method, which overcomes the sampling limit of image sensors [9]. This method enables oversampling by projecting slightly shifted points from the image of a tilted edge onto a line of pixels, capturing the edge spread function (ESF). Differentiating the ESF and then taking its Fourier transform yields the MTF as a function of spatial frequency.

[9] G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems (SPIE Press, 2001).
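The ESF → LSF → FFT chain described above can be sketched numerically. This is a minimal illustration of the slanted-edge principle on a synthetic edge, not the Imatest pipeline actually used in the experiment; the sub-pixel edge locator (centroid of the row derivative) and binning scheme are simplified assumptions:

```python
import numpy as np

def slanted_edge_mtf(image, oversample=4):
    """Estimate the MTF from a slanted-edge image via the ESF -> LSF -> FFT chain."""
    cols = image.shape[1]
    x = np.arange(cols, dtype=float)
    img = image.astype(float)
    # Sub-pixel edge location per row: centroid of the row-wise derivative
    d = np.abs(np.diff(img, axis=1))
    edge_pos = (d * x[:-1]).sum(axis=1) / d.sum(axis=1)
    # Project every pixel onto an axis relative to the edge; the slant spreads
    # the sampling phases across rows, yielding an oversampled ESF
    rel = (x[None, :] - edge_pos[:, None]).ravel()
    bins = np.round(rel * oversample).astype(int)
    bins -= bins.min()
    counts = np.bincount(bins)
    sums = np.bincount(bins, weights=img.ravel())
    grid = np.arange(counts.size, dtype=float)
    filled = counts > 0
    esf = np.interp(grid, grid[filled], sums[filled] / counts[filled])
    # Differentiate (line spread function), window, and Fourier transform
    lsf = np.diff(esf) * np.hanning(esf.size - 1)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Synthetic slanted edge: a smooth step whose position drifts with row number
yy, xx = np.mgrid[0:64, 0:128]
img = 1.0 / (1.0 + np.exp(-(xx - (60.0 + 0.05 * yy)) / 1.5))
mtf = slanted_edge_mtf(img)
```

The returned curve is sampled at `oversample` times the pixel Nyquist rate, which is the point of the slanted-edge trick: frequencies above the sensor's native sampling limit remain measurable.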

In the sensor MTF measurement shown in Fig. 2(a), a sharp slanted edge imprinted on glass is illuminated by a uniform broadband light source generated from a fiber light source and a ground glass diffuser. This object is then demagnified by a 20× microscope objective (MO) with 0.40 numerical aperture (NA) and projected onto the sensor. Since the optical resolution of the demagnified image sufficiently exceeds the sensor bandwidth, it provides an adequate object for characterizing the sensor MTF. To minimize any unwanted aberrations, the angular and translational alignment of the slanted edge and the 20× MO is performed with the aid of a helium-neon laser beam. The image sensor MTF is then computed by commercial software (Imatest Master), and the resulting MTF performances of the monochromatic and color sensors are plotted in Fig. 2(b). The MTF of the color sensor is lower at high spatial frequencies due to the effective downsampling by the Bayer filters.

Fig. 2 Image sensor MTF characterization. (a) Experimental setup to measure the sensor MTF. (b) The experimental MTF of monochromatic (red) and color (blue) image sensors measured with a broadband light source, as a function of spatial frequency in units of line pairs per mm (lp/mm). The shaded bands illustrate the measurement uncertainty, obtained from repeated experiments. (c) and (d) show typical MTF measurements as a function of wavelength for monochromatic and color sensors, respectively.

We also measured the sensor MTF as a function of wavelength, using bandpass filters inserted between the 20× MO and the slanted edge. The results are shown in Figs. 2(c) and 2(d) for monochromatic and color CMOS image sensors, respectively. The monochromatic sensor exhibits only a slight increase in MTF under narrowband illumination and negligible variation between wavelengths, while the color sensor MTF at 550 nm is higher than at either 450 nm or 630 nm due to the double sampling of the green channel in the Bayer pattern.

3. AWARE-2 and AWARE-2 Retrofit micro-optic characterizations

Optical systems utilizing aspherical surfaces can, in principle, achieve better imaging performance than those made only of spherical surfaces, and injection molding offers a cost-effective method for fabricating lenses with highly aspheric surfaces. The AWARE-2 optical design uses a set of aspheric lenses made of two different plastic materials, E48R (refractive index 1.531, Abbe number 51.7) and polycarbonate (PC, refractive index 1.585, Abbe number 27.6), targeted to achieve diffraction-limited, achromatic imaging performance. In this section, we present characterizations of the birefringence and surface form of the plastic molded aspheric lenses fabricated for the AWARE-2 micro-optics, as well as a stray light analysis of the AWARE-2 microcamera assemblies. We found that the severe birefringence observed in the lenses made of PC, together with the deviation of the surface form from the design values, degrades the imaging performance far below the diffraction-limited design.

To overcome these challenges, the AWARE-2 Retrofit microcamera is designed using only spherical glass optics. Although glass spherical optics generally require more elements than aspherical designs to achieve the same imaging resolution, and weigh more than plastic components, we show here that the well-established grinding and polishing processes for manufacturing glass optical components ensure that the desired optical quality is achieved for the AWARE micro-optics.

The design, assembly, and verification testing of an AWARE microcamera pose unique challenges compared to conventional high-performance imaging systems. First, the need to closely pack a high density of microcameras behind the common objective lens imposes substantial constraints on the lateral extent of the microcamera body [4, 5]. Second, the micro-optics relay the intermediate image formed by the primary objective onto the image sensor, and therefore form a finite-conjugate imaging system. Due to the shallow depth of focus arising from the low f-number, the distance between the intermediate image and the microcamera must be maintained precisely for adequate testing. Third, it is desirable to maintain the magnification of the microcamera as the system adjusts focus, so that the burden of scaling and stitching the resulting images with those from neighboring microcameras remains minimal. Furthermore, the intermediate image surface is curved, and the microcamera must compensate for the curvature present in the object plane across its FOV. We developed a precision-machined test jig and a fast focus algorithm [10] that automate the imaging performance characterization in a tabletop setup, a critical requirement for the high-throughput testing necessary for AWARE system assembly.

[10] T. Nakamura, D. S. Kittle, S. H. Youn, S. D. Feller, J. Tanida, and D. J. Brady, "Autofocus for multiscale gigapixel camera," Appl. Opt. 52, 8146–8153 (2013).

3.1. Birefringence of optical elements

Birefringence [11] caused by thermal stress during the plastic injection molding process can substantially degrade imaging resolution due to the phase gradient induced in the lenses. Therefore, it is critical to minimize the birefringence of optical elements used in high-performance imaging systems. It is well known that polycarbonate (PC), the high-refractive-index material used in the AWARE-2 micro-optics, is susceptible to a large degree of birefringence [12]. Our measurements on E48R components confirmed that these lenses exhibit little birefringence, and that the dominant source of birefringence is the lenses made of PC.

[11] S. Bäumer, Handbook of Plastic Optics (Wiley-VCH, 2010).
[12] D. G. LeGrand and J. T. Bendler, Handbook of Polycarbonate Science and Technology (Marcel Dekker, 2000).

To experimentally quantify the magnitude of the birefringence in these elements, we constructed a circular polariscope, as shown in Fig. 3(a). In this setup, broadband light illuminates a ground glass diffuser to provide a diffuse optical source for the sample under test. A bandpass filter (BF) at 633 nm is placed after the diffuser, forming a sharper fringe pattern for more accurate measurement. The combination of a linear polarizer (P) and a quarter-wave plate (QWP1), with its fast axis aligned at +45° to the polarizer axis, generates circularly polarized input light. The lens under test [in this case, the polycarbonate lens (n = 1.585) used as the first element of the AWARE-2 micro-optics in Fig. 1(b)] is immersed in mineral oil (n = 1.47) to reduce its optical power for ease of image acquisition. The second quarter-wave plate (QWP2), with its fast axis at −45°, and the analyzer (A) placed at +90° provide the dark-field image. The imaging camera, consisting of an achromatic doublet (Thorlabs AC127-025-A-ML, f = 25 mm), an infinity-corrected MO with 0.10 NA, and a CMOS image sensor with 2.2-μm pixels, captures the fringe pattern with a magnification of 0.42 and a resolution of about 100 lp/mm at the object plane.

Fig. 3 Birefringence measurement and simulation. (a) Circular polariscope for birefringence measurement. BF: bandpass filter. P: polarizer. QWP: quarter-wave plate. A: analyzer. (b) False color image of the stress fringe pattern observed on the polycarbonate optical element used in the AWARE-2 micro-optics. The white vertical line shows the cross-section intensity profile used for birefringence analysis. The color indicates the pixel value. (c) The cross-section intensity profile (blue) and the phase retardation (green) of (b). (d) The simulated MTF (blue) influenced by the observed birefringence, in comparison with the nominal MTF (red).

The intensity I of the light emerging from the analyzer is given by I ∝ sin²(Δ/2), where Δ is the wrapped phase retardation between the two polarization components of light propagating through the lens [13, 14]. A total of 13 stress fringes observed along the white line in Fig. 3(b) equates to an accumulated phase retardation of 13 × 2π radians between the two polarization components, as shown in Fig. 3(c). To estimate the effect of the birefringence on the MTF, we conducted computer simulations by inserting a phase surface matching the measured phase retardation after the first polycarbonate element in the lens prescription of the optical design. The resulting MTF in Fig. 3(d) shows that such a large phase gradient over the lens aperture can substantially degrade the image quality of the AWARE-2 microcamera. This measurement suggests that, to reduce the effect of birefringence, one should (1) choose an alternative lens material with lower birefringence to replace the PC components, such as OKP4 (which has similar properties to PC, with a refractive index of 1.607 and an Abbe number of 27) or glass, (2) design thinner micro-optic lenses for better homogeneity and less phase accumulation, and (3) eliminate the hexagonal aperture to remove the sharp edges that typically lead to thermal stress in the injection molding process. Prototype lenses made of OKP4 plastic and glass [Figs. 4(a) and 4(b), respectively] show a dramatic reduction in birefringence.

[13] H. E. Lai and P. J. Wang, "Study of process parameters on optical qualities for injection-molded plastic lenses," Appl. Opt. 47, 2017–2027 (2008).
[14] ASTM International D4093-95, standard test method for the measurement of birefringence.
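The dark-field relation I ∝ sin²(Δ/2) can be verified with a small Jones-calculus model of the polariscope. The element angles follow the setup of Fig. 3(a); the model also shows the key property of the circular polariscope, namely that the transmitted intensity is independent of the orientation of the sample's stress axes:

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]], dtype=complex)

def retarder(delta, theta):
    # Linear retarder: retardation delta, fast axis at angle theta
    return rot(theta) @ np.diag([np.exp(-1j * delta / 2),
                                 np.exp(+1j * delta / 2)]) @ rot(-theta)

def polarizer(theta):
    # Ideal linear polarizer with transmission axis at angle theta
    return rot(theta) @ np.diag([1.0 + 0j, 0.0]) @ rot(-theta)

def dark_field_intensity(delta, sample_axis):
    """Intensity through the circular polariscope of Fig. 3(a)."""
    E = np.array([1.0, 0.0], dtype=complex)       # light after polarizer P (axis at 0)
    E = retarder(np.pi / 2, +np.pi / 4) @ E       # QWP1, fast axis at +45 deg
    E = retarder(delta, sample_axis) @ E          # birefringent lens under test
    E = retarder(np.pi / 2, -np.pi / 4) @ E       # QWP2, fast axis at -45 deg
    E = polarizer(np.pi / 2) @ E                  # analyzer A at +90 deg
    return np.vdot(E, E).real

# Transmitted intensity equals sin^2(delta/2) for any stress-axis orientation
for delta in (0.5, 2.0, 13 * 2 * np.pi):
    for axis in (0.0, 0.3, 1.2):
        assert abs(dark_field_intensity(delta, axis) - np.sin(delta / 2) ** 2) < 1e-9
print("dark-field law verified")
```

Because the isoclinics are suppressed, each dark fringe marks a locus where Δ is an integer multiple of 2π, which is why counting the 13 fringes directly yields the 13 × 2π accumulated retardation.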

Fig. 4 Fringe pattern images of prototype lenses. (a) A molded lens with less birefringent plastic material, OKP4. The white circles indicate the outer edge of the lens and the designed clear aperture. The arrow indicates the thermal stress that occurred at the gate during the injection molding process, located outside the designed aperture. (b) A glass lens used in the AWARE-2 Retrofit microcamera, showing no visible fringe. The color bar indicates the pixel value in the images.

3.2. Surface form of a molded lens

The main optical power and residual aberration of an imaging lens arise from its surface form. Deviation of the lens surface from the nominal design can cause a focal shift or wavefront deformation (consequently decreasing the MTF), so we developed a process to measure the accurate surface profile of the molded plastic lenses to facilitate optimization of the fabrication process. The magnitude of the impact of a specific lens surface on image quality depends strongly on the role the element plays in the optical design. In our experiment, we used an interferometric 3D optical profiler (Zygo NewView 5000) with a vertical resolution of <1 nm to measure the surface profile of a spherical plastic lens with a radius of curvature R of 200 mm and a diameter of 9.74 mm. This instrument utilizes non-contact white-light interferometry to acquire high-speed three-dimensional surface profiles [15]. We chose to profile a spherical lens with low surface sag, rather than an aspherical lens that typically has a large surface sag, to avoid the use of a compensating optic or null corrector. Although the specific lens characterized in this section was not an element used in the AWARE-2 micro-optics, it was fabricated using the same injection molding process and represents a typical deviation of the lens surface form from the design values. Because the measured lens was much larger than the FOV of the profiling instrument, it was necessary to capture and stitch multiple images to form the full profile. We collected data spanning an area of 9.77 mm × 0.68 mm across the lens center, and compared the measured surface form to the designed profile.

[15] L. Deck and P. de Groot, "High-speed noncontact profiler based on scanning white-light interferometry," Appl. Opt. 33, 7334–7338 (1994).

The fit model used to describe the surface profile of a spherical lens, z(x, y), is

z(x,y) = \frac{C\,[(x-x_0)^2+(y-y_0)^2]}{1+\sqrt{1-(1+K)\,C^2\,[(x-x_0)^2+(y-y_0)^2]}} + A(x-x_0) + B(y-y_0) + z_0,    (1)
where C (= 1/R) is the curvature, K is the conic constant that characterizes the ellipticity, A and B are slope terms of a plane that compensate any tilt from the measurement setup, and x0, y0, and z0 are offset terms for the fitting. Figure 5(a) compares the nominal design to the experiment by fitting the measurement with only the tilt-compensation terms (i.e., A, B, x0, y0, and z0), where the resulting goodness-of-fit R² and root-mean-square error (RMSE) are 0.9834 and 2.1 μm, respectively (Table 1). To assess the impact of the surface deviation on the MTF, we fit the measured surface form with two additional parameters, C = 0.004042 mm−1 and K = 823.3, to obtain a surface representation close to the measured form (R² = 0.991 and RMSE = 0.47 μm in Table 1), as plotted in Fig. 5(b). We then use this surface representation in the original optical design to perform a ray-tracing simulation of the MTF. The result shown in Fig. 5(c) indicates that the MTF is degraded by such a surface deviation, and that this particular prototype lens contributes substantially to the MTF degradation.
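Equation (1) and the published fit parameters can be used to reproduce the scale of the surface deviation. This sketch compares the nominal R = 200 mm sphere against the conic representation with C = 0.004042 mm⁻¹ and K = 823.3; the tilt and offset terms are set to zero (an assumption for an on-axis comparison), and no actual measurement data are used:

```python
import numpy as np

def conic_sag(x, y, C, K, A=0.0, B=0.0, x0=0.0, y0=0.0, z0=0.0):
    """Surface model of Eq. (1): conic sag plus tilt and offset terms (units: mm)."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    sag = C * r2 / (1.0 + np.sqrt(1.0 - (1.0 + K) * C ** 2 * r2))
    return sag + A * (x - x0) + B * (y - y0) + z0

# Cross-section through the lens center (the 9.77-mm scan of the text)
x = np.linspace(-4.885, 4.885, 201)
y = np.zeros_like(x)

z_nominal = conic_sag(x, y, C=1.0 / 200.0, K=0.0)   # design: R = 200 mm sphere
z_fitted = conic_sag(x, y, C=0.004042, K=823.3)     # published fit parameters

rmse_um = np.sqrt(np.mean((z_fitted - z_nominal) ** 2)) * 1e3
print(f"surface deviation RMSE: {rmse_um:.2f} um")
```

The few-micrometer RMSE between the two profiles is on the same order as the 2.1-μm deviation reported in Table 1, illustrating how a modest change in C and a large K reshape the surface at the micrometer level.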

Fig. 5 Surface form measurements and MTF simulation of the measured surface. (a) Comparison of the experimental profile with the nominal design. (b) Surface model with fit parameters, C and K to obtain the realistic representation of the measured lens. Note that 2D plots in (a) and (b) were obtained by taking the cross-section of the 3D surface with the xz plane. (c) Simulated MTF (black dots) impacted by surface deviation of the prototype lens, plotted together with the MTF for the nominal design (red line) and the diffraction-limited curve (dashed line).

Table 1. Fitting the surface form of a molded prototype lens.


3.3. Stray light degrading image contrast

Light that enters an optical system, scatters within the system in an undesired way, and is detected at the image sensor as a false signal is called stray light. A uniform stray light distribution over the FOV reduces the image contrast, while stray light with high-spatial-frequency components can cause artifacts such as ghost images or veiling glare in the final image [16]. Stray light can originate from incident light scattering off the internal mechanical structures of the imaging system, the sides of lenses, or impurity inclusions of size <10 μm within lenses [16].

[16] B. Dörband, H. Müller, and H. Gross, Handbook of Optical Systems, Metrology of Optical Components and Systems (Wiley-VCH, 2012).

To quantify the stray light and compare the performance of prototype microcameras, we employ the veiling glare index (VGI) as a metric, defined as

VGI = \frac{I_\text{stray}}{I_\text{stray}+I_\text{signal}},    (2)

where I_stray is the irradiance measured within a region where the light is blocked by the object, and I_signal is the irradiance delivered to the microcamera without obstruction by the object [17]. In the VGI measurement, a microcamera captures two images, one with a completely dark object partially covering the FOV and the other without the object, to measure I_stray and I_stray + I_signal, respectively. The two images are then divided pixel by pixel to extract the VGI as a function of spatial coordinates. We sample data points in the center of the image field to obtain representative VGI values.

[17] S. Matsuda and T. Nitoh, "Flare as applied to photographic lenses," Appl. Opt. 11, 1850–1856 (1972).
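The pixel-wise division of the two frames can be sketched directly from Eq. (2). The frames and the sampling region below are hypothetical stand-ins for the measured images:

```python
import numpy as np

def veiling_glare_index(dark_img, bright_img, region):
    """Pixel-wise VGI = I_stray / (I_stray + I_signal), per Eq. (2).

    dark_img:   frame with an opaque object covering part of the FOV (I_stray)
    bright_img: frame of the same scene without the object (I_stray + I_signal)
    region:     (row_slice, col_slice) sampled near the image center
    """
    vgi_map = dark_img.astype(float) / bright_img.astype(float)
    samples = vgi_map[region]
    return samples.mean(), samples.std()

# Hypothetical 8-bit frames: a 25% residual level inside the blocked region
bright = np.full((100, 100), 200, dtype=np.uint8)
dark = np.full((100, 100), 50, dtype=np.uint8)
mean_vgi, std_vgi = veiling_glare_index(dark, bright, (slice(40, 60), slice(40, 60)))
print(f"VGI = {mean_vgi:.1%}")  # 25.0%
```

Reporting the mean and standard deviation over the central region mirrors how the (25.3 ± 1.3)% and (6.2 ± 0.3)% values quoted below are obtained.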

Figure 6(a) shows the test jig used to evaluate the imaging properties of the assembled AWARE-2 and AWARE-2 Retrofit microcameras. The jig is precision-machined to ensure that the tolerances in distance (<25 μm) and tilt angle (<0.05°) between the microcamera and the test target are tightly controlled. In this setup, the microcamera under test captures an image of a test object (either a 1951 USAF resolution test pattern or an Imatest spatial frequency response chart imprinted on a glass substrate) illuminated by a diffuse light source. The captured image can be analyzed for both stray light (VGI) and MTF measurement.

Fig. 6 (a) Test jig used to test AWARE-2 and AWARE-2 Retrofit microcameras. The broadband light source is filtered by a 590-nm bandpass filter to minimize chromatic aberrations, and illuminates the test target placed at the nominal object plane of the microcamera under test. The ground-glass diffuser inserted after the bandpass filter provides uniform illumination. (b) Image of the 1951 USAF resolution test chart, normalized by the maximum pixel value, taken with the AWARE-2 microcamera. Stray light creates a bright ring at the edge of the field (cross-section shown by the red trace in the inset) and reduces the contrast across the image of a dark object (black trace in the inset). (c) Normalized image of the Imatest spatial frequency response (SFR) chart taken with the AWARE-2 Retrofit microcamera. The cross-sections in the inset show the absence of the bright artifact ring (red trace) and a much higher contrast (black trace).

Figure 6(b) shows an image of the 1951 USAF resolution test pattern taken by an AWARE-2 microcamera, where the false color shows the pixel intensity normalized to the maximum pixel value in the image. The inset shows two cross-sectional intensity profiles along the red and black lines in the image. Near the edge of the field, we observe a bright-ring artifact (red trace in the inset). When measured across a dark square in the middle of the field (black trace in the inset), we observe low contrast between the intensities within the square (I_stray) and just outside it (I_stray + I_signal), corresponding to a high VGI of (25.3 ± 1.3)%.

To understand the source of the unexpected levels of stray light in the system, we performed stray light simulations with ray-tracing software, which confirmed that it was caused by internal reflection from the flat sides of the front hexagonal lens. By removing the hexagonal lens from the design, covering the sides of all optical elements in the microcamera, and implementing effective baffles inside the barrel [Fig. 1(c)], the bright ring was eliminated and the stray light level was dramatically reduced in the AWARE-2 Retrofit microcamera. Figure 6(c) shows the absence of the artifact ring (red trace in the inset) and a much higher image contrast, with a measured VGI of (6.2 ± 0.3)% (black trace in the inset), achieved in the AWARE-2 Retrofit microcamera.

3.4. Optical assembly techniques

To achieve diffraction-limited imaging resolution in an optical system, one must not only fabricate high-quality optics but also precisely align all optical surfaces along the nominal optical axis. The tolerances allowed for this alignment typically depend on the robustness of the optical design; good optical designs provide more relaxed tolerances for the precision with which the alignment must be completed. Misalignment (decenter or tilt) of any optical surface beyond the prescribed tolerance can cause aberrations such as coma or astigmatism, and lead to MTF degradation (especially at large field angles). The assembly tolerances can be met either by using precision-machined optical barrels with accurately designed seats for the optical elements (passive alignment), or by actively monitoring the center and tilt of each optical element before it is fixed in the barrel (active alignment). Passive alignment is preferred for the simplicity and cost of the assembly procedure, while active alignment provides an accurate assessment of the alignment errors of each optical component as the assembly progresses. Although novel design strategies utilizing specific forms of aspheric surfaces have recently been introduced to make systems robust against alignment errors [18], our main performance limitation arises from manufacturing errors in the molding process, which is expensive to optimize at current production volumes. We therefore moved to glass spherical surfaces, and achieve the required assembly tolerances through optimal design of the lens seats in the optical barrels. Both AWARE-2 and AWARE-2 Retrofit micro-optics are assembled using the passive alignment technique.

[18] B. Ma, K. Sharma, K. P. Thompson, and J. P. Rolland, "Mobile device camera design with Q-type polynomials to achieve higher production yield," Opt. Express 21, 17454–17463 (2013).

4. MTF performance results of various microcameras

The on-axis experimental MTF results of three microcameras are presented in Fig. 7 as black dots, together with the diffraction-limited performance (dashed curve), the nominal design performance of the micro-optics (without the sensor MTF, red), and the sensor MTF (blue) used for each microcamera. Figure 7(a) shows the AWARE-2 microcamera, consisting of molded plastic aspheres assembled by passive alignment. Although the target optical design should feature 20% MTF near the Nyquist spatial frequency determined by the sensor pixel pitch (357 lp/mm), the actual imaging performance is far worse: 20% MTF is reached only at spatial frequencies below 100 lp/mm. We attribute this performance degradation to the issues identified and characterized in Sec. 3.
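The quoted Nyquist frequency follows directly from the 1.4-μm pixel pitch: one line pair needs at least two pixels, so the highest representable frequency is half the sampling rate.

```python
# Nyquist spatial frequency implied by the 1.4-um pixel pitch:
# one line pair spans two pixels, so f_N = 1 / (2 * pitch)
pixel_pitch_mm = 1.4e-3
nyquist_lp_mm = 1.0 / (2.0 * pixel_pitch_mm)
print(round(nyquist_lp_mm))  # 357
```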

Fig. 7 On-axis experimental MTF measurements plotted with diffraction-limited (dashed), nominal micro-optic design (red), and sensor MTF values (blue) for three microcamera designs. (a) AWARE-2, (b) AWARE-2A, and (c) AWARE-2 Retrofit microcamera. (a) and (b) were simulated and measured with broadband light illumination, and (c) was simulated and measured with 590-nm light. Note that (a) and (b) use monochromatic sensors and (c) uses a Bayer sensor, where the plotted MTF was obtained from linear interpolation of the 550-nm and 630-nm measurements in Fig. 2(d).

As further evidence, an AWARE-2A microcamera was designed and assembled in an attempt to eliminate all the issues identified in Sec. 3: glass spherical lenses were used to eliminate the birefringence and surface form errors, circular-aperture optics were used to reduce stray light, and active alignment was used to eliminate the assembly errors. Figure 7(b) shows the MTF performance of the AWARE-2A microcamera. Although the designed performance of this microcamera is slightly worse than that of the AWARE-2 microcamera (red lines), AWARE-2A provides an existence proof that near-ideal imaging performance can be achieved when the issues identified in Sec. 3 are adequately addressed. While the active alignment approach used for AWARE-2A provides ultra-precise alignment, with less than ∼10 μm of total indicator runout for each optical element in the barrel, it proves prohibitively expensive in time and cost for AWARE-type applications.

The AWARE-2 Retrofit microcamera follows an optical design approach similar to that of the AWARE-2A, using round glass spherical lenses, but features improved barrel and optical-seat designs that achieve the necessary tolerances through a simple passive alignment process. Furthermore, unlike the AWARE-2 and AWARE-2A microcameras, the AWARE-2 Retrofit adopts color image sensors and a focus mechanism that moves a subset of lenses with respect to a fixed image sensor. In the earlier designs, moving the sensor with respect to the static optical barrel required flexible electrical readout cables attached to a sensor packed in a tight space. The cable assembly under this constraint turned out to be mechanically stiff, making it difficult to constrain the sensor motion to pure translation along the optical axis; the resulting translation mechanism was bulky and required substantial actuator force. The AWARE-2 Retrofit optics were instead designed so that translation of the last lens group (two lenses) produces a focal shift of the entire system. We designed a concentric carriage to ensure pure translation even when the actuator pushes on one edge of the carriage. The net force required for this translation is below 0.03 N, which is easily achieved with a range of small motor mechanisms. Furthermore, the AWARE-2 Retrofit design is nearly telecentric on the sensor side, so that focal adjustment by the moving lens group does not change the magnification of the acquired image. This feature is crucial in simplifying the stitching of the resulting images.
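The benefit of image-side telecentricity for stitching can be seen with a simple chief-ray argument: when the chief ray arrives parallel to the optical axis, a small defocus changes the blur but not the image height, so magnification is preserved. A toy sketch with illustrative numbers (not the authors' design data):

```python
# Sketch: chief-ray height after a defocus dz. For an image-side telecentric
# system the chief-ray angle at the sensor is ~0, so image height (and hence
# magnification) is unchanged by focus adjustment. All values are illustrative.
import math

def image_height(h_chief: float, chief_angle_rad: float, dz_mm: float) -> float:
    """Propagate the chief ray a distance dz past nominal focus."""
    return h_chief + dz_mm * math.tan(chief_angle_rad)

h0 = 2.0  # nominal image height in mm (illustrative)
print(image_height(h0, 0.0, 0.1))               # telecentric: height unchanged
print(image_height(h0, math.radians(5), 0.1))   # 5 deg chief ray: height shifts
```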

The comparison of these two assembly approaches using the experimental results of the AWARE-2A and AWARE-2 Retrofit microcameras [Figs. 7(b) and 7(c)] indicates that a combination of optical and optomechanical designs with better assembly tolerances can achieve sensor-limited performance in a passively assembled microcamera. We therefore adopt the AWARE-2 Retrofit microcamera as the natural design of choice for high-performance AWARE systems, in which both optical performance and assembly cost are optimized.

We characterized the imaging performance of the AWARE-2 Retrofit microcamera over the entire FOV by measuring the MTF from multiple Imatest SFR chart images located across the field. Figure 8(a) shows a typical example of the spatial frequency map across the sensor, where the MTF values are calculated from the slanted edges of the various squares shown in Fig. 6(c). This plot shows a color map of the frequency at which 20% MTF is obtained at each location on the sensor when the microcamera focus is optimized. The color map is generated by interpolating between experimental data points extracted from the vertical (black circles) and horizontal (white circles) slanted edges of the squares. In this measurement, the image quality is quite uniform across the entire sensor surface, with 20% MTF achieved at spatial frequencies of at least 210 lp/mm at every location on the sensor. Figure 8(b) shows the best (green dots) and worst (orange dots) MTF performance, corresponding to locations 1 and 2 indicated in Fig. 8(a), respectively. This demonstrates that AWARE-2 Retrofit microcameras with high imaging performance across the entire FOV can readily be constructed.
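The slanted-edge method behind these measurements differentiates an edge-spread function (ESF) into a line-spread function (LSF), whose Fourier transform magnitude gives the MTF. A greatly simplified sketch in the spirit of the ISO 12233 / Imatest SFR procedure (no sub-pixel edge-angle binning or windowing; not the authors' analysis code):

```python
# Minimal slanted-edge MTF sketch: ESF -> LSF (derivative) -> |FFT|,
# normalized to unity at DC. A synthetic edge stands in for chart data.
import numpy as np

def mtf_from_edge(esf: np.ndarray) -> np.ndarray:
    """Edge-spread function -> normalized MTF via the line-spread function."""
    lsf = np.diff(esf)                  # differentiate ESF to get LSF
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                 # normalize so MTF(0) = 1

# Synthetic edge: ideal step blurred linearly over ~3 samples
x = np.arange(-32, 32)
edge = np.clip((x + 1.5) / 3.0, 0.0, 1.0)
m = mtf_from_edge(edge)
print(m[:4])  # MTF rolls off from 1.0 at DC
```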

Fig. 8 (a) A color map of spatial frequency at MTF 20% measured in a typical AWARE-2 Retrofit microcamera at optimal focus. The plot was generated by linearly interpolating the experimental data (circles) of spatial frequencies at 20% of the MTF. The data points in black (white) circles are measured from vertical (horizontal) slanted edges. (b) Measured MTF curves at the best (green dots) and worst (orange dots) optical performance, corresponding to location 1 and 2 shown in (a), respectively.

5. Conclusion

We presented characterization results for various aspects of the AWARE-2 microcameras, including the image sensor MTF, the birefringence and surface form of the constituent lenses, stray light, and optical alignment. Simulating the optical performance with the measured imperfections allowed us to assess their impact on the system MTF. We demonstrated that these manufacturing issues can be addressed by an optimized design using glass spherical lenses and passive alignment: our latest AWARE-2 Retrofit microcamera achieves imaging resolution close to the ideal design values.

Acknowledgments

The authors gratefully acknowledge helpful discussion with Scott Cahall at Moondog Optics. This work was supported by the DARPA MTO AWARE program under contract HR0011-10-C-0073.

References and links

1. D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486, 386–389 (2012). [CrossRef] [PubMed]
2. D. J. Brady and N. Hagen, “Multiscale lens design,” Opt. Express 17, 10659–10674 (2009). [CrossRef] [PubMed]
3. D. L. Marks and D. J. Brady, “Gigagon: a monocentric lens design imaging 40 gigapixels,” Imaging Systems 2010, paper ITuC2 (OSA, 2010). [CrossRef]
4. H. S. Son, D. L. Marks, J. Hahn, J. Kim, and D. J. Brady, “Design of a spherical focal surface using close-packed relay optics,” Opt. Express 19, 16132–16138 (2011). [CrossRef] [PubMed]
5. H. S. Son, A. Johnson, R. A. Stack, J. M. Shaw, P. McLaughlin, D. L. Marks, D. J. Brady, and J. Kim, “Optomechanical design of multiscale gigapixel digital camera,” Appl. Opt. 52, 1541–1549 (2013). [CrossRef] [PubMed]
6. D. R. Golish, E. M. Vera, K. J. Kelly, Q. Gong, P. A. Jansen, J. M. Hughes, D. S. Kittle, D. J. Brady, and M. E. Gehm, “Development of a scalable image formation pipeline for multiscale gigapixel photography,” Opt. Express 20, 22048–22062 (2012). [CrossRef] [PubMed]
7. S.-H. Youn, D. L. Marks, P. O. McLaughlin, D. J. Brady, and J. Kim, “Efficient testing methodologies for microcameras in a gigapixel imaging system,” Proc. SPIE 8788, 87883B (2013). [CrossRef]
8. D. S. Kittle, D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, “A testbed for wide-field, high-resolution, gigapixel-class cameras,” Rev. Sci. Instrum. 84, 053107 (2013). [CrossRef] [PubMed]
9. G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems (SPIE Press, 2001). [CrossRef]
10. T. Nakamura, D. S. Kittle, S. H. Youn, S. D. Feller, J. Tanida, and D. J. Brady, “Autofocus for multiscale gigapixel camera,” Appl. Opt. 52, 8146–8153 (2013). [CrossRef]
11. S. Bäumer, Handbook of Plastic Optics (Wiley-VCH, 2010). [CrossRef]
12. D. G. LeGrand and J. T. Bendler, Handbook of Polycarbonate Science and Technology (Marcel Dekker, 2000).
13. H. E. Lai and P. J. Wang, “Study of process parameters on optical qualities for injection-molded plastic lenses,” Appl. Opt. 47, 2017–2027 (2008). [CrossRef] [PubMed]
14. ASTM International D4093-95, standard for the measurement of birefringence.
15. L. Deck and P. de Groot, “High-speed non contact profiler based on scanning white-light interferometry,” Appl. Opt. 33, 7334–7338 (1994). [CrossRef] [PubMed]
16. B. Dörband, H. Müller, and H. Gross, Handbook of Optical Systems, Metrology of Optical Components and Systems (Wiley-VCH, 2012).
17. S. Matsuda and T. Nitoh, “Flare as Applied to Photographic Lenses,” Appl. Opt. 11, 1850–1856 (1972). [CrossRef] [PubMed]
18. B. Ma, K. Sharma, K. P. Thompson, and J. P. Rolland, “Mobile device camera design with Q-type polynomials to achieve higher production yield,” Opt. Express 21, 17454–17463 (2013). [CrossRef] [PubMed]

OCIS Codes
(110.3000) Imaging systems : Image quality assessment
(110.4100) Imaging systems : Modulation transfer function
(120.4610) Instrumentation, measurement, and metrology : Optical fabrication
(110.1758) Imaging systems : Computational imaging

ToC Category:
Imaging Systems

History
Original Manuscript: December 3, 2013
Revised Manuscript: January 27, 2014
Manuscript Accepted: January 30, 2014
Published: February 7, 2014



