A compact image scanner is designed using a compound eye system with plural optical units in which the ray path is folded by reflective optics. The optical units are aligned in two lines, and each takes an image of a separate, slightly overlapped field of view (FOV). Since the optical units are telecentric in the object space and the magnification ratio is constant regardless of the object distance, the separate pieces of the total image are easily combined with each other even at a defocused position. Since the optical axes of adjacent optical units cross obliquely, the object distance is derived from the parallax at each boundary position, and an adequate deblurring process is applied to the defocused image.
© 2012 OSA
A reduction-type image sensor, which has scanning mirrors, a fixed imaging lens, and a line image sensor, is used as the main reading device of a copier to meet specifications of high resolution and large depth of field (DOF). The reduction-type image sensor is large because the object distance is long, and it needs a complicated driving mechanism to scan the mirrors while keeping the object distance constant.
Compound eye optics has been researched for a long time as a way to reduce the size of optics. A large FOV is divided into many small areas, and plural units of an imaging optical system each take an image of a separate area, thus reducing the total size.
In the second type of compound eye optics, each optical unit forms an independent image on its own image sensor, and those images are combined electronically into a final output image [4–6]. Limiting our discussion to image scanners, there are many reports and patents of image scanners based on this second type [7,8]. However, this type of image scanner has been used only as a contact image sensor (CIS), in which the object distance is constant, because the magnification ratio changes with the object distance, causing mismatches in the image combining process.
We developed a compact image scanner based on the second type of compound eye optics. The DOF is large enough to be used as the main reading device of a copier by applying a telecentric optical system.
2. Basic construction of our compound eye optics and principle of large depth of field
Fig. 1 Conceptual construction of our compound eye scanner.

Figure 1 shows the conceptual construction of our optics to explain the basic idea. Each cylinder in Fig. 1 represents an optical unit. The X direction is called the main scanning direction; the Y direction is called the sub scanning direction, along which a paper as an object is scanned on the top glass. A large FOV along the X direction is divided into small fields of 10 mm length, and each optical unit takes an image of one field. The optical units are aligned in a zigzag arrangement of two lines, A and B. The optical axes are inclined in the Y direction, as shown in Fig. 1(c), so that every optical unit has the same reading position on the top glass in the Y direction. Each optical unit forms an inverted image on its respective image sensor.
Fig. 2 Image combining process.

Figure 2 shows the image combining process. Figure 2(a) shows a Japanese character as an object on the top glass, which straddles the FOVs of the No. 1 and No. 2 optical units. The FOVs overlap each other slightly. The images formed by the two optical units after scanning in the Y direction are shown in Fig. 2(b). The images in Fig. 2(b) share the same part S'S, which is the transcription of the overlapped FOV between the No. 1 and No. 2 optical units. In the electrical image combining process, the images are inverted back and combined so that the overlapped parts coincide.
Cross correlation is calculated in the image combining process. Since the two pictures of the overlapped area differ in coordinates only in the Y direction, the shift Δy is found by searching for the peak of the cross correlation between the two pictures. The value of Δy then yields the value of Δz at every boundary between adjacent optical units and at every position in the Y direction. The images are locally expanded or contracted depending on the local irregularity of the object surface.
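The shift search described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation: the helper name find_dy, the search window, and the synthetic strips are all assumptions.

```python
import numpy as np

def find_dy(strip_a, strip_b, max_shift=20):
    """Estimate the Y shift between two overlap strips by maximizing
    normalized cross correlation (hypothetical helper, brute-force search)."""
    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0
    best_dy, best_score = 0, -np.inf
    for dy in range(-max_shift, max_shift + 1):
        shifted = np.roll(strip_b, dy, axis=0)
        lo, hi = max(0, dy), strip_a.shape[0] + min(0, dy)  # rows valid after the shift
        score = ncc(strip_a[lo:hi], shifted[lo:hi])
        if score > best_score:
            best_dy, best_score = dy, score
    return best_dy

# synthetic test: strip_b is strip_a displaced by 5 rows in the Y direction
rng = np.random.default_rng(0)
a = rng.random((100, 8))
b = np.roll(a, -5, axis=0)
print(find_dy(a, b))  # → 5
```

In a real scanner the shift would be estimated per Y position along each boundary, giving the Δy profile from which Δz is derived.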
Additionally, since Δz is known at every position, the blur caused by local defocusing is easily sharpened. The point spread function at Δz can be obtained from a ray-trace simulation, and the blurred image is deconvolved with that point spread function. This deblurring process extends the DOF, which is defined as the depth range over which the modulation transfer function (MTF) exceeds a threshold, as shown in Fig. 3.

Fig. 3 Extension of depth of field (DOF) by deblurring process.

The threshold value of the MTF is set to 0.3 at half of the Nyquist frequency, for example. The original MTF versus Δz is plotted as graph (a), and its DOF is shown as za in Fig. 3. The deblurring process raises the MTF to graph (b), and the DOF is extended to zb in Fig. 3.
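The deblurring step can be illustrated with a standard FFT-based Wiener deconvolution, a common way to deconvolve an image with a known point spread function. This is a hedged sketch: the paper obtains the PSF from ray tracing, while here a Gaussian PSF merely stands in for it, and the regularization constant k is an assumed value.

```python
import numpy as np

def gaussian_psf(size, sigma):
    """Normalized 2-D Gaussian standing in for the ray-traced PSF at a given defocus."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def wiener_deblur(image, psf, k=1e-4):
    """Wiener deconvolution in the frequency domain: F = G * conj(H) / (|H|^2 + k)."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.fft.fft2(image)
    return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k)))

# blur a vertical bar pattern with the PSF, then restore it
img = np.zeros((64, 64))
img[:, ::8] = 1.0
psf = gaussian_psf(64, sigma=2.0)
H = np.fft.fft2(np.fft.ifftshift(psf), s=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = wiener_deblur(blurred, psf)
# the restored pattern swings over a wider intensity range than the blurred one,
# i.e., the deconvolution recovers contrast at the bar frequencies
```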
3. Design of reflective optical unit
We designed each optical unit as reflective optics that folds the ray path, further reducing the size beyond the compound eye design alone, as shown in Fig. 4.

Fig. 4 Configuration of reflective optical elements in an optical unit from perspective view.

The unit has five optical elements: two flat mirrors (M1 and M2), two concave mirrors (L1 and L2), and an aperture stop. Importantly, the aperture stop is placed at the back focal position of L1 to form an optical system that is telecentric on the object side. M1 and M2 deflect the rays in the Y direction. Since the optical path from L1 to the focal point on the object side is designed to equal the focal length of L1, 20 mm, the rays after L1 are collimated. L1 and L2 have the same curvature for easy trial manufacture, and L2 is placed 20 mm after the aperture stop; therefore the image side is also telecentric, and the magnification ratio of the optical unit equals one.
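The telecentric behavior can be checked with a simple paraxial sketch, treating the concave mirrors of the unfolded path as thin lenses of focal length f = 20 mm (an idealization for illustration, not the authors' ray-trace model).

```python
import numpy as np

f = 20.0  # focal length of L1 and L2 in mm (given in the text)

def prop(d):
    """Free-space propagation over distance d (2x2 ray-transfer matrix)."""
    return np.array([[1.0, d], [0.0, 1.0]])

def lens(f):
    """Thin-lens matrix standing in for an unfolded concave mirror of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# object plane -> L1 -> aperture stop; the stop sits at the back focal plane of L1
obj_to_stop = prop(f) @ lens(f) @ prop(f)

# a ray leaving the object parallel to the axis (angle u = 0) at height 5 mm
y_stop, u_stop = obj_to_stop @ np.array([5.0, 0.0])
print(y_stop)  # 0.0: the chief ray passes the stop center, i.e., object-side telecentric

# full unfolded system: object -> L1 -> stop -> L2 -> image plane (L2 at f after the stop)
system = prop(f) @ lens(f) @ prop(f) @ obj_to_stop
print(system[0, 0], system[0, 1])  # -1.0 0.0: inverted image with unit magnification,
                                   # independent of the input ray angle (B = 0)
```

The A = -1 element of the full matrix reproduces the inverted, unit-magnification image stated in the text.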
Fig. 7 Projection view of optical elements in the Y-Z plane.

Figure 7 shows the projection view of those optical elements in the Y-Z plane. A thin illumination module composed of LED arrays is also shown in Fig. 7. The optical track size is compact: 60 mm in the Y direction and 23.5 mm in the Z direction. The optical units are placed in alternately inverted directions, and the optical elements are arranged so as to avoid interference between elements of adjacent optical units.
Fig. 8 MTF at 6 lp/mm versus defocus position in object space by simulation analysis.

Figure 8 shows the designed MTF versus Δz in the object space at a spatial frequency of ν = 6 line pairs/mm (lp/mm). We define our MTF specification at 6 lp/mm, half of the Nyquist frequency of a 600 dots per inch (dpi) sensor pixel density. Lines (a) and (b) are calculated at object height 0 in the X and Y directions, respectively, and lines (c) and (d) are calculated at object height 5 mm in the X and Y directions, respectively. Since the rays are obliquely incident on L1 and L2, the optical unit has slight astigmatism, which appears as the difference between the peak positions of lines (a) and (b) in Fig. 8. However, the MTF of every line in Fig. 8 exceeds 0.3 over a wide range of 5 mm on the horizontal axis.
Fig. 9 Assembly of our compound eye optics. (a) The optical units are put into a frame in alternately inverted directions. (b) The optical units are arranged in the same way as in Fig. 6.

Figure 9 shows the assembly of our compound eye optics. The 14 optical units are put into a frame, as shown in Fig. 9(a), in alternately inverted directions, as shown in Fig. 9(b). An illumination module and an image sensor board are attached above and below the assembled imaging optics, respectively, and then a driver circuit and a signal processing circuit are connected. Paper charts are pasted onto a cylindrical rotating drum, and our image scanner is fixed in front of the drum and takes images of the charts.
Fig. 10 Example of the image combining process. (a) Before the combining process; each image is inverted. (b) After the combining process.

Figure 10 shows four pictures taken by four optical units in such an experiment. The horizontal and vertical directions correspond to the X and Y directions in Fig. 1, respectively. There are overlapped areas at each boundary, as seen in Fig. 10(a). Figure 10(b) is the result of the combining process, in which each piece of the picture is shifted only in the Y direction so that the boundary areas of the four pictures coincide.
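A toy version of this shift-and-join step might look as follows; combine_strips is a hypothetical helper, and np.roll wraps around, so a real implementation would discard the wrapped rows.

```python
import numpy as np

def combine_strips(strips, dys, overlap):
    """Shift each strip by its measured dy in Y, drop the duplicated overlap
    columns, and join the strips along X (hypothetical helper)."""
    out = []
    for i, strip in enumerate(strips):
        s = np.roll(strip, dys[i], axis=0)  # Y shift found by cross correlation
        if i > 0:
            s = s[:, overlap:]              # remove the duplicated overlap columns
        out.append(s)
    return np.concatenate(out, axis=1)

# toy case: two strips cut from one image with a 4-column overlap,
# the second strip displaced by 3 rows in Y
rng = np.random.default_rng(1)
full = rng.random((50, 20))
strip1 = full[:, :12]
strip2 = np.roll(full[:, 8:], 3, axis=0)
combined = combine_strips([strip1, strip2], dys=[0, -3], overlap=4)
print(np.array_equal(combined, full))  # → True
```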
Fig. 11 Images of a resolution chart. Each image is combined from the images of three optical units.

Figure 11 shows output images of a part of a resolution chart at several object distances. The width of this part of the resolution chart is 25 mm; therefore each image is combined from the images of three optical units. The number n in the images indicates the number of lines and spaces per inch at that vertical position. Therefore n is converted to ν by Eq. (2):

ν = n / (2 × 25.4),     (2)

since one line pair consists of one line and one space and 1 inch equals 25.4 mm. Resolution rapidly degrades with Δz in a conventional CIS with a GRIN lens array, as shown in Fig. 11(a). Meanwhile, in our image scanner the resolution is better than that of the conventional CIS even without the deblurring process, as shown in Fig. 11(b). Though the image at Δz = 7 mm in Fig. 11(b) is blurred, the resolution can be restored by the deblurring process, as shown in Fig. 11(c).
Fig. 12 Contrast calculated from the images at Δz = 7 mm in Fig. 11(b), before and after the deblurring process.

Figure 12 shows the contrast C(ν) of the images at Δz = 7 mm in Fig. 11(b). Here C(ν) is defined by Eq. (3):

C(ν) = (Imax(ν) − Imin(ν)) / (Imax(ν) + Imin(ν)),     (3)

where Imax(ν) and Imin(ν) are the maximum and minimum values of the intensity at ν. We used the contrast C(ν) as an evaluation index instead of an MTF derived from the Fourier transform of a point image, because the point image cannot be resolved by the 600 dpi sensor. The contrast is lower than 0.2 in the range ν > 4 lp/mm before the deblurring process, while the deblurring process raises the contrast above 0.4 in the range ν < 6 lp/mm.
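Equation (3) is straightforward to evaluate on an intensity profile; a minimal sketch, using assumed toy bar-pattern profiles in place of the measured chart images, is:

```python
import numpy as np

def contrast(profile):
    """Contrast C = (Imax - Imin) / (Imax + Imin) of an intensity profile, as in Eq. (3)."""
    i_max, i_min = profile.max(), profile.min()
    return (i_max - i_min) / (i_max + i_min)

# toy sinusoidal profiles standing in for one spatial frequency of the chart
x = np.linspace(0.0, 2.0 * np.pi, 101)
sharp = 0.5 + 0.5 * np.cos(x)    # full swing 0..1
blurred = 0.5 + 0.1 * np.cos(x)  # reduced swing 0.4..0.6
print(contrast(sharp), contrast(blurred))  # ≈ 1.0 and ≈ 0.2
```

In practice the profile at each ν would be read from the corresponding bar group of the chart image.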
We proposed compound eye optics composed of plural reflective optical units for a compact image scanner. Since each optical unit is telecentric in the object space and the magnification ratio is constant regardless of the object distance, no degradation occurs in the image combining process, and the original DOF of the optical unit is conserved after the image combining process. Since the optical axes of adjacent optical units cross obliquely, the images are shifted only in the sub scanning direction between adjacent optical units, and the object distance can be calculated from the shift value of the images. An adequate deblurring process using the object-distance information can restore contrast, which means that the DOF is extended.
We fabricated a prototype that is thin, with an optical track size of 23.5 mm, and demonstrated that the contrast at ν = 6 lp/mm exceeds 0.4 even at a large object distance of 7 mm from the top glass.
References and links
1. R. H. Anderson, “Close-up imaging of documents and displays with lens arrays,” Appl. Opt. 18(4), 477–484 (1979). [CrossRef] [PubMed]
2. J. Meyer, A. Brückner, R. Leitel, P. Dannberg, A. Bräuer, and A. Tünnermann, “Optical cluster eye fabricated on wafer-level,” Opt. Express 19(18), 17506–17519 (2011). [CrossRef] [PubMed]
3. M. Kawazu and Y. Ogura, “Application of gradient-index fiber arrays to copying machines,” Appl. Opt. 19(7), 1105–1112 (1980). [CrossRef] [PubMed]
4. J. Tanida, T. Kumagai, K. Yamada, S. Miyatake, K. Ishida, T. Morimoto, N. Kondou, D. Miyazaki, and Y. Ichioka, “Thin observation module by bound optics (TOMBO) concept and experimental verification,” Appl. Opt. 40(11), 1806–1813 (2001). [CrossRef] [PubMed]
5. A. Brückner, J. Duparré, R. Leitel, P. Dannberg, A. Bräuer, and A. Tünnermann, “Thin wafer-level camera lenses inspired by insect compound eyes,” Opt. Express 18(24), 24379–24394 (2010). [CrossRef] [PubMed]
6. G. Druart, N. Guérineau, R. Haïdar, S. Thétas, J. Taboury, S. Rommeluère, J. Primot, and M. Fendler, “Demonstration of an infrared microcamera inspired by Xenos peckii vision,” Appl. Opt. 48(18), 3368–3374 (2009). [CrossRef] [PubMed]
7. I. Maeda, T. Inokuchi, and T. Miyashita, US patent 4776683 (1988).
8. K. Nagatani, K. Morita, H. Okushiba, S. Kojima, and R. Sakaguchi, US patent 5399850 (1995).
(040.1240) Detectors : Arrays
(110.0110) Imaging systems : Imaging systems
(120.5800) Instrumentation, measurement, and metrology : Scanners
(220.0220) Optical design and fabrication : Optical design and fabrication
Original Manuscript: March 2, 2012
Revised Manuscript: April 11, 2012
Manuscript Accepted: April 18, 2012
Published: June 1, 2012
Hiroyuki Kawano, Tatsuki Okamoto, Taku Matsuzawa, Hajime Nakajima, Junko Makita, Naoyuki Fujiyama, Eiji Niikura, Tatsuya Kunieda, and Tadashi Minobe, "Compact image scanner with large depth of field by compound eye system," Opt. Express 20, 13532-13538 (2012)