## Improved resolution 3D object reconstruction using computational integral imaging with time multiplexing

Optics Express, Vol. 12, Issue 19, pp. 4579-4588 (2004)

http://dx.doi.org/10.1364/OPEX.12.004579


### Abstract

In the computational three-dimensional (3D) volumetric reconstruction integral imaging (II) system, volume pixels of the scene are reconstructed by superimposing the inversely mapped elemental images through a computationally simulated optical reconstruction process based on ray optics. Placing a 3D object close to the lenslet array in the pickup process may cause significant intensity variation between adjacent pixels of the reconstructed image, degrading image quality. These intensity differences arise because different numbers of superimposed elemental images are used to reconstruct the corresponding pixels. In this paper, we propose two ways to improve the reconstructed image quality: 1) normalized computational 3D volumetric reconstruction II, and 2) a hybrid moving array lenslet technique (MALT). To reduce the intensity irregularities between pixels, we normalize the intensities of the reconstructed image pixels by the number of overlapping inversely mapped elemental images. To capture the elemental image sets for the MALT process, a stationary 3D object pickup process is performed repeatedly at various positions of the pickup lenslet array in its focal plane, which is perpendicular to the optical axis. With MALT, we enhance the quality of the reconstructed images by increasing the sampling rate. We present experimental results of volume pixel reconstruction to test and verify the performance of the proposed reconstruction algorithm, and show that it yields substantial improvement in the visual quality of the 3D reconstruction.

© 2004 Optical Society of America

## 1. Introduction

## 2. Overview

### 2.1 3D volumetric reconstruction using computational II

Each elemental image is inversely mapped through the synthesized pinhole array onto the reconstruction image plane at *z* for *M* > 1, where *M* is the magnification factor. As shown in Fig. 3, it is the ratio of the distance between the synthesized pinhole array and the reconstruction image plane at *z* to the distance *g* between the synthesized pinhole array and the elemental image plane, that is, *M* = *z*/*g*. The intensity at the reconstruction plane is inversely proportional to the square of the distance between the elemental image and the reconstruction plane. The inverse mappings of all the elemental images corresponding to the magnification factor *M* form a single image at any reconstruction image plane *z* = *L*. The 3D volume information is formed by repeating this process for all reconstruction planes of interest with different distance information. Therefore, all of the information in the recorded elemental images is used to reconstruct a full 3D scene, which requires only simple inverse mapping and superposition operations.
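The inverse mapping and superposition described above can be sketched in a few lines of Python. This is a hypothetical helper, not the authors' code: integer magnification and top-left-aligned placement of each magnified elemental image are simplifying assumptions, and the per-pixel overlap count it returns is what the normalization discussed later divides by.

```python
import numpy as np

def reconstruct_plane(elemental, z, g):
    """Superimpose inversely mapped elemental images at depth z (M = z/g).

    elemental : (rows, cols, s, s) array of grayscale elemental images.
    Returns the summed intensity image and the per-pixel overlap count.
    Ray-optics sketch with nearest-neighbour magnification; the exact
    pixel geometry of the paper's figures is simplified here.
    """
    rows, cols, s, _ = elemental.shape
    M = max(1, int(round(z / g)))            # magnification factor M = z/g
    H = rows * s + (M - 1) * s               # canvas large enough for all maps
    W = cols * s + (M - 1) * s
    total = np.zeros((H, W))
    count = np.zeros((H, W))
    for p in range(rows):
        for q in range(cols):
            # magnify the (p, q)-th elemental image by M (nearest neighbour)
            big = np.kron(elemental[p, q], np.ones((M, M)))
            y, x = p * s, q * s              # top-left of this inverse map
            total[y:y + M * s, x:x + M * s] += big
            count[y:y + M * s, x:x + M * s] += 1
    return total / (z + g) ** 2, count       # 1/d^2 intensity fall-off
```

Repeating this for every *z* of interest yields the full 3D volume; the varying overlap count near the borders is exactly the source of the intensity irregularities the paper sets out to remove.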

### 2.2 Moving array lenslet technique

Let *v*_{x} and *v*_{y} be the velocities of the lenslet array movement along the *x* and *y* axes, respectively. Then, for a stationary object, the velocities in each direction should satisfy *v*_{x} > *p*_{x}*S* and *v*_{y} > *p*_{y}*S*, where *S* is the inverse of the electronic shutter speed of a CCD or the inverse of the response time of the human eye, and *p*_{x}, *p*_{y} are the lenslet pitches along the *x* and *y* axes, respectively. As the lenslet array moves, the elemental images change and acquire different perspectives within one lenslet pitch. Different elemental image sets are recorded at different sampling points by the 2D image sensor. For optical reconstruction, these recorded elemental image sets are integrated in the time domain by displaying them on an SLM or LCD at the same sampling speed used for the elemental images in the pickup procedure.

## 3. 3D volumetric reconstruction with improved resolution

*mm*. The elemental images of the object are captured with the digital camera and the pickup microlens array. The microlens array used in the experiments has 53 × 53 square refractive lenses in a 55 *mm* square area. The size of each lenslet is 1.09 *mm* × 1.09 *mm*, with less than 7.6 *μm* separation. The focal length of each microlens is 3.3 *mm*. The size of each captured elemental image is 70 pixels × 70 pixels.

### 3.1 Hybrid MALT

Let *k* be the total number of elemental image sets captured with the optical MALT pickup process. Each unique elemental image set is obtained by moving the pickup lenslet array within one pitch of the lenslet. Figure 3 illustrates the lateral-axis coordinates of the reconstruction plane according to the *p*-th elemental image at (*x*, *z*) for the *i*-th elemental image set; (*x*, *y*) are the lateral coordinates in the reconstructed image plane, and *z* is the longitudinal coordinate. In this example, compared with the 1^{st} lenslet array, the *i*-th lenslet array has been moved downward. To represent color images, we capture the elemental images and reconstruct the 3D image for three different wavelengths (blue, red, and green). Let *O*^{i}_{pq} denote the *p*-th row, *q*-th column elemental image of the *i*-th elemental image set, and *O*^{i}_{pq}(*x*, *y*, *z*; *λ*) the inversely mapped image of that elemental image at (*x*, *y*, *z*) for each wavelength.

*O*^{i}_{pq}(*x*, *y*, *z*; *λ*) is formed by the *p*-th row, *q*-th column microlens for the *i*-th position of the pickup microlens array. Therefore, it is also a function of the wavelength of the rays emanating from the object [13]. Let *x*_{i}/*M* and *y*_{i}/*M* be the relative displacements of the lenslet array in the *x* and *y* directions for the *i*-th elemental image set with respect to the 1^{st} elemental image set. The corresponding displacements of *O*^{i}_{pq}(*x*, *y*, *z*; *λ*) at the reconstructed image plane are then (*x*_{i}, *y*_{i}) in the (*x*, *y*) directions, respectively. Therefore, *O*^{i}_{pq}(*x*, *y*, *z*; *λ*) can be represented as the inverse mapping of the corresponding elemental image displaced by (*x*_{i}, *y*_{i}) [Eq. (1)], where *s*_{x} and *s*_{y} are the sizes of an elemental image in the *x* and *y* directions, respectively. The relative displacements *x*_{i}, *y*_{i} for the 1^{st} elemental image set are zero. For the *i*-th elemental image set, the reconstructed 3D image at (*x*, *y*, *z*) is the superposition of all the inversely mapped elemental images [Eq. (2)], where *m* and *n* are the numbers of elemental images in the *x* and *y* directions, respectively. From Eq. (2) we can obtain the computational hybrid MALT image at display distance *z* as a superposition of all *k* computationally reconstructed images with their respective displacements, where *k* is the total number of pickup MALT steps, that is, the total number of recorded elemental image sets used to reconstruct the 3D scene.

If *x*_{i}/*M* or *y*_{i}/*M* exceeds the pixel size of one pitch, we can take the modular value of the displacement, that is, *x*_{i}/*M* mod (pitch) or *y*_{i}/*M* mod (pitch), which is less than one pitch. For elemental image set 12, the pixel displacement in the *x* direction is 72, which can be treated as 72 mod 70 = 2 pixels of displacement.
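The modular wrapping of an oversized displacement is a one-liner; this tiny helper (a hypothetical name, added for illustration) reproduces the example from the text:

```python
def effective_shift(displacement_px, pitch_px=70):
    """Wrap a MALT pixel displacement into one lenslet pitch.

    A shift x_i / M that exceeds one pitch is equivalent to its value
    modulo the pitch, so only the sub-pitch remainder matters.
    """
    return displacement_px % pitch_px

# Elemental image set 12: a 72-pixel shift with a 70-pixel pitch
effective_shift(72)   # -> 2
```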

### 3.3 3D volumetric reconstruction using normalized hybrid MALT

After capturing the *k* different elemental image sets, we perform computational MALT to obtain a 3D scene. At the display distance *z*, the 12 different reconstructed images obtained with the computational II technique are normalized and imported into the computational MALT reconstruction process to improve the resolution of the image at that distance. The 3D scene is obtained by superimposing the individually computed normalized II images. Figure 8 shows the 3D object reconstructed by MALT at *z* = 7 *mm* after normalization.
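The normalize-then-superimpose step can be sketched as follows. This is a hypothetical helper under the same ray-optics simplifications as before; integer pixel shifts are an assumption made so that a plain array roll can undo each MALT displacement.

```python
import numpy as np

def normalized_malt(planes, counts, shifts):
    """Combine k reconstructed planes into one normalized hybrid-MALT image.

    planes : k summed reconstruction images at the same depth z
    counts : matching per-pixel overlap counts (number of superimposed
             elemental images at each pixel), used for normalization
    shifts : k integer (dy, dx) MALT displacements (x_i, y_i)
    """
    acc = np.zeros_like(planes[0], dtype=float)
    for img, cnt, (dy, dx) in zip(planes, counts, shifts):
        normalized = img / np.maximum(cnt, 1)      # remove intensity irregularities
        acc += np.roll(normalized, shift=(-dy, -dx), axis=(0, 1))  # undo MALT shift
    return acc / len(planes)                       # superimpose the k contributions
```

Dividing by the overlap count first, then averaging the shifted planes, mirrors the two stages of the proposed method: normalization removes the pixel-to-pixel intensity jumps, and the MALT superposition raises the effective sampling rate.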

Figure 9(a) shows the reconstructed image at a distance of *z* = 7 *mm*, where the right headlight area is well focused. Figure 9(b) shows the reconstructed image at a distance of *z* = 9 *mm*, which focuses on the front emblem of the car. Figure 9(c) shows the reconstructed image at a distance of *z* = 11 *mm*, where we can clearly see the left headlight and the rear part of the right front wheel cap. Figure 9(d) shows the reconstructed image at a distance of *z* = 21 *mm*, where we can see the rear area of the car; in this case, the front area of the car is blurred. Figure 10 is a movie of a series of reconstructed 3D volume images obtained with the proposed computational II, from the image display plane at *z* = 6 *mm* to the image display plane at *z* = 30 *mm* in increments of 0.1 *mm*. Using the proposed normalized computational II reconstruction with hybrid MALT, it is possible to obtain a full 3D volume reconstructed image with improved resolution.

## 4. Conclusion

## References and Links

1. S. A. Benton, ed., *Selected Papers on Three-Dimensional Displays* (SPIE Optical Engineering Press, Bellingham, WA, 2001).
2. D. H. McMahon and H. J. Caulfield, “A technique for producing wide-angle holographic displays,” Appl. Opt. **9**, 91–96 (1970). [CrossRef] [PubMed]
3. P. Ambs, L. Bigue, R. Binet, J. Colineau, J.-C. Lehureau, and J.-P. Huignard, “Image reconstruction using electro-optic holography,” in *Proceedings of the 16th Annual Meeting of the IEEE Lasers and Electro-Optics Society (LEOS 2003)*, Vol. 1 (IEEE, Piscataway, NJ, 2003), pp. 172–173.
4. N. Davies, M. McCormick, and M. Brewin, “Design and analysis of an image transfer system using microlens array,” Opt. Eng. **33**, 3624–3633 (1994). [CrossRef]
5. M. Martínez-Corral, M. T. Caballero, and A. Pons, “Axial apodization in 4Pi-confocal microscopy by annular binary filters,” J. Opt. Soc. Am. A **19**, 1532–1536 (2002). [CrossRef]
6. B. Javidi and F. Okano, eds., *Three Dimensional Television, Video, and Display Technologies* (Springer, Berlin, 2002).
7. J. W. V. Gissen, M. A. Viergever, and C. N. D. Graff, “Improved tomographic reconstruction in seven-pinhole imaging,” IEEE Trans. Med. Imag. **MI-4**, 91–103 (1985). [CrossRef]
8. L. T. Chang, B. Macdonald, and V. Perez-Mendez, “Axial tomography and three dimensional image reconstruction,” IEEE Trans. Nucl. Sci. **NS-23**, 568–572 (1976). [CrossRef]
9. T. Okoshi, *Three-Dimensional Imaging Techniques* (Academic Press, New York, 1976).
10. G. Lippmann, “La photographie intégrale,” C. R. Acad. Sci. **146**, 446–451 (1908).
11. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. **21**, 171–176 (1931). [CrossRef]
12. D. L. Marks and D. J. Brady, “Three-dimensional source reconstruction with a scanned pinhole camera,” Opt. Lett. **23**, 820–822 (1998). [CrossRef]
13. J. W. Goodman, *Introduction to Fourier Optics* (McGraw-Hill, New York, NY, 1996).
14. H. Arimoto and B. Javidi, “Integral three-dimensional imaging with digital reconstruction,” Opt. Lett. **26**, 157–159 (2001). [CrossRef]
15. Y. Frauel and B. Javidi, “Digital three-dimensional image correlation by use of computer-reconstructed integral imaging,” Appl. Opt. **41**, 5488–5496 (2002). [CrossRef] [PubMed]
16. J.-S. Jang and B. Javidi, “Formation of orthoscopic three-dimensional real images in direct pickup one-step integral imaging,” Opt. Eng. **42**, 1869–1870 (2003). [CrossRef]
17. J. Arai, F. Okano, H. Hoshino, and I. Yuyama, “Gradient-index lens-array method based on real time integral photography for three-dimensional images,” Appl. Opt. **37**, 2034–2045 (1998). [CrossRef]
18. A. Stern and B. Javidi, “Three-dimensional image sensing and reconstruction with time-division multiplexed computational integral imaging,” Appl. Opt. **42**, 7036–7042 (2003). [CrossRef] [PubMed]
19. C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. **58**, 71–76 (1968). [CrossRef]
20. T. Okoshi, “Optimum design and depth resolution of lens sheet and projection type three dimensional displays,” Appl. Opt. **10**, 2284–2291 (1971). [CrossRef] [PubMed]
21. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A **15**, 2059–2065 (1998). [CrossRef]
22. J.-S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. **27**, 324–326 (2002). [CrossRef]
23. S. Hong, J.-S. Jang, and B. Javidi, “Three-dimensional volumetric object reconstruction using computational integral imaging,” Opt. Express **12**, 483–491 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-3-483. [CrossRef] [PubMed]

**OCIS Codes**

(080.0080) Geometric optics : Geometric optics

(100.6890) Image processing : Three-dimensional image processing

(110.0110) Imaging systems : Imaging systems

(110.6880) Imaging systems : Three-dimensional image acquisition

**ToC Category:**

Research Papers

**History**

Original Manuscript: August 11, 2004

Revised Manuscript: September 10, 2004

Published: September 20, 2004

**Citation**

Seung-Hyun Hong and Bahram Javidi, "Improved resolution 3D object reconstruction using computational integral imaging with time multiplexing," Opt. Express **12**, 4579-4588 (2004)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-12-19-4579
