## Performance of 3D integral imaging with position uncertainty

Optics Express, Vol. 15, Issue 19, pp. 11889-11902 (2007)

http://dx.doi.org/10.1364/OE.15.011889


### Abstract

We present theoretical and simulation results on the sensitivity of the Synthetic Aperture Integral Imaging (SAII) technique to pickup position uncertainty. SAII is a passive three-dimensional imaging technique based on multiple image acquisitions from different perspectives of the scene under incoherent or natural illumination. In practical SAII applications, there is always an uncertainty associated with the position at which each sensor captures its elemental image. We present a theoretical analysis that quantifies the image degradation in terms of the Mean Square Error (MSE) metric. Simulation results are also presented to identify the parameters affecting the reconstruction degradation and to confirm the analysis. We show that in SAII with a given uncertainty in the sensor locations, the high spatial frequency content of the 3D reconstructed images is most degraded. We also show an inverse relationship between the reconstruction distance and the degradation metric. To the best of our knowledge, this is the first time that the effects of sensor position uncertainty on 3D computational reconstruction in synthetic aperture integral imaging systems have been quantitatively analyzed.

© 2007 Optical Society of America

## 1. Introduction


## 2. Computational reconstruction in integral imaging


Each elemental image is flipped and shifted in inverse proportion to the magnification factor (*z*_{0}/*g*), where *z*_{0} is the distance between the desired plane of reconstruction and the sensor along the optical axis, and *g* denotes the distance of the image plane from each lens [see Fig. 2]. This approach is more appropriate for SAII, in which each elemental image is captured on a full sensor and has better resolution compared with lenslet-based elemental images. An additional merit of this approach on the computational side is that there is no need to handle magnified images (very large matrices) in the reconstruction process, which greatly reduces the reconstruction time, required resources, and computational burden. The magnification factor, *M*, is given by *z*_{0}/*g*. The original elemental image *O*_{kl}(*x*, *y*) is flipped and shifted according to Eq. (1), in which *S*_{x} and *S*_{y} denote the separation of the sensors in the *x* and *y* directions at the pickup plane, respectively, whereas the subscripts *k* and *l* signify the location of elemental image *O*_{kl} in the pickup grid.

The final reconstruction plane consists of the partial overlap of flipped and shifted elemental images, as given by Eq. (2), where *K* and *L* denote the number of elemental images acquired in the *x* and *y* directions; also, *R* compensates for the intensity variation due to the different distances from the object plane to elemental image *O*_{kl} on the sensor and is given by Eq. (3) [see Fig. 2]. In practice, *R* is dominated by the term (*z*_{0}+*g*)^{2}. This is due to the fact that for reconstruction we always have *MN*_{x} < *z*_{0}, so the term (*z*_{0}+*g*)^{2} dominates in Eq. (3). This condition is equivalent to having an imaging sensor (CCD) that is small in dimension compared with the effective focal length of the imaging optics.
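The shift-and-sum reconstruction described in this section can be sketched as follows. The data layout, the `reconstruct_plane` helper, and the use of `np.roll` for shifting are illustrative assumptions of ours; the paper's exact Eqs. (1)-(3) are not reproduced here, and the intensity weight is approximated by its dominant term 1/(*z*_{0}+*g*)^{2}.

```python
import numpy as np

def reconstruct_plane(elemental, S, z0, g):
    """Shift-and-sum reconstruction of one depth plane (illustrative sketch).

    elemental : dict mapping pickup-grid index (k, l) -> 2D intensity array
    S         : sensor separation (pitch) expressed in pixels
    z0, g     : reconstruction distance and image-plane distance
    Each elemental image is flipped, shifted by (k*S/M, l*S/M) with
    M = z0/g, weighted by 1/R with R approximated by (z0 + g)**2,
    and the overlapping contributions are averaged.
    """
    M = z0 / g                       # magnification factor M = z0/g
    R = (z0 + g) ** 2                # dominant intensity-compensation term
    out, count = None, 0
    for (k, l), img in elemental.items():
        flipped = img[::-1, ::-1]    # flip the elemental image
        dy, dx = int(round(l * S / M)), int(round(k * S / M))
        # np.roll wraps around; a real implementation would pad instead
        shifted = np.roll(np.roll(flipped, dy, axis=0), dx, axis=1)
        out = shifted / R if out is None else out + shifted / R
        count += 1
    return out / count
```

With zero pitch the superposition reduces to the intensity-compensated elemental image itself, which makes the weighting easy to verify.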

## 3. Synthetic Aperture Integral Imaging (SAII)

## 4. Sensitivity analysis of SAII

Assume that a displacement error, Δ*p*, is introduced in the position from which the elemental images of an otherwise accurate SAII pickup are reconstructed. In other words, the *i*th elemental image, *O*_{i}(**p**), is shifted from its original location by Δ*p*, where **p** = (*x*, *y*) denotes the position variable in the sensor plane. For simplicity, we first carry out the analysis for displacements along *x*. Since we eventually want to analyze random displacements that have independent components in the *x* and *y* directions, extension of the results to the more realistic case of two-dimensional position errors is straightforward.

In Eq. (4), *K* denotes the number of elemental images taken in each direction on the synthetic aperture; Δ*p*_{i} denotes the sensor location error associated with the *i*th sensor during pickup; and the magnification factor, *M*, is given by *z*/*g*. We define the difference between Eqs. (2) and (4) as the error metric:

where *E*(.) is the expectation operator and *R* is a function of the system parameters, dominated by (*z*+*g*) in Eq. (3). We proceed with the analysis in the Fourier domain, where the stochastic spatial shifts of the elemental images appear as phase terms. Thus *err*(**p**, *z*) can be written as:

where **f** = (*f*_{x}, *f*_{y}) stands for the couplet of spatial frequencies in the *x* and *y* directions, such that **Ĩ** = *I*(*f*_{x}, *f*_{y}) = *FT*{*I*(**p**)}. Note that at this stage we assume that the random displacements are all in the *x* direction; thus the shift appears as a phase term exp(-*jf*_{x}Δ*p*_{i}/*M*) in Eq. (7), incorporating only *f*_{x}. Henceforth, we use the term error spectrum for |*Err*(**f**, *z*)|^{2}. We show in Appendix A how one obtains the following expression for the expected value of the error spectrum:

where *γ* = *E*{exp(-*jf*_{x}Δ*p*/*M*)} is the moment generating function of the random variable Δ*p* and can be derived for all basic probability distributions, e.g., Gaussian, uniform, Laplacian, etc. [29]. Hereafter, we assume that the camera location error follows a zero-mean Gaussian distribution with variance σ^{2}, i.e., *N*(0, σ^{2}); however, it is straightforward to assume other distributions. For a Gaussian random variable *X* ~ *N*(µ, σ^{2}), the moment generating function is *M*_{x}(*t*) = *E*{*e*^{tX}} = exp(*µt* + *σ*^{2}*t*^{2}/2) [29]; thus, for Δ*p* ~ *N*(0, σ^{2}), *γ* becomes a real number, *γ* = *E*{exp(-*jf*_{x}Δ*p*/*M*)} = exp(-*f*_{x}^{2}*σ*^{2}/2*M*^{2}). Essentially, this parameter depends on the reconstruction distance and on the random behavior of the dislocation errors, and it can be explicitly calculated for any given error distribution. Equation (8) can be further simplified to:
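The closed form *γ* = exp(-*f*_{x}^{2}*σ*^{2}/2*M*^{2}) can be checked numerically against the sample mean of exp(-*jf*_{x}Δ*p*/*M*) over Gaussian draws; a minimal sketch, with parameter values of our choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, M, fx = 0.5, 12.0, 3.0   # error std, magnification, spatial frequency

# Empirical moment generating function E{exp(-j * fx * dp / M)}
dp = rng.normal(0.0, sigma, size=200_000)
gamma_mc = np.mean(np.exp(-1j * fx * dp / M))

# Closed form for a zero-mean Gaussian: gamma = exp(-fx^2 sigma^2 / (2 M^2))
gamma_th = np.exp(-fx**2 * sigma**2 / (2 * M**2))

print(abs(gamma_mc - gamma_th))  # small; gamma is effectively real
```

The imaginary part of the sample mean vanishes up to Monte Carlo noise, confirming that *γ* is real for a zero-mean Gaussian error.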

Taking into account both the *x* and *y* components, Eq. (9) can be extended to the following:

The total MSE is then obtained by integrating the expected error spectrum over all spatial frequencies for the plane at distance *z* in the reconstruction plane:

The term (1-*γ*^{2}) acts as a weight for the energy spectrum of the shifted elemental images. As discussed earlier, in the case of a Gaussian positioning error, (1-*γ*^{2}) = [1-exp(-*f*_{x}^{2}*σ*^{2}/*M*^{2})], which is a high-pass function. This means that the higher spectral components of the elemental images contribute more significantly to the MSE of each plane compared with the energy contained in the low spatial frequencies. In addition, at larger reconstruction distances the stop band of this filter becomes wider, diminishing more of the spectral components of the elemental images and consequently reducing the resulting MSE. The bandwidth of this high-pass filter depends solely on the ratio of the standard deviation of the positioning error distribution, σ, to the magnification factor *M*. In particular, for a Gaussian sensor positioning error, i.e., Δ*p* ~ *N*(0, σ^{2}), the Full Width at Half Maximum (FWHM) is given by:

This width increases with the reconstruction distance *z*, which means one should expect less degradation when reconstructing far objects. This important outcome has been verified and demonstrated in the experimental results section of this paper. Likewise, the second term on the right-hand side of Eq. (12) also filters out the low spatial frequencies from the cross correlation of two distinct flipped and shifted elemental images. This has the same effect as described for the first term, with the only difference that (1-*γ*^{2})^{2} has a larger FWHM compared with (1-*γ*^{2}) in the first term and thus contributes less to the total MSE.
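The high-pass behavior of the weight (1-*γ*^{2}) and the widening of its stop band with magnification can be illustrated numerically. The half-power frequency used below, *f* = (*M*/σ)√(ln 2), follows from setting exp(-*f*^{2}σ^{2}/*M*^{2}) = 1/2; it is our derivation from the Gaussian *γ* above, not a formula quoted from the paper.

```python
import numpy as np

def weight(f, sigma, M):
    """High-pass MSE weight (1 - gamma^2), gamma = exp(-f^2 sigma^2 / (2 M^2))."""
    return 1.0 - np.exp(-(f**2) * sigma**2 / M**2)

sigma = 1.0
f = np.linspace(0, 100, 10001)
for M in (12, 20):                            # magnifications used in the paper's simulations
    w = weight(f, sigma, M)
    f_half = f[np.argmin(np.abs(w - 0.5))]    # numeric half-power frequency
    print(M, f_half, M * np.sqrt(np.log(2)) / sigma)
```

The printed pairs agree: the half-power frequency scales linearly with *M* (and hence with *z* for fixed *g*), so more of the elemental-image spectrum is suppressed at larger reconstruction distances.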

To verify the analysis, we consider a point source imaged through a one-dimensional pickup array of *K* elemental images. Extension to 2D pickup arrays is straightforward. Without loss of generality, we assume the point source to be located at distance *z* from the pickup plane on the optical axis of the first elemental image, i.e., *δ*(*p*); the distance between the image plane and the lens is *g* [see Fig. 2]; also, the camera is linearly translated with pitch *S*_{p}. Thus, according to Eq. (1), the *k*th flipped and shifted elemental image is:

where *p* is the position variable. According to Eq. (19), we have the following expression for the error spectrum:

If Δ*p* ~ *N*(0, σ^{2}), then one has *γ* = exp(-*f*_{x}^{2}*σ*^{2}*g*^{2}/2*z*^{2}), and Eq. (16) can be rearranged to:

A Monte Carlo simulation is performed with Δ*p* ~ *N*(0, 1). The elemental images of the point source located at distance *z* are calculated through geometrical optics. The simulation is repeated for 1000 trials with different random positions. In each trial, the set of elemental images is used to reconstruct the point source at its respective distance, and the result is compared with the reconstruction using correctly positioned sensors. In each trial, the MSE is computed according to Eqs. (6) and (11), and all 1000 MSEs for each reconstruction distance are averaged. The total MSE computed from the Monte Carlo simulation is compared with the MSE calculated using Eq. (17) with the same parameters, *K* = 16, *S*_{p} = 2.5 mm, and Δ*p* ~ *N*(0, 1), for the point source located at distances from *z* = 24 cm to *z* = 40 cm. Figure 3 shows the agreement between the simulation results and the mathematical analysis.
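The point-source Monte Carlo can be sketched in one dimension as below. The Gaussian-blob model of the point source, the grid extent, and the `point_source_mse` helper are our modeling assumptions rather than the paper's exact geometrical-optics computation; the sketch only reproduces the qualitative trend.

```python
import numpy as np

def point_source_mse(K=16, z=240.0, g=20.0, sigma=1.0, trials=200, n=513):
    """1D Monte Carlo MSE for a point source, under simplifying assumptions.

    The point source is modeled as a narrow Gaussian blob.  With correct
    sensor positions, every back-projected elemental image lands at x = 0;
    a pickup error dp ~ N(0, sigma^2) displaces the k-th back-projection
    by dp_k / M, where M = z / g is the magnification.
    """
    rng = np.random.default_rng(0)   # fixed seed: same error draws for every z
    M = z / g
    x = np.linspace(-2.0, 2.0, n)
    blob = lambda c: np.exp(-0.5 * ((x - c) / 0.1) ** 2)
    mses = []
    for _ in range(trials):
        dp = rng.normal(0.0, sigma, size=K)
        ideal = sum(blob(0.0) for _ in range(K))
        noisy = sum(blob(d / M) for d in dp)
        mses.append(np.mean((ideal - noisy) ** 2))
    return float(np.mean(mses))

# MSE decreases with reconstruction distance, mirroring the analysis
print(point_source_mse(z=240.0), point_source_mse(z=400.0))
```

Because the residual shift of each back-projection is Δ*p*/*M*, increasing *z* (and hence *M*) shrinks the shifts and the resulting MSE, which is the inverse relationship discussed above.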

In this simulation, *S*_{x} = *S*_{y} = 2.5 mm and the position displacements follow Δ*p* ~ *N*(0, 1). Figure 4 shows the combination of the reconstruction results from all 1000 trials at 24 cm and 40 cm.

## 5. Experimental results

The camera is translated on an *x*-*y* grid with a pitch of 5 mm in both the *x* and *y* directions. At each node, an elemental image of the scene is captured. The imaging sensor is 22.7×15.6 mm with a 10 µm pixel pitch. The effective focal length of the camera lens is about 20 mm, and elemental images are captured on a planar 16×16 grid. A subset of the elemental images can be seen in Fig. 5(b), each conveying different perspective information. The dimensions of the cars are about 5×2.5×2 cm, whereas the helicopter is about 9×2.5×2 cm in size. In Fig. 6 we show the 3D reconstruction of the scene of Fig. 5(a) at three different object distances according to Eq. (2). At each distance, one of the objects is in focus while the others appear washed out.

Each sensor is nominally located at the grid node {*S*_{x}*k*, *S*_{y}*l*}, about which the sensor position is measured with error (Δ*P*_{x}, Δ*P*_{y}), which we model as two independent random variables, Δ*P*_{x,y} ~ *N*(0, σ^{2}). We use a rectangular grid with equal spacing in the *x* and *y* directions, i.e., a pitch of *S*_{x,y} = *S*_{p} = 5 mm. Thus, the measured position of the sensor capturing elemental image *O*_{kl} is represented by {*S*_{p}(*k* + Δ*P*_{x}/*S*_{p}), *S*_{p}(*l* + Δ*P*_{y}/*S*_{p})}, where *k*, *l* = 0, 1, 2, …, 15.

Since Δ*P*_{x,y} follows *N*(0, σ^{2}), we define the fraction 100σ/*S*_{p} to be the pitch error percentage. Note that Δ*P*_{x,y}/*S*_{p} represents a normalized positioning error metric. We perform computational reconstruction with Eq. (2) to reconstruct a plane of the 3D scene at a specific distance *z* using the distorted camera positions. In order to quantify the degradation due to the dislocation of the cameras, the reconstruction results are compared with those obtained using the correct positions on the equally spaced grid. The Mean Square Error metric is computed using Eq. (6) to measure the difference between these images quantitatively. Figure 7 shows the results of reconstruction using known and random positions, respectively, at *z* = 24 cm with 30% pitch error.

The simulation is repeated 500 times at each distance *z*. As a result, we have 500 reconstructed images of that plane, for which we calculate the MSE [Eq. (6)] against the corresponding reconstruction with correct positions. This simulation is carried out for distances from 24 cm to 40 cm, which corresponds to magnifications, *M*, from 12 to 20. The pitch error was chosen to be 30% in both directions, i.e., σ/*S*_{p} = 0.3. Since Δ*P*_{x,y} is a Gaussian random variable, such an error means that roughly 70% of the time Δ*P*_{x,y} remains less than 0.3 of the pitch (0.3*S*_{p}). Figure 8(a) shows the box-and-whisker diagram of the MSEs for *z* = 24 cm (*M* = 12) to *z* = 40 cm (*M* = 20).
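The "roughly 70%" statement above can be checked directly: for Δ*P* ~ *N*(0, σ^{2}) with σ = 0.3*S*_{p}, the probability that |Δ*P*| stays below 0.3*S*_{p} (one standard deviation) is erf(1/√2) ≈ 0.68. A minimal check, with variable names of our choosing:

```python
import math
import random

Sp = 5.0                    # sensor pitch in mm
sigma = 0.3 * Sp            # 30% pitch error => sigma = 0.3 * Sp
pct = 100 * sigma / Sp      # pitch error percentage, as defined in the text

# Exact probability that |dP| < 0.3 * Sp, i.e. within one standard deviation
p_exact = math.erf(1 / math.sqrt(2))

# Monte Carlo confirmation
random.seed(0)
trials = 100_000
hits = sum(abs(random.gauss(0.0, sigma)) < 0.3 * Sp for _ in range(trials))
print(pct, round(p_exact, 3), hits / trials)   # 30.0 0.683 ~0.68
```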

Each box in Fig. 8(a) corresponds to the plane at *z* = *z*_{0} (*M* = *M*_{0}). The blue box spans the upper and lower quartiles of the MSE, and the dotted blue line extends to the smallest and largest computed MSEs. Following the Monte Carlo procedure, the average of the 500 MSEs is computed at each plane and is shown by the solid red line in Fig. 8(a). This average for each particular plane of the scene is a reasonable estimate of the error one can expect from a 30% camera positioning error. Since *g* is constant, an increase in magnification is equivalent to an increase in reconstruction distance. Note that the variance of the error at each plane decreases as the reconstruction distance increases, and its rate of decrease is greater than that of the average MSE. This fact can be explained using Eq. (12), which shows that the MSE is inversely related to *z*.

As discussed in Section 4, the weighting term (1-*γ*^{2}) emphasizes the contributions of the higher spatial frequencies of the elemental images to the MSE, while suppressing the contributions of the low spatial frequencies. Figure 9 also verifies that the error is larger for objects closer to the pickup plane (sensor).

## 6. Conclusion

## Appendix A.

Δ*p* has a random nature; thus, the expected value of the error spectrum depends on the behavior of this variable, i.e., on the distribution governing the spatial dislocation of the sensors during pickup. Since expectation is a linear operator, one can break down the error expectation as follows:

where *γ* = *E*{exp(-*jf*_{x}Δ*p*/*M*)}, which is the moment generating function of the random variable Δ*p* [29]. Eq. (A.2) reduces to:

Here Δ*p*_{i} denotes the sensor location error associated with the *i*th sensor, which is a random variable; the expected value of the random variable (or of a function of it) is a constant, which can be taken outside the summation. In addition, the sensor location errors are assumed to be independent; thus, we have:


**OCIS Codes**

(000.0000) General : General

(100.3010) Image processing : Image reconstruction techniques

(100.6890) Image processing : Three-dimensional image processing

(110.6880) Imaging systems : Three-dimensional image acquisition

**ToC Category:**

Image Processing

**History**

Original Manuscript: July 9, 2007

Revised Manuscript: August 28, 2007

Manuscript Accepted: August 28, 2007

Published: September 5, 2007

**Citation**

Behnoosh Tavakoli, Mehdi Daneshpanah, Bahram Javidi, and Edward Watson, "Performance of 3D integral imaging with position uncertainty," Opt. Express **15**, 11889-11902 (2007)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-15-19-11889


### References

- S. A. Benton, ed., Selected Papers on Three-Dimensional Displays (SPIE Optical Engineering Press, Bellingham, WA, 2001).
- B. Javidi and F. Okano, eds., Three Dimensional Television, Video, and Display Technologies (Springer, Berlin, 2002).
- T. Okoshi, Three-dimensional Imaging Techniques (Academic Press, New York, 1976).
- B. Javidi, S.-H. Hong, and O. Matoba, "Multi dimensional optical sensors and imaging systems," Appl. Opt. 45, 2986-2994 (2006). [CrossRef] [PubMed]
- M. G. Lippmann, "Epreuves reversibles donnant la sensation durelief," J. Phys. 7, 821-825 (1908).
- Y. A. Dudnikov, "On the design of a scheme for producing integral photographs by a combination method," Sov. J. Opt. Technol. 41, 426-429 (1974).
- H. E. Ives, "Optical properties of a Lippmann lenticuled sheet," J. Opt. Soc. Am. 21, 171-176 (1931). [CrossRef]
- P. Sokolov, "Autostereoscpy and Integral Photography by Professor Lippmann’s Method," (Moscow State Univ. Press, Moscow, Russia, 1911).
- F. Okano, H. Hoshino, J. Arai, I. Yuyama, "Real time pickup method for a three-dimensional image based on integral photography," Appl. Opt. 36, 1598-1603 (1997). [CrossRef] [PubMed]
- A. Stern and B. Javidi, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE 94, 591-607 (2006). [CrossRef]
- B. Wilburn, N. Joshi, V. Vaish, A. Barth, A. Adams, M. Horowitz, M. Levoy, "High performance imaging using large camera arrays," Proc. of the ACM 24, 765-776 (2005).
- J. S. Jang and B. Javidi, "Three-dimensional synthetic aperture integral imaging," Opt. Lett. 27, 1144-1146 (2002). [CrossRef]
- B. Burckhardt, "Optimum parameters and resolution limitation of integral photography," J. Opt. Soc. Am. 58, 71-76 (1968). [CrossRef]
- H. Hoshino, F. Okano, H. Isono, and I. Yuyama, "Analysis of resolution limitation of integral photography," J. Opt. Soc. Am. A 15, 2059-2065 (1998). [CrossRef]
- S. -H. Hong, J. -S Jang, and B. Javidi, "Three-dimensional volumetric object reconstruction using computational integral imaging." Opt. Express 12, 483-491 (2004). [CrossRef] [PubMed]
- A. Stern and B. Javidi, "3-D computational synthetic aperture integral imaging (COMPSAII)," Opt. Express 11, 2446-2451 (2003). [CrossRef] [PubMed]
- Y. Igarishi, H. Murata, and M. Ueda, "3D display system using a computer-generated integral photograph," Jpn. J. Appl. Phys. 17, 1683-1684 (1978). [CrossRef]
- L. Erdmann and K. J. Gabriel, "High resolution digital photography by use of a scanning microlens array," Appl. Opt. 40, 5592-5599 (2001). [CrossRef]
- S. Kishk and B. Javidi, "Improved resolution 3D object sensing and recognition using time multiplexed computational integral imaging," Opt. Express 11, 3528-3541 (2003). [CrossRef] [PubMed]
- R. Martínez-Cuenca, G. Saavedra, M. Martinez-Corral and B. Javidi, "Enhanced depth of field integral imaging with sensor resolution constraints," Opt. Express 12, 5237-5242 (2004). [CrossRef] [PubMed]
- J.-S. Jang and B. Javidi, "Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics," Opt. Lett. 27, 324-326 (2002). [CrossRef]
- J. Hong, J. -H. Park, S. Jung, and B. Lee, "Depth-enhanced integral imaging by use of optical path control," Opt. Lett. 29, 1790-1792 (2004) [CrossRef] [PubMed]
- S. -W. Min, J. Kim, and B. Lee, "Wide-viewing projection-type integral imaging system with an embossed screen," Opt. Lett. 29, 2420-2422 (2004) [CrossRef] [PubMed]
- M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, "Integral imaging with improved depth of field by use of amplitude modulated microlens array," Appl. Opt. 43, 5806-5813 (2004). [CrossRef] [PubMed]
- Y. S. Hwang, S. -H. Hong, and B. Javidi, "Free View 3-D Visualization of Occluded Objects by Using Computational Synthetic Aperture Integral Imaging," J. Display Technol. 3, 64-70 (2007) [CrossRef]
- S. Yeom, B. Javidi, and E. Watson, "Photon counting passive 3D image sensing for automatic target recognition," Opt. Express 13, 9310-9330 (2005). [CrossRef] [PubMed]
- Y. Frauel and B. Javidi, "Digital three-dimensional image correlation by use of computer-reconstructed integral imaging," Appl. Opt. 41, 5488-5496 (2002). [CrossRef] [PubMed]
- J. Arai, M. Okui, M. Kobayashi, and F. Okano, "Geometrical effects of positional errors in integral photography," J. Opt. Soc. Am. A 21, 951-958 (2004) [CrossRef]
- N. Mukhopadhyay, Probability and Statistical Inference (Marcel Dekker, Inc. New York, 2000).
