## Fresnel and Fourier hologram generation using orthographic projection images

Optics Express, Vol. 17, Issue 8, pp. 6320-6334 (2009)

http://dx.doi.org/10.1364/OE.17.006320


### Abstract

A novel technique for synthesizing a hologram of three-dimensional objects from multiple orthographic projection view images is proposed. The three-dimensional objects are captured under incoherent white illumination and their orthographic projection view images are obtained. The orthographic projection view images are multiplied by the corresponding phase terms and integrated to form a Fourier or Fresnel hologram. Using simple manipulation of the orthographic projection view images, it is also possible to shift the three-dimensional objects by an arbitrary amount along the three axes in the reconstruction space or invert their depths with respect to the given depth plane. The principle is verified experimentally.

© 2009 Optical Society of America

## 1. Introduction


## 2. Orthographic projection geometry


The projection image coordinates (*x*_{p}, *y*_{p}) in this perspective projection geometry are determined by the camera position (*x*_{o}, *y*_{o}) and the focal length *f* of the camera lens. Figure 1(c) shows the orthographic projection geometry that we use in the proposed method. The projection lines are all parallel, as in the angular orthographic projection shown in Fig. 1(a), but the image plane is not slanted, unlike the perspective projection geometry shown in Fig. 1(b). If we let **r** denote one of the projection lines, the angle *φ* shown in Fig. 1(c) is the angle that the projection of **r** onto the *x*-*z* plane makes with the *z*-axis. Similarly, the angle *θ* is the angle that the projection of **r** onto the *y*-*z* plane makes with the *z*-axis. The projection image coordinates (*x*_{p}, *y*_{p}) can be written in terms of *s*, *t*, and *l*, which are defined as shown in Fig. 1(c) in order to represent the projection direction more conveniently [11, 12].

In the orthographic view synthesis using a lens array, the pixels at the same local position in every elemental image are extracted and assembled into one orthographic image. For example, the pixels at an offset *s*_{1} from each elemental lens axis form an orthographic image with a projection angle of tan^{-1}(*s*_{1}/*l*)=tan^{-1}(*s*_{1}/*f*_{la}), where *f*_{la} is the focal length of the lens array. In the same manner, the pixels at the green dots are assembled to form another orthographic image with a projection angle of tan^{-1}(*s*_{2}/*l*). Since one pixel is extracted from each elemental image, the number of pixels in the synthesized orthographic image is the same as the number of elemental lenses in the lens array. The total number of synthesized orthographic images is given by the number of pixels in one elemental image.
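The pixel-reassembly step above reduces to a single axis permutation when the elemental images are stored as an array. The sketch below is illustrative only; the array layout and function name are assumptions, not the authors' implementation.

```python
import numpy as np

def orthographic_views(elemental):
    """Reassemble elemental images into orthographic view images.

    elemental: array of shape (Ly, Lx, py, px) holding one py x px
    elemental image per lens of an Ly x Lx lens array.

    View (i, j) collects pixel (i, j) from every elemental image, so each
    view has as many pixels as there are elemental lenses (Ly x Lx), and
    there are py x px views in total, one per projection direction.
    """
    # Move the intra-elemental-image pixel indices to the front: the result
    # has shape (py, px, Ly, Lx), so views[i, j] is one orthographic image.
    return np.transpose(elemental, (2, 3, 0, 1))
```

Here `views[i, j]` corresponds to the projection angle tan^{-1}(*s*_{i}/*f*_{la}), with *s*_{i} the offset of pixel *i* from the lens axis.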

## 3. Fourier hologram generation using multiple orthographic view images

Denoting the orthographic projection image of the projection angles *φ* and *θ*, or equivalently of *s*, *t*, and *l* as defined in Fig. 1(c) and Fig. 3, as P_{st}(*x*_{p}, *y*_{p}), the proposed method calculates the Fourier hologram H by Eq. (4). Here *l* is assumed to be a constant so that the orthographic projection direction is completely described by *s* and *t*, and *b* is a positive constant that will be determined later.

In Eq. (5), *f* is the focal length of the lens and λ is the wavelength. We now show that the proposed method given by Eq. (4) produces an exact Fourier hologram given by Eq. (5), without any approximation, using the single-source-point method [6]. Let us consider one infinitesimal object point of size (Δ*x*, Δ*y*, Δ*z*), located at coordinates (*x*, *y*, *z*) and having the value O(*x*, *y*, *z*). The orthographic projection image *p*^{SSP}_{st}(*x*_{p}, *y*_{p}) that corresponds to this infinitesimal object point is given by Eq. (3). Substituting it into Eq. (4) gives the hologram H^{SSP}_{(s,t)} that corresponds to the infinitesimal object point. The hologram for the entire 3D object scene is the volume integral of H^{SSP}(*s*, *t*) over all 3D object points. Hence we get Eq. (8), which is defined at the sampled *s* and *t* values. If we rewrite Eq. (8) using continuous coordinates *u*=*Ms* and *v*=*Mt* at the Fourier hologram plane, we finally obtain the exact Fourier hologram.
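The structure of the Fourier hologram computation — multiply each orthographic projection by a phase term and integrate it to a single hologram sample at (*u*, *v*) = (*Ms*, *Mt*) — can be sketched as below. The linear phase exp[j2π*b*(*sx*_{p}+*ty*_{p})] is an assumption standing in for the exact phase term of Eq. (4), which is not reproduced in this text.

```python
import numpy as np

def fourier_hologram(projections, s_vals, t_vals, xp, yp, b):
    """Fourier hologram from orthographic projections (sketch of Eq. (4)).

    projections[it, is_] is the orthographic image P_st for direction
    (s_vals[is_], t_vals[it]); xp, yp are the image-plane sample
    coordinates.  A linear phase exp[j 2 pi b (s x_p + t y_p)] is assumed;
    each projection integrates to ONE hologram pixel at (u, v) = (Ms, Mt).
    """
    X, Y = np.meshgrid(xp, yp)  # image-plane coordinate grids
    H = np.empty((len(t_vals), len(s_vals)), dtype=complex)
    for it, t in enumerate(t_vals):
        for is_, s in enumerate(s_vals):
            phase = np.exp(2j * np.pi * b * (s * X + t * Y))
            H[it, is_] = np.sum(projections[it, is_] * phase)
    return H
```

Because each projection contributes exactly one pixel, the hologram resolution equals the number of orthographic projection images, consistent with the Fourier-hologram remark in Sec. 6.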

## 4. Fresnel hologram generation using multiple orthographic view images

In the Fresnel hologram generation, each orthographic projection image is multiplied by a phase term and laterally shifted on the hologram plane in proportion to its projection angles, *s* and *t*. Finally, the Fresnel hologram is obtained by adding all shifted and multiplied orthographic projection images. Like the case of the Fourier hologram, the proposed method for Fresnel hologram generation is intuitively straightforward. The orthographic projection image represents the intensity distribution of one set of parallel rays penetrating the projection image plane. Since the parallel rays undergo the same amount of phase change and lateral position shift when they propagate a distance *D*, as shown in Fig. 4, we can estimate the complex field contributed by one set of parallel rays by shifting the orthographic image laterally and multiplying it by a phase factor. By repeating this process for all sets of parallel rays, or equivalently for all orthographic projection images, and adding the results, we can get the complex field of the 3D objects.

Suppose the hologram plane is located at a distance *D* from the projection image plane as shown in Fig. 4. The complex field *H*_{s,t}(*u*, *v*) contributed by an orthographic projection image *P*_{s,t}(*x*_{p}, *y*_{p}) of the projection angles *s* and *t* is calculated by the proposed method as Eq. (11), where *b* and *c* are constants to be determined later. The proposed method generates the Fresnel hologram by adding all *H*_{s,t}(*u*, *v*), i.e. by Eq. (12). Note that a constant phase term exp[*jkz*] is omitted since we can assume an arbitrary phase distribution on the object surface. We start from Eq. (12); for small sampling intervals Δ*s* and Δ*t*, Eq. (12) can be represented in an integral form. From Eqs. (11) and (12) we then obtain Eq. (13).

Again, let us consider one infinitesimal object point of value O(*x*, *y*, *z*) located at (*x*, *y*, *z*). Using Eq. (6), the hologram H^{SSP}(*u*, *v*) for this infinitesimal object point is obtained. The hologram for the entire 3D object scene is the volume integral of H^{SSP}(*u*, *v*) over all 3D object points. Hence we get Eq. (16). If *cD*>>*z* and the object function O(*x*, *y*, *z*) is slowly varying in comparison to the quadratic phase exponential function in Eq. (16), we can approximate Eq. (16) to the desired Fresnel hologram form.

## 5. Hologram generation for 3D location shifted or depth inverted object

Shifting each orthographic projection image P_{s,t}(*x*_{p}, *y*_{p}) by (*δx*, *δy*) to form the modified orthographic projection image P'_{s,t}(*x*_{p}, *y*_{p})=*P*_{s,t}(*x*_{p}-*δx*, *y*_{p}-*δy*), as shown in Fig. 5(a), leads to shifted coordinates (*x*'_{p}, *y*'_{p}). Hence the reconstructed 3D image is laterally shifted by (*δx*, *δy*).

The longitudinal shift is achieved by shifting each orthographic projection image in proportion to its projection direction *s* and *t*. For a longitudinal shift of *δz*, each orthographic projection image P_{s,t}(*x*_{p}, *y*_{p}) is shifted by (*δzs*/*l*, *δzt*/*l*) to form a modified orthographic projection image P'_{s,t}(*x*_{p}, *y*_{p})=P_{s,t}(*x*_{p}-*δzs*/*l*, *y*_{p}-*δzt*/*l*), as shown in Fig. 5(b), giving new coordinates (*x*'_{p}, *y*'_{p}). As a result, the object point is reconstructed at depth *z*+*δz*. Note that shifting the orthographic projection images by a constant value results in a lateral shift of the 3D image, while a shift that depends on the projection angle, i.e. on *s* and *t*, provides a longitudinal shift of the 3D image.

The depth inversion is achieved by replacing the orthographic image corresponding to *s* and *t* with the orthographic image corresponding to -*s* and -*t*, i.e. P'_{s,t}(*x*_{p}, *y*_{p})=P_{-s,-t}(*x*_{p}, *y*_{p}), as shown in Fig. 5(c). With the new coordinates, every object point originally at depth *z* is reconstructed at depth -*z*, resulting in an inverted depth with respect to the *z*=0 plane. Depth inversion with respect to an arbitrary depth plane is also possible by sequentially applying the depth shift and the depth inversion. For example, shifting the depth by *δz*=-*d* using Eq. (20), inverting the depth with respect to the *z*=0 plane, and shifting the depth again by *δz*=*d* gives a 3D image whose depth is inverted with respect to the *z*=*d* plane.
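The three manipulations of Fig. 5 — constant lateral shift, projection-angle-dependent shift for depth translation, and view swapping for depth inversion — can be sketched together. Whole-pixel `np.roll` shifts and the function name are simplifying assumptions of this sketch.

```python
import numpy as np

def manipulate_views(views, s_vals, t_vals, dx_pix, dy_pix, dz, l, pitch,
                     invert_depth=False):
    """Shift / depth-invert a set of orthographic views (Fig. 5 operations).

    views[it, is_] is P_st.  A constant shift (dx_pix, dy_pix) of every
    view translates the reconstructed 3D image laterally; an additional
    per-view shift of (dz*s/l, dz*t/l), here rounded to whole pixels of
    size `pitch`, translates it by dz in depth; swapping view (s, t) with
    view (-s, -t) inverts the depth about z = 0.
    """
    out = views.copy()
    if invert_depth:
        # assumes s_vals/t_vals are symmetric about zero, so reversing the
        # view indices maps (s, t) -> (-s, -t)
        out = out[::-1, ::-1]
    for it, t in enumerate(t_vals):
        for is_, s in enumerate(s_vals):
            sx = dx_pix + int(round(dz * s / l / pitch))
            sy = dy_pix + int(round(dz * t / l / pitch))
            out[it, is_] = np.roll(out[it, is_], (sy, sx), axis=(0, 1))
    return out
```

Depth inversion about *z*=*d* then composes as: shift by *δz*=-*d*, invert, shift by *δz*=*d*, exactly as described above.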

## 6. Experimental result

In the experiment, the orthographic projection images are synthesized from elemental images captured by a lens array [11]. The lens array has an elemental lens pitch of 1mm and a focal length *f*_{la}=3.3mm. The valid number of elemental lenses is 67(H) × 59(V). The elemental images formed by the lens array are captured by a CCD of 3288(H) × 2470(V) resolution through an imaging lens system, a Nikon AF Nikkor 28-80mm.

Each elemental image spans 41 pixels, and thus the sampling interval of the projection direction is Δ*s*=Δ*t*=1mm/41=24.4um. Due to the limited field of view of the elemental lens, each elemental image contains only a part of the object. There are also elemental images that do not contain any object image, since the object is out of the field of view of the corresponding elemental lenses. Figure 8 shows the orthographic images generated using the elemental images of Fig. 7. In Fig. 8, it is observed that the disparity of the closer object, object ‘C’, is smaller than that of the farther object, object ‘B’, which confirms that the generated images have orthographic projection geometry [12].

The corresponding angular sampling interval of the projection direction is Δ*s*/*f*_{la}=Δ*t*/*f*_{la}=0.42°, and the whole angular range is -7.2°~+7.2° for the horizontal direction and -7.2°~+7.6° for the vertical direction. The sampling rate of each orthographic image is given by the elemental lens pitch; hence the object is sampled with a 1mm interval in the orthographic image.

_{la}14. L. Zhang, D. Wang, and A. Vincent, “Adaptive reconstruction of intermediate views from stereoscopic images,” IEEE Trans. Circuits Syst. Video Technol. **16**, 102–113 (2006). [CrossRef]

15. J.-H. Park, G. Baasantseren, N. Kim, G. Park, J.-M. Kang, and B. Lee, “View image generation in perspective and orthographic projection geometry based on integral imaging,” Opt. Express **16**, 8800–8813 (2008), http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-16-12-8800 [CrossRef] [PubMed]

*s*=Δ

*f*=24.4um to Δ

*s*=Δ

*f*=12.2um, while the whole angular range is maintained unchanged, i.e. -7.2°~ +7.2° for the horizontal direction and -7.2°~ +7.6° for the vertical direction.

In the Fourier hologram generation, the parameters are set to *l*=*f*_{la}=3.3mm, *b*=2/(*λl*)=5.7×10^{5}, and Δ*s*=Δ*t*=12.2um. Note that the wavelength is set to a large value in order to alleviate any aliasing induced by the low sampling rate of the orthographic images in the generation process. Assuming the pixel pitch of the generated hologram is Δ*u*=Δ*v*=22.4um, the corresponding focal length *f* of the Fourier transform lens is given as *f*=*l*Δ*u*/(2Δ*s*)=3.03mm by Eq. (10). In the reconstruction stage, another set of parameters with more realistic values, i.e. λ=532nm, *f*=135.5mm, and Δ*u*=Δ*v*=22.4um, is used. From Eq. (5), one can easily verify that the use of these different reconstruction parameters scales the lateral coordinates *x* and *y* of the object space, i.e. decreases the lateral size of the object, by a factor of 135.5/3.03=44.72, while leaving the axial coordinate *z* nearly unchanged. Note that these reconstruction parameters were chosen in the experiment such that the axial coordinate is kept unchanged, for the purpose of a clear demonstration of the theory. One can choose different focal lengths *f* of the Fourier transform lens to control the lateral and axial magnifications of the object space.

In the Fresnel hologram generation, the parameters are set to *l*=*f*_{la}=3.3mm, *D*=350mm, *b*=2*D*/λ=657.9, *c*=2, Δ*u*=Δ*v*=1mm, and Δ*s*=Δ*t*=12.2um. Again, the wavelength was set to a large value to avoid any aliasing in the generation process. Also note that Δ*u*=Δ*v*=1mm is the same as the elemental lens pitch of the lens array used in the experiment, since the pitch determines the sampling rate of the orthographic images. In the reconstruction stage, the wavelength and the pixel pitch of the hologram are changed to λ=532nm and Δ*u*=Δ*v*=22.4um. One can verify from Eq. (13) that these changes scale the lateral coordinates *x* and *y* of the object while leaving the axial coordinate *z* and the distance *D* unchanged. Note that the use of the different parameter sets and the doubling of the orthographic projection images in our experiment are mainly due to the low sampling rate of the lens array method. If the orthographic images were obtained with a higher sampling rate using different methods, these processes would not be required.

Figure 9 shows the Fourier holograms generated with (a) shift (*δx*, *δy*, *δz*)=(0,0,0) and no depth inversion, (b) shift (*δx*, *δy*, *δz*)=(10mm,10mm,-20mm) and no depth inversion, and (c) depth shift (*δx*, *δy*, *δz*)=(0,0,80mm) after depth inversion. Figure 10 shows the numerical reconstruction results. In the numerical reconstruction, the focal length of the Fourier transform lens is assumed to be 135.5mm as explained above. Using the Fresnel diffraction formula and the lens function [13], the intensity at 135.5mm+*z* from the Fourier transform lens is calculated. Figure 10(a) shows that the Fourier hologram generated with the proposed method reconstructs the two plane objects successfully, with the correct depth order. The effect of the lateral and depth shifts is shown in Fig. 10(b): both shifts are reflected in the results, as desired. The depth inversion result is shown in Fig. 10(c). Note that the depths of the objects are originally 30mm for object ‘C’ and 50mm for object ‘B’. By depth inversion, they are transferred to -30mm for ‘C’ and -50mm for ‘B’. Then, by a depth shift of 80mm, they are brought to 50mm for ‘C’ and 30mm for ‘B’. As expected, object ‘C’ is focused at 50mm and object ‘B’ is focused at 30mm in Fig. 10(c), which confirms that the depth order is inverted.

Figure 11 shows the generated Fresnel holograms. The distance between the projection image plane and the hologram plane, *D*, is set at 350mm. The resolution of the generated Fresnel holograms shown in Fig. 11 is 260(H) × 260(V) pixels, including small zero padding around the active area. Note that, unlike the case of the Fourier hologram, the resolution of the generated Fresnel hologram is not the same as the number of the orthographic projection images. In the Fresnel case, the orthographic images are shifted by *csD*/*l* and overlapped on the hologram plane, as shown in Eqs. (11) and (12) and Fig. 4. Hence the resolution of the generated hologram is determined by the area covered by the shifted orthographic images and the pixel size on the hologram plane. Since the distance is *D*=350mm, the angular range, i.e. tan^{-1}(*s*/*l*), is about -7.2°~+7.2°, and the size of one orthographic image is Δ*u* (the value used in the generation process) × (number of pixels in one orthographic image) = 1mm × 67 = 67mm, the covered area on the hologram plane can be estimated using the first term in Eq. (11) as around 67+350×2×2×tan(7.2°)≈244mm. The hologram pixel pitch used in the generation step is Δ*u*=1mm as explained before. Therefore, the resolution of the active area of the generated Fresnel hologram is around 244/1=244 pixels along the *u*-axis. A similar estimation gives 59+350×2×{tan(7.2°)+tan(7.6°)}≈241 pixels along the *v*-axis.

Figure 12 shows the numerical reconstruction results of the Fresnel holograms. In the numerical reconstruction, the intensity at a distance *z* from the Fresnel hologram plane is calculated. Figures 11 and 12 reveal that the proposed method successfully generates a Fresnel hologram of the 3D objects from their orthographic projection images; their lateral/axial shift and depth inversion can also be performed with the given set of orthographic projection images.
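The numerical reconstruction at a distance *z* from the hologram plane uses the Fresnel diffraction formula [13]. A standard transfer-function (angular-spectrum form of the Fresnel approximation) propagation sketch is shown below; the function name and sampling assumptions are illustrative, not the authors' code.

```python
import numpy as np

def fresnel_propagate(field, pitch, wavelength, z):
    """Propagate a sampled complex field by a distance z using the Fresnel
    transfer function H(fx, fy) = exp(jkz) exp(-j pi lambda z (fx^2 + fy^2))
    (cf. Goodman [13]).  pitch is the sample spacing on both axes.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)          # spatial frequencies, x
    fy = np.fft.fftfreq(ny, d=pitch)          # spatial frequencies, y
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    Hf = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z
                                     * (FX ** 2 + FY ** 2))
    # multiply the angular spectrum by the transfer function and invert
    return np.fft.ifft2(np.fft.fft2(field) * Hf)
```

Sweeping *z* and recording the intensity |field|^2 at each plane reproduces the kind of focus scan used in Figs. 10 and 12 to verify the depth order of the reconstructed objects.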

## 7. Conclusion

## Acknowledgment

## References and links


**OCIS Codes**

(090.1760) Holography : Computer holography

(100.6890) Image processing : Three-dimensional image processing

(110.2990) Imaging systems : Image formation theory

(110.6880) Imaging systems : Three-dimensional image acquisition

**ToC Category:**

Holography

**History**

Original Manuscript: December 22, 2008

Revised Manuscript: February 26, 2009

Manuscript Accepted: March 18, 2009

Published: April 2, 2009

**Citation**

Jae-Hyeung Park, Min-Su Kim, Ganbat Baasantseren, and Nam Kim, "Fresnel and Fourier hologram generation using orthographic projection images," Opt. Express **17**, 6320-6334 (2009)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-17-8-6320


### References

- A. W. Lohmann and D. P. Paris, "Binary Fraunhofer holograms generated by computer," Appl. Opt. 6, 1739-1748 (1967). [CrossRef] [PubMed]
- J. P. Waters, "Holographic image synthesis utilizing theoretical methods," Appl. Phys. Lett. 9, 405-407 (1966). [CrossRef]
- T. Mishina, M. Okui, and F. Okano, "Calculation of holograms from elemental images captured by integral photography," Appl. Opt. 45, 4026-4036 (2006). [CrossRef] [PubMed]
- D. Abookasis and J. Rosen, "Computer-generated holograms of three-dimensional objects synthesized from their multiple angular viewpoints," J. Opt. Soc. Am. A 20, 1537-1545 (2003). [CrossRef]
- Y. Sando, M. Itoh, and T. Yatagai, "Holographic three-dimensional display synthesized from three-dimensional Fourier spectra of real existing objects," Opt. Lett. 28, 2518-2520 (2003). [CrossRef] [PubMed]
- N. T. Shaked, J. Rosen, and A. Stern, "Integral holography: white-light single-shot hologram acquisition," Opt. Express 15, 5754-5760 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=oe-15-9-5754 [CrossRef] [PubMed]
- N. T. Shaked and J. Rosen, "Modified Fresnel computer-generated hologram directly recorded by multiple-viewpoint projections," Appl. Opt. 47, D21-D27 (2008). [CrossRef] [PubMed]
- D. Abookasis and J. Rosen, "Three types of computer-generated hologram synthesized from multiple angular viewpoints of a three-dimensional scene," Appl. Opt. 45, 6533-6538 (2006). [CrossRef] [PubMed]
- Y. Sando, M. Itoh, and T. Yatagai, "Full-color computer-generated holograms using 3-D Fourier spectra," Opt. Express 12, 6246-6251 (2004), http://www.opticsinfobase.org/oe/abstract.cfm?uri=OE-12-25-6246 [CrossRef] [PubMed]
- M.-S. Kim, G. Baasantseren, N. Kim, and J.-H. Park, "Hologram generation of 3D objects using multiple orthographic view images," J. Opt. Soc. Korea 12, 269-274 (2008). [CrossRef]
- J.-H. Park, J. Kim, and B. Lee, "Three-dimensional optical correlator using a sub-image array," Opt. Express 13, 5116-5126 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-13-5116. [CrossRef] [PubMed]
- J.-H. Park, S. Jung, H. Choi, Y. Kim, and B. Lee, "Depth extraction by use of a rectangular lens array and one-dimensional elemental image modification," Appl. Opt. 43, 4882-4895 (2004). [CrossRef] [PubMed]
- J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, New York, 1996), Chaps. 4-5, pp. 66-105.
- L. Zhang, D. Wang, and A. Vincent, "Adaptive reconstruction of intermediate views from stereoscopic images," IEEE Trans. Circuits Syst. Video Technol. 16, 102-113 (2006). [CrossRef]
- J.-H. Park, G. Baasantseren, N. Kim, G. Park, J.-M. Kang, and B. Lee, "View image generation in perspective and orthographic projection geometry based on integral imaging," Opt. Express 16, 8800-8813 (2008), http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-16-12-8800. [CrossRef] [PubMed]


