Optics Express

  • Editor: C. Martijn de Sterke
  • Vol. 17, Iss. 8 — Apr. 13, 2009
  • pp: 6320–6334

Fresnel and Fourier hologram generation using orthographic projection images

Jae-Hyeung Park, Min-Su Kim, Ganbat Baasantseren, and Nam Kim


Optics Express, Vol. 17, Issue 8, pp. 6320-6334 (2009)
http://dx.doi.org/10.1364/OE.17.006320



Abstract

A novel technique for synthesizing a hologram of three-dimensional objects from multiple orthographic projection view images is proposed. The three-dimensional objects are captured under incoherent white illumination and their orthographic projection view images are obtained. The orthographic projection view images are multiplied by the corresponding phase terms and integrated to form a Fourier or Fresnel hologram. Using simple manipulation of the orthographic projection view images, it is also possible to shift the three-dimensional objects by an arbitrary amount along the three axes in the reconstruction space or invert their depths with respect to the given depth plane. The principle is verified experimentally.

© 2009 Optical Society of America

1. Introduction

Holography has been exploited for various applications since it was first proposed by Gabor in 1948. For three-dimensional (3D) displays, holography has been considered the perfect technique, since it can provide flawless 3D images with complete human depth cues. One problem that needs to be solved for realizing holographic 3D displays is the complicated hologram capture process. In order to get a hologram of 3D objects, we need to build a coherent optical system with laser illumination, which is generally complicated. Moreover, optical hologram acquisition is possible only for moderately sized objects and is not possible for distant objects or background scenes due to the difficulties of laser illumination. Computer generated holograms (CGH) can be one solution [1,2]. CGH calculates the hologram numerically, eliminating the necessity of a coherent optical system. CGH, however, requires the full 3D information of the objects to perform the hologram calculation. Therefore CGH is available only for computer-generated (CG) objects and not for objects in the real world. The hologram acquisition of 3D objects in the real world still requires a complicated coherent optical system.

2. Orthographic projection geometry

Figure 1(a) shows the angular orthogonal projection geometry, in which the projection lines are parallel and the image plane is slanted by the projection angles φ and θ. The projection image coordinates (xp, yp) are given by

xp = x cosφ − z sinφ,
yp = y cosθ − z sinθ cosφ − x sinφ sinθ.
(1)
Fig. 1. Projection geometry

Figure 1(b) is the perspective projection geometry used by N. T. Shaked et al. [6]. The projection lines converge at a vanishing point that corresponds to the principal point of the camera lens. The projection image coordinates (xp, yp) in this projection geometry are given by

xp = xo + (x − xo)f/z,
yp = yo + (y − yo)f/z,
(2)

where (xo, yo) is the camera position and f is the focal length of the camera lens. Figure 1(c) is the orthographic projection geometry that we use in the proposed method. The projection lines are all parallel, as in the angular orthogonal projection shown in Fig. 1(a), but the image plane is not slanted, as in the perspective projection geometry shown in Fig. 1(b). If we let r denote one of the projection lines, the angle φ shown in Fig. 1(c) is the angle that the projection of r onto the x-z plane makes with the z-axis. Similarly, the angle θ is the angle that the projection of r onto the y-z plane makes with the z-axis. The projection image coordinates (xp, yp) can be written as

xp = x + z tanφ = x + zs/l,
yp = y + z tanθ = y + zt/l,
(3)

where s, t, and l are defined as shown in Fig. 1(c) in order to represent the projection direction more conveniently [11,12]. The use of the orthographic projection geometry given by Eq. (3) leads to exact Fourier or Fresnel hologram calculation and easy 3D manipulation of the reconstructed 3D images, as will be explained in the following sections.

One efficient way to capture the orthographic images is to use a lens array [11]. From a single capture using the lens array, a number of orthographic projection images can be obtained. Figure 2 shows the configuration. When a 3D object is imaged through the lens array, each elemental lens of the lens array forms an image of the 3D object at its focal plane, which is called an elemental image. If the pixels at the same local position are collected from every elemental image, the assembled pixels form an orthographic projection image. For example, in Fig. 2, the pixel at the position of the red dot is collected from every elemental image to form an orthographic projection image with a projection angle of tan−1(s1/l) = tan−1(s1/fla), where fla is the focal length of the lens array. In the same manner, the pixels at the green dots are assembled to form another orthographic image with a projection angle of tan−1(s2/l). Since one pixel is extracted from each elemental image, the number of pixels in the synthesized orthographic image is the same as the number of elemental lenses in the lens array. The total number of synthesized orthographic images is given by the number of pixels in one elemental image.

Fig. 2. Orthographic image acquisition using a lens array
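The pixel-collection rule described above amounts to a pure re-indexing of the captured data. A minimal sketch in NumPy, assuming the elemental images are stacked in a 4-D array (all array shapes and names here are illustrative, not from the paper):

```python
import numpy as np

def orthographic_views(elemental):
    """elemental[i, j, u, v]: pixel (u, v) of the elemental image behind
    lens (i, j). Collecting the pixel at the same local position (u, v)
    from every elemental image yields one orthographic view, so moving
    the local-pixel axes to the front indexes the views by (u, v)."""
    return np.transpose(elemental, (2, 3, 0, 1))

# 59(V) x 67(H) lenses with 41 x 41 pixels per elemental image, as in Sec. 6
elemental = np.random.rand(59, 67, 41, 41)
views = orthographic_views(elemental)
print(views.shape)  # (41, 41, 59, 67): 41 x 41 views, each 59 x 67 pixels
```

The projection direction of the view assembled from local position (u, v) then follows from that position's offset from the lens axis, as tan−1(s/fla) in the text.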

Although the lens array provides a convenient way to obtain orthographic projection images in a single capture, it should be noted that this method also brings several limitations. One limitation is the range of the projection angle. Due to the limited field of view and paraxial imaging of the lens array, the angular range that can be captured is limited to a small value. The resolution of the captured orthographic projection images is also limited in the lens array method. The sampling rate of the captured orthographic projection images is determined dominantly by the elemental lens pitch, which may not be small enough to capture fine details of the object. This low sampling rate limits the maximum object spatial bandwidth that can be processed in the Fourier and Fresnel hologram generation.

3. Fourier hologram generation using multiple orthographic view images

Figure 3 illustrates Fourier hologram generation using the orthographic projection images. The Fourier hologram of 3D objects is generated by the following steps. First, the orthographic projection images of the 3D objects are captured. The second step is the multiplication of each orthographic projection image by the phase factor of the slanted plane wave. The slanting angle of the plane wave is determined by the projection angle of the orthographic projection image. Third, the product of the multiplication is integrated into a single complex value. This complex value is the complex field of the 3D object at a single point in the Fourier plane. Finally, by repeating the above steps for all orthographic projection images, the entire Fourier hologram is acquired.

The procedure of the proposed method is intuitively straightforward. In general, the light field of the 3D object propagates by Fresnel diffraction and is refracted by the lens, resulting in a redistributed light field in the Fourier plane. If we concentrate on the parallel rays as shown in Fig. 3, we can see that each point in the Fourier plane corresponds to the integration of one set of parallel rays. Since the orthographic projection image is the intensity distribution of the parallel projection rays, the complex field at a point in the Fourier plane can be calculated by integrating the orthographic projection image with the slanted plane-wave phase factor that accounts for the slanted projection angle.

Fig. 3. Fourier hologram generation from the orthographic projection images

Let us present the proposed method mathematically. If we denote by Pst(xp, yp) the orthographic projection image corresponding to the projection angles φ and θ, or equivalently to s, t, and l as defined in Fig. 1(c) and Fig. 3, the proposed method calculates the Fourier hologram H by

H(s,t) = ∬ Pst(xp, yp) exp[j2πb(xp s + yp t)] dxp dyp,
(4)

where l is assumed to be a constant so that the orthographic projection direction is completely described by s and t, and b is a positive constant that will be determined later.
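The integral of Eq. (4) can be transcribed numerically as a Riemann sum over the view pixels. The following sketch assumes the views are already synthesized and stored in a 4-D array; the grid sizes and parameter values are illustrative:

```python
import numpy as np

def fourier_hologram(views, xp, yp, s_vals, t_vals, b):
    """views[si, ti] is the orthographic image P_st sampled on the (xp, yp)
    grid; returns one Fourier-hologram sample H(s, t) per projection
    direction by the Riemann sum approximating Eq. (4)."""
    XP, YP = np.meshgrid(xp, yp, indexing="ij")
    dxp, dyp = xp[1] - xp[0], yp[1] - yp[0]
    H = np.zeros((len(s_vals), len(t_vals)), dtype=complex)
    for si, s in enumerate(s_vals):
        for ti, t in enumerate(t_vals):
            # multiply the view by the slanted plane-wave factor and integrate
            plane_wave = np.exp(1j * 2 * np.pi * b * (XP * s + YP * t))
            H[si, ti] = np.sum(views[si, ti] * plane_wave) * dxp * dyp
    return H

# Toy run: 5 x 5 projection directions, 8 x 8 pixels per view
xp = yp = np.linspace(-4e-3, 4e-3, 8)          # view pixel grid, m
s_vals = t_vals = np.linspace(-1e-4, 1e-4, 5)  # projection directions, m
views = np.random.rand(5, 5, 8, 8)
H = fourier_hologram(views, xp, yp, s_vals, t_vals, b=5.7e5)
print(H.shape)  # (5, 5)
```

At the on-axis direction s = t = 0 the phase factor is unity, so that hologram sample reduces to a plain integral of the corresponding view, which is a quick sanity check.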

The Fourier hologram of the 3D object O(x,y,z) is given by [13]

H(u,v) = ∭ O(x,y,z) exp[(jπ/λf)((z/f)u² + (z/f)v² − 2xu − 2yv)] dx dy dz,
(5)

where f is the focal length of the lens and λ is the wavelength. Now we show that the proposed method given by Eq. (4) produces the exact Fourier hologram given by Eq. (5) without any approximation, using the single-source-point method [6]. Let us consider one infinitesimal object point of size (Δx, Δy, Δz), located at coordinates (x, y, z), and having the value O(x, y, z). The orthographic projection image PstSSP(xp, yp) that corresponds to this infinitesimal object point is given by Eq. (3) as

PstSSP(xp, yp) = O(x,y,z) δ(xp − x − sz/l, yp − y − tz/l) ΔxΔyΔz,
(6)

where δ is the Dirac delta impulse function. Substituting Eq. (6) into Eq. (4) leads to

HSSP(s,t) = ∬ O(x,y,z) δ(xp − x − sz/l, yp − y − tz/l) ΔxΔyΔz
× exp[j2πb(xp s + yp t)] dxp dyp
= O(x,y,z) exp[j2πb(xs + yt + (z/l)s² + (z/l)t²)] ΔxΔyΔz,
(7)

where HSSP(s,t) is the hologram that corresponds to the infinitesimal object point. The hologram for the entire 3D object scene is the volume integral of HSSP(s,t) over all 3D object points. Hence we get

H(s,t) = ∭ HSSP(s,t) dx dy dz
= ∭ O(x,y,z) exp[j2πb(xs + yt + (z/l)s² + (z/l)t²)] dx dy dz.
(8)

In practice, the orthographic projection images are captured at discrete s and t values. If we rewrite Eq. (8) using continuous coordinates u=Ms and v=Mt at the Fourier hologram plane, finally we get

H(u,v) = ∭ O(x,y,z) exp[j2π(b/M)(xu + yv + (z/(lM))u² + (z/(lM))v²)] dx dy dz,
(9)

where M is a magnification factor. Equations (5) and (9) are the same, provided that

M = −2f/l,  b = 2/(λl).
(10)

Therefore, the exact Fourier hologram of the 3D object can be generated using the orthographic projection images. Note that no approximation is necessary in the process, unlike other methods using different projection geometry.

4. Fresnel hologram generation using multiple orthographic view images

Figure 4 illustrates Fresnel hologram generation using the proposed method. The steps for generating the Fresnel hologram are as follows. First, the orthographic projection images of the 3D objects are captured as before. Second, each orthographic projection image is shifted and multiplied by a constant phase term. The amount of the shift and the phase angle of the constant phase term are determined by the projection angle, or equivalently by s and t, of each orthographic projection image. Finally, the Fresnel hologram is obtained by adding all shifted and phase-multiplied orthographic projection images. As in the case of the Fourier hologram, the proposed method for Fresnel hologram generation is intuitively straightforward. The orthographic projection image represents the intensity distribution of one set of parallel rays penetrating the projection image plane. Since the parallel rays undergo the same amount of phase change and lateral position shift when they propagate a distance D, as shown in Fig. 4, we can estimate the complex field contributed by one set of parallel rays by laterally shifting the orthographic image and multiplying it by a phase factor. By repeating this process for all sets of parallel rays, or equivalently for all orthographic projection images, and adding the results, we get the complex field of the 3D objects.

Letting the Fresnel hologram plane be at distance D from the projection image plane as shown in Fig. 4, the complex field Hs,t(u,v) contributed by an orthographic projection image Ps,t(xp, yp) of the projection angles s and t is calculated by the proposed method as

Hs,t(u,v) = Ps,t(u − csD/l, v − ctD/l) exp{j2πb[(s/l)² + (t/l)²]},
(11)

where b and c are the constants to be determined later. The proposed method generates the Fresnel hologram by adding all Hs,t(u,v), i.e.

H(u,v) = Σs,t Hs,t(u,v).
(12)

Now we show that the Fresnel hologram calculated from the orthographic projection images by Eq. (12) is equivalent to the Fresnel hologram of the 3D object, which is given by [13]

H(u,v) = ∭ [1/(jλ(D+z))] O(x,y,z) exp{[jπ/(λ(D+z))][(u−x)² + (v−y)²]} dx dy dz,
(13)

where a phase term exp[jkz] is omitted since we can assume an arbitrary phase distribution on the object surface. We start from Eq. (12). If we obtain the orthographic projection images with sufficiently small angular separation, i.e. sufficiently small Δs and Δt, Eq. (12) can be represented in integral form. From Eqs. (11) and (12), we get

H(u,v) = ∬ Ps,t(u − csD/l, v − ctD/l) exp{j2πb[(s/l)² + (t/l)²]} ds dt.
(14)

Again we consider an infinitesimal object point O(x, y, z) located at (x, y, z). Using Eq. (6), the hologram HSSP(u,v) for this infinitesimal object point is given by

HSSP(u,v) = ∬ Ps,tSSP(u − csD/l, v − ctD/l) exp{j2πb[(s/l)² + (t/l)²]} ds dt
= ∬ O(x,y,z) δ(u − csD/l − x − sz/l, v − ctD/l − y − tz/l) ΔxΔyΔz
× exp{j2πb[(s/l)² + (t/l)²]} ds dt
= [l²/(cD+z)²] O(x,y,z) exp{[j2πb/(cD+z)²][(u−x)² + (v−y)²]} ΔxΔyΔz,
(15)

The hologram for the entire object scene is the volume integral of HSSP(u,v) over all 3D object points. Hence we get

H(u,v) = ∭ HSSP(u,v) dx dy dz
= ∭ [l²/(cD+z)²] O(x,y,z) exp{[j2πb/(cD+z)²][(u−x)² + (v−y)²]} dx dy dz.
(16)

If we assume that cD ≫ z and that the object function O(x,y,z) is slowly varying in comparison to the quadratic phase exponential in Eq. (16), we can use (cD+z)² = cD(cD+2z) + z² ≈ cD(cD+2z) to approximate Eq. (16) as

H(u,v) = [1/(cD+2z)] ∭ O(x,y,z) exp{[j2πb/(cD(cD+2z))][(u−x)² + (v−y)²]} dx dy dz,
(17)

where the constant term is ignored [13]. Equation (17) is the same as the Fresnel hologram of the 3D object given by Eq. (13), provided that

b = 2D/λ,  c = 2.
(18)

Therefore, the hologram generated by the proposed method is equivalent to the Fresnel hologram of the 3D object.
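With the constants of Eq. (18), the shift-and-add procedure of Eqs. (11) and (12) can be sketched as follows. The shifts are quantized to whole hologram pixels, and the grid size and sampling values in the toy run are illustrative, only roughly echoing the experimental parameters:

```python
import numpy as np

def fresnel_hologram(views, s_vals, t_vals, l, D, lam, du, n_holo):
    """Accumulate each orthographic view on an n_holo x n_holo hologram grid
    of pixel pitch du, shifted by (c*s*D/l, c*t*D/l) and multiplied by the
    constant phase of Eq. (11), with b = 2D/lam and c = 2 from Eq. (18)."""
    b, c = 2 * D / lam, 2.0
    ny, nx = views.shape[2:]
    H = np.zeros((n_holo, n_holo), dtype=complex)
    for si, s in enumerate(s_vals):
        for ti, t in enumerate(t_vals):
            phase = np.exp(1j * 2 * np.pi * b * ((s / l) ** 2 + (t / l) ** 2))
            # top-left corner of the shifted view, quantized to whole pixels
            iu = n_holo // 2 - ny // 2 + int(round(c * s * D / (l * du)))
            iv = n_holo // 2 - nx // 2 + int(round(c * t * D / (l * du)))
            H[iu:iu + ny, iv:iv + nx] += views[si, ti] * phase
    return H

# Toy run with a grid chosen large enough to hold every shifted view
views = np.random.rand(5, 5, 8, 8)
s_vals = t_vals = np.linspace(-2e-5, 2e-5, 5)  # projection directions, m
H = fresnel_hologram(views, s_vals, t_vals,
                     l=3.3e-3, D=0.35, lam=1064e-6, du=1e-3, n_holo=64)
print(H.shape)  # (64, 64)
```

The overlap of the shifted views on the hologram grid is what makes the Fresnel hologram's resolution larger than the view count, as discussed in Sec. 6.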

Fig. 4. Fresnel hologram generation from orthographic projection images

5. Hologram generation for 3D location shifted or depth inverted object

A simple method to shift the 3D location or invert the depth of the captured 3D objects is proposed in this section. The depth inversion and the location shift are achieved by modifying the captured orthographic projection images. With the modified orthographic projection images, the method for generating Fourier or Fresnel holograms is performed as explained in sections 3 and 4, resulting in a hologram for the depth inverted or 3D location shifted 3D objects. Figure 5 shows the concept.

The capability of the 3D location shift and the depth inversion comes from the simple relation between the projection image coordinates and the 3D object coordinates of the orthographic projection geometry, which is given by Eq. (3). First, the lateral shift of the 3D objects in the reconstruction volume is achieved by shifting all orthographic projection images by the same amount. From Eq. (3), shifting the orthographic projection image Ps,t(xp, yp) by (δx, δy) to form the modified orthographic projection image P's,t(xp, yp)=Ps,t(xp-δx, yp-δy) as shown in Fig. 5(a) leads to shifted coordinates (x'p, y'p) given by

x'p = xp + δx = (x + δx) + zs/l,
y'p = yp + δy = (y + δy) + zt/l,
(19)

implying that the 3D image is reconstructed with lateral shift (δx, δy).

The longitudinal shift of the 3D image is achieved by shifting each orthographic image according to its projection view angle, i.e. according to s and t. For a longitudinal shift of δz, each orthographic projection image Ps,t(xp, yp) is shifted by (δz·s/l, δz·t/l) to form a modified orthographic projection image P's,t(xp, yp) = Ps,t(xp − δz·s/l, yp − δz·t/l) as shown in Fig. 5(b), giving new coordinates (x'p, y'p) as

x'p = xp + δz·s/l = x + (z + δz)s/l,
y'p = yp + δz·t/l = y + (z + δz)t/l.
(20)

Equation (20) reveals that the new coordinates correspond to the shifted object depth z + δz. Note that shifting the orthographic projection images by a constant value results in a lateral shift of the 3D image, whereas a projection-angle-dependent shift, i.e. one that depends on s and t, provides a longitudinal shift of the 3D image.
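Both shift rules can be sketched in one routine: a constant pixel shift implements Eq. (19), and a view-dependent shift proportional to s/l and t/l implements Eq. (20). Here np.roll is a cyclic stand-in for the shift (a real implementation would pad with zeros), and the view pixel pitch dp is an illustrative parameter:

```python
import numpy as np

def shift_views(views, s_vals, t_vals, l, dp, dx=0.0, dy=0.0, dz=0.0):
    """Return modified views P'_{s,t}(xp, yp) = P_{s,t}(xp - dx - dz*s/l,
    yp - dy - dz*t/l): the constant part gives a lateral shift (dx, dy) of
    the 3-D image, the s- and t-dependent part a longitudinal shift dz."""
    out = np.empty_like(views)
    for si, s in enumerate(s_vals):
        for ti, t in enumerate(t_vals):
            px = int(round((dx + dz * s / l) / dp))  # shift in pixels, xp
            py = int(round((dy + dz * t / l) / dp))  # shift in pixels, yp
            out[si, ti] = np.roll(views[si, ti], (px, py), axis=(0, 1))
    return out

views = np.random.rand(5, 5, 16, 16)
s_vals = t_vals = np.linspace(-2e-4, 2e-4, 5)
# Pure lateral shift: every view moves by the same 2 pixels along xp
shifted = shift_views(views, s_vals, t_vals, l=3.3e-3, dp=1e-3, dx=2e-3)
print(np.allclose(shifted[0, 0], np.roll(views[0, 0], 2, axis=0)))  # True
```

Setting dz instead of dx/dy makes the shift grow linearly with the projection direction, which is exactly the view-dependent rule of Eq. (20).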

Fig. 5. Modification of orthographic images for manipulating 3D object (a)Lateral shift by (δx, δy), (b) depth shift by δz, and (c) depth inversion

Finally, the depth inversion is performed by exchanging each orthographic image of the projection angle s and t with the orthographic image corresponding to −s and -t, i.e. P's,t(xp, yp)=P-s,-t(xp, yp) as shown in Fig. 5(c). The new coordinates are given by

x'p = x − zs/l,
y'p = y − zt/l,
(21)

which indicates that the new coordinates correspond to the inverted depth −z, i.e. the depth is inverted with respect to the z = 0 plane. Depth inversion with respect to an arbitrary depth plane is also possible by sequentially applying the depth shift and the depth inversion. For example, shifting the depth by δz = −d using Eq. (20), inverting the depth with respect to the z = 0 plane, and shifting the depth again by δz = d gives a 3D image whose depth is inverted with respect to the z = d plane.
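Since each view is indexed by its projection direction, the exchange P's,t = P−s,−t is just a flip of both view-index axes, assuming the s and t sample grids are symmetric about zero:

```python
import numpy as np

def invert_depth(views):
    """views[si, ti]: orthographic view for the si-th s and ti-th t sample.
    For sample grids symmetric about 0, reversing both view-index axes maps
    (s, t) -> (-s, -t), which inverts the depth about the z = 0 plane."""
    return views[::-1, ::-1]

views = np.random.rand(5, 5, 8, 8)
inv = invert_depth(views)
print(np.array_equal(inv[0, 0], views[4, 4]))  # True
```

Inversion about z = d then chains this flip between two longitudinal shifts (δz = −d before, δz = +d after), as described in the text.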

6. Experimental result

We verified the proposed method experimentally. In the experiment, two plane objects ‘C’ and ‘B’ at different depths are captured using a lens array. From the elemental images, the orthographic images are synthesized by collecting the pixels at the same position in each elemental image [11]. Using the synthesized orthographic images, the Fourier and Fresnel holograms are generated with and without depth shift and depth inversion based on the proposed method. Finally, the holograms are numerically reconstructed at various depths.

The experimental setup used to capture the elemental images is shown in Fig. 6. The objects are located 30 mm (‘C’) and 50 mm (‘B’) away from the lens array. The lens array consists of identical elemental lenses with a 1 mm (H) × 1 mm (V) lens pitch and fla = 3.3 mm focal length. The valid number of elemental lenses is 67 (H) × 59 (V). The elemental images formed by the lens array are captured by a CCD of 3288 (H) × 2470 (V) resolution through an imaging lens system (Nikon AF Nikkor 28-80 mm).

Fig. 6. Experimental setup to capture the elemental images

Figure 7 shows the elemental images captured by the CCD. The resolution of each elemental image is 41 (H) × 41 (V) pixels. Since the elemental lens pitch is 1 mm, the pixel size of the elemental image is Δs = Δt = 1 mm/41 = 24.4 µm. Due to the limited field of view of the elemental lens, each elemental image contains only a part of the object. There are also elemental images that do not contain any object image, since the object is out of the field of view of the corresponding elemental lenses. Figure 8 shows the orthographic images generated using the elemental images of Fig. 7. In Fig. 8, it is observed that the disparity of the closer object ‘C’ is smaller than that of the farther object ‘B’, which confirms that the generated images have orthographic projection geometry [12]. The resolution of each generated orthographic image is 67 (H) × 59 (V) pixels. The total number of generated orthographic images is 41 (H) × 41 (V), but only the 34 (H) × 35 (V) orthographic images in the central part are used in generating the holograms. The angular separation between the projection lines of adjacent orthographic images is tan−1(Δs/fla) = tan−1(Δt/fla) ≈ 0.42°, and the whole angular range is −7.2°~+7.2° in the horizontal direction and −7.2°~+7.6° in the vertical direction. The sampling rate of each orthographic image is given by the elemental lens pitch; hence the object is sampled at 1 mm intervals in the orthographic image.

Fig. 7. Captured elemental images
Fig. 8. Generated orthographic images

The Fourier and Fresnel holograms are generated using the orthographic images shown in Fig. 8. Since the number of orthographic images, i.e. 34 (H) × 35 (V), is not sufficient and the sampling rate of each orthographic image, i.e. 1 mm (H) × 1 mm (V), is low, two techniques were used in the experiment. First, each orthographic image is repeated, doubling the number of orthographic images to 68 (H) × 70 (V). Note that the use of the intermediate view reconstruction (IVR) technique, which synthesizes intermediate images by interpolating neighboring images, can enhance the result [14,15]. In our experiment, however, the intermediate images are generated for simplicity by repetition without any interpolation. By this image repetition, the angular separation between the projection lines of adjacent orthographic images is decreased from 0.42° to 0.21°, or equivalently the pixel size of the elemental image is reduced from Δs = Δt = 24.4 µm to Δs = Δt = 12.2 µm, while the whole angular range is unchanged, i.e. −7.2°~+7.2° in the horizontal direction and −7.2°~+7.6° in the vertical direction.

Second, different sets of parameters were used in the generation and the reconstruction of the holograms. In the case of the Fourier hologram, the parameters used in the generation process with Eqs. (4) and (10) are λ = 1064 µm, l = fla = 3.3 mm, b = 2/(λl) = 5.7×10⁵, and Δs = Δt = 12.2 µm. Note that the wavelength is set to a large value in order to alleviate any aliasing induced by the low sampling rate of the orthographic images in the generation process. Assuming the pixel pitch of the generated hologram is Δu = Δv = 22.4 µm, the corresponding focal length f of the Fourier transform lens is given by Eq. (10) as f = lΔu/(2Δs) = 3.03 mm. In the reconstruction stage, another set of parameters with more realistic values, i.e. λ = 532 nm, f = 135.5 mm, Δu = Δv = 22.4 µm, is used. From Eq. (5), one can easily verify that the use of these different reconstruction parameters scales the lateral coordinates x and y of the object space, i.e. decreases the lateral size of the object, by a factor of 135.5/3.03 = 44.72 while leaving the axial coordinate z nearly unchanged. Note that these reconstruction parameters were chosen in the experiment such that the axial coordinate is kept unchanged for the purpose of a clear demonstration of the theory. One can choose different focal lengths f of the Fourier transform lens to control the lateral and axial magnifications of the object space.

In the case of the Fresnel hologram, the parameters used in the generation process with Eqs. (11), (12), and (18) are λ = 1064 µm, l = fla = 3.3 mm, D = 350 mm, b = 2D/λ = 657.9, c = 2, Δu = Δv = 1 mm, and Δs = Δt = 12.2 µm. Again, the wavelength was set to a large value to avoid any aliasing in the generation process. Also note that Δu = Δv = 1 mm is the same as the elemental lens pitch of the lens array used in the experiment, since the lens pitch determines the sampling rate of the orthographic images. In the reconstruction stage, the wavelength and the pixel pitch of the hologram are changed to λ = 532 nm and Δu = Δv = 22.4 µm. One can verify from Eq. (13) that these changes scale the lateral coordinates x and y of the object by a factor of (1064 µm/532 nm) × (22.4 µm/1 mm) ≈ 44.7 while leaving the axial coordinate z and the distance D unchanged. Note that the use of the different sets of parameters and the doubling of the orthographic projection images in our experiment are mainly due to the low sampling rate of the lens array method. If the orthographic images are obtained at a higher sampling rate using different methods, these processes will not be required.
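As a quick arithmetic check of the two lateral scale factors quoted above (the Fresnel-case factor is assumed here to combine the wavelength and pixel-pitch ratios as (λgen/λrec) × (Δurec/Δugen)):

```python
# Fourier case: ratio of the two Fourier-transform-lens focal lengths.
fourier_scale = 135.5 / 3.03                           # mm / mm

# Fresnel case (assumed grouping): wavelength ratio times pitch ratio.
fresnel_scale = (1064e-6 / 532e-9) * (22.4e-6 / 1e-3)  # dimensionless

print(round(fourier_scale, 2), round(fresnel_scale, 1))  # 44.72 44.8
```

Both routes land near the same factor of ~44.7, consistent with the axial coordinates being left unchanged in both hologram types.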

Figure 9 shows the generated Fourier holograms. The resolution of the generated Fourier hologram is 68 (H) × 70 (V) pixels, which is the same as the number of the repeated orthographic images. The Fourier holograms are generated by the proposed method for three cases, i.e. (a) no shift (δx, δy, δz) = (0, 0, 0) and no depth inversion, (b) shift (δx, δy, δz) = (10 mm, 10 mm, −20 mm) and no depth inversion, and (c) depth shift (δx, δy, δz) = (0, 0, 80 mm) after depth inversion. Figure 10 shows the numerical reconstruction results. In the numerical reconstruction, the focal length of the Fourier transform lens is assumed to be 135.5 mm as explained above. Using the Fresnel diffraction formula and the lens function [13], the intensity at 135.5 mm + z from the Fourier transform lens is calculated. Figure 10(a) shows that the Fourier hologram generated with the proposed method reconstructs the two plane objects successfully with the correct depth order. The effect of the lateral and depth shift is shown in Fig. 10(b). We can see that the lateral shift and the depth shift are reflected in the results, as desired. The depth inversion result is shown in Fig. 10(c). Note that the depths of the objects are originally 30 mm for object ‘C’ and 50 mm for object ‘B’. By depth inversion, they are transferred to −30 mm for ‘C’ and −50 mm for ‘B’. Then, by depth shifting by 80 mm, they are brought back to 30 mm for ‘B’ and 50 mm for ‘C’. Figure 10(c) shows this final result. As expected, object ‘C’ is focused at 50 mm and object ‘B’ is focused at 30 mm, which reveals that the depth order is inverted.

Fig. 9. Amplitude (upper figure) and phase (lower figure) of the generated Fourier hologram (a) without lateral and depth shift or inversion (b) with lateral shift by 10mm along x and y axis and depth shift of -20mm, (c) with depth inversion and depth shift of 80mm
Fig. 10. Numerical reconstruction of the generated Fourier hologram (a) without lateral and depth shift or inversion (b) with lateral shift by 10mm along x and y axis and depth shift by -20mm, (c) with depth inversion and depth shift by 80mm

Figures 11 and 12 show the Fresnel holograms generated by the proposed method and their numerical reconstruction results. The distance D from the Fresnel hologram to the orthographic image plane is set to 350 mm. The resolution of the generated Fresnel holograms shown in Fig. 11 is 260 (H) × 260 (V) pixels, including small zero padding around the active area. Note that, unlike the case of the Fourier hologram, the resolution of the generated Fresnel hologram is not the same as the number of orthographic projection images. In the case of the Fresnel hologram, the orthographic images are shifted by csD/l and overlapped on the hologram plane, as shown in Eqs. (11), (12) and Fig. 4. Hence the resolution of the generated hologram is determined by the area covered by the shifted orthographic images and the pixel size on the hologram plane. Since the distance is D = 350 mm, the angular range, i.e. tan−1(s/l), is about −7.2°~+7.2°, and the size of one orthographic image is Δu (the value used in the generation process) × (the number of pixels in one orthographic image) = 1 mm × 67 = 67 mm, the covered area on the hologram plane can be estimated from the first term in Eq. (11) as around 67 + 2 × 2 × 350 × tan(7.2°) ≈ 244 mm. The hologram pixel pitch used in the generation step is Δu = 1 mm, as explained before. Therefore, the resolution of the active area of the generated Fresnel hologram is around 244/1 = 244 pixels along the u-axis. A similar estimation gives 59 + 2 × 350 × {tan(7.2°) + tan(7.6°)} ≈ 241 pixels along the v-axis.
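The two extent estimates above can be reproduced directly; since the hologram pixel pitch is 1 mm, millimetres and pixels coincide:

```python
import math

c, D = 2, 350.0  # shift constant c from Eq. (18); distance D in mm

# u-axis: 67 mm image plus the symmetric shift range +/- c*D*tan(7.2 deg)
span_u = 67 + c * D * 2 * math.tan(math.radians(7.2))               # mm
# v-axis: 59 mm image plus the asymmetric range over -7.2 deg ~ +7.6 deg
span_v = 59 + c * D * (math.tan(math.radians(7.2))
                       + math.tan(math.radians(7.6)))               # mm

print(round(span_u), round(span_v))  # 244 241
```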

Using the Fresnel diffraction formula [13], the intensity image at 350 mm + z from the Fresnel hologram plane is calculated. Figures 11 and 12 reveal that the proposed method successfully generates a Fresnel hologram of the 3D objects from their orthographic projection images; their lateral/axial shift and depth inversion can also be performed with the given set of orthographic projection images.

Fig. 11. Amplitude (upper figure) and phase (lower figure) of the generated Fresnel hologram (a) without lateral and depth shift or inversion (b) with lateral shift by 10mm along x and y axis and depth shift by -20mm, (c) with depth inversion and depth shift by 80mm
Fig. 12. Numerical reconstruction of the generated Fresnel hologram (a) without lateral and depth shift or inversion (b) with lateral shift by 10mm along x and y axis and depth shift by -20mm, (c) with depth inversion and depth shift by 80mm

7. Conclusion

A novel method to generate Fourier and Fresnel holograms of 3D objects from their orthographic projection images is proposed. The lateral/axial shift and depth inversion of the 3D object can also be performed with the given set of the orthographic projection images using the proposed method, making it possible to locate 3D objects at any position in the reconstruction volume. The principle and the feasibility of the proposed method are verified experimentally by capturing the orthographic projection images using a lens array and generating Fourier and Fresnel holograms under various conditions. Consequently, the proposed method provides an efficient way to generate Fourier and Fresnel holograms of the real, existing 3D objects without any need for a coherent holographic capture process.

Acknowledgment

This research was partly supported by the MKE (The Ministry of Knowledge Economy), Korea, under the ITRC (Information Technology Research Center) Support program supervised by the IITA (Institute for Information Technology Advancement) (IITA-2009-C1090-0902-0018).

This work was partly supported by the grant of the Korean Ministry of Education, Science and Technology. (The Regional Core Research Program / Chungbuk BIT Research-Oriented University Consortium)

References and links

1. A. W. Lohmann and D. P. Paris, "Binary Fraunhofer holograms generated by computer," Appl. Opt. 6, 1739–1748 (1967). [CrossRef] [PubMed]
2. J. P. Waters, "Holographic image synthesis utilizing theoretical methods," Appl. Phys. Lett. 9, 405–407 (1966). [CrossRef]
3. T. Mishina, M. Okui, and F. Okano, "Calculation of holograms from elemental images captured by integral photography," Appl. Opt. 45, 4026–4036 (2006). [CrossRef] [PubMed]
4. D. Abookasis and J. Rosen, "Computer-generated holograms of three-dimensional objects synthesized from their multiple angular viewpoints," J. Opt. Soc. Am. A 20, 1537–1545 (2003). [CrossRef]
5. Y. Sando, M. Itoh, and T. Yatagai, "Holographic three-dimensional display synthesized from three-dimensional Fourier spectra of real existing objects," Opt. Lett. 28, 2518–2520 (2003). [CrossRef] [PubMed]
6. N. T. Shaked, J. Rosen, and A. Stern, "Integral holography: white-light single-shot hologram acquisition," Opt. Express 15, 5754–5760 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=oe-15-9-5754. [CrossRef] [PubMed]
7. N. T. Shaked and J. Rosen, "Modified Fresnel computer-generated hologram directly recorded by multiple-viewpoint projections," Appl. Opt. 47, D21–D27 (2008). [CrossRef] [PubMed]
8. D. Abookasis and J. Rosen, "Three types of computer-generated hologram synthesized from multiple angular viewpoints of a three-dimensional scene," Appl. Opt. 45, 6533–6538 (2006). [CrossRef] [PubMed]
9. Y. Sando, M. Itoh, and T. Yatagai, "Full-color computer-generated holograms using 3-D Fourier spectra," Opt. Express 12, 6246–6251 (2004), http://www.opticsinfobase.org/oe/abstract.cfm?uri=OE-12-25-6246. [CrossRef] [PubMed]
10. M.-S. Kim, G. Baasantseren, N. Kim, and J.-H. Park, "Hologram generation of 3D objects using multiple orthographic view images," J. Opt. Soc. Korea 12, 269–274 (2008). [CrossRef]
11. J.-H. Park, J. Kim, and B. Lee, "Three-dimensional optical correlator using a sub-image array," Opt. Express 13, 5116–5126 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-13-5116. [CrossRef] [PubMed]
12. J.-H. Park, S. Jung, H. Choi, Y. Kim, and B. Lee, "Depth extraction by use of a rectangular lens array and one-dimensional elemental image modification," Appl. Opt. 43, 4882–4895 (2004). [CrossRef] [PubMed]
13. J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, New York, 1996), Chaps. 4–5, pp. 66–105.
14. L. Zhang, D. Wang, and A. Vincent, "Adaptive reconstruction of intermediate views from stereoscopic images," IEEE Trans. Circuits Syst. Video Technol. 16, 102–113 (2006). [CrossRef]
15. J.-H. Park, G. Baasantseren, N. Kim, G. Park, J.-M. Kang, and B. Lee, "View image generation in perspective and orthographic projection geometry based on integral imaging," Opt. Express 16, 8800–8813 (2008), http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-16-12-8800. [CrossRef] [PubMed]

OCIS Codes
(090.1760) Holography : Computer holography
(100.6890) Image processing : Three-dimensional image processing
(110.2990) Imaging systems : Image formation theory
(110.6880) Imaging systems : Three-dimensional image acquisition

ToC Category:
Holography

History
Original Manuscript: December 22, 2008
Revised Manuscript: February 26, 2009
Manuscript Accepted: March 18, 2009
Published: April 2, 2009

Citation
Jae-Hyeung Park, Min-Su Kim, Ganbat Baasantseren, and Nam Kim, "Fresnel and Fourier hologram generation using orthographic projection images," Opt. Express 17, 6320-6334 (2009)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-17-8-6320

