Calculation for computer generated hologram using ray-sampling plane

Koki Wakunami and Masahiro Yamaguchi


Optics Express, Vol. 19, Issue 10, pp. 9086-9101 (2011)
http://dx.doi.org/10.1364/OE.19.009086


Abstract

We introduce a new algorithm for calculating computer generated holograms (CGHs) using a ray-sampling (RS) plane. The RS plane is set near the object, and the light rays emitted by the object are sampled at that plane. The light rays are then transformed into a wavefront using Fourier transforms, and the wavefront on the CGH plane is calculated by simulating wavefront propagation from the RS plane to the CGH plane. The proposed method makes it possible to reproduce high-resolution images of deep 3D scenes with angular reflection properties such as gloss.

© 2011 OSA

1. Introduction

Holography can reproduce very realistic three-dimensional (3D) images that satisfy all the depth cues of human 3D perception without any special viewing devices. For electronic holographic display, the hologram pattern is calculated from 3D data using the technique called computer generated hologram (CGH). CGH for 3D image display requires a large amount of computation, so many studies deal with horizontal-parallax-only (HPO) 3D display. However, full-parallax display, which reproduces both horizontal and vertical parallax, is desirable because it definitely surpasses other autostereoscopic 3D display techniques.

One common method for CGH calculation simulates the light propagation from point sources on the object surface (the point-based method) [1,2]. Thanks to the progress of high-speed calculation techniques, it has become possible to define a huge number of point sources at high density on the object surface, allowing smooth surfaces to be rendered. However, the point-based technique still has issues with hidden-surface removal, gloss reproduction, and view-dependent texture, which are important for realistic 3D image display. There have been several proposals on hidden-surface removal and gloss reproduction in CGH [3–5], but further improvements are still required in the rendering techniques for CGH in comparison with the field of computer graphics (CG).

On the other hand, CGH calculation methods based on light-ray reproduction, similar to the holographic stereogram (HS) [6] or integral photography (IP) [7], have also been proposed. In Ref. [6], Yatagai originally presented the concept and demonstrated an HPO CGH based on the stereoscopic approach. With this approach it is possible to apply computer graphics (CG) rendering techniques such as hidden-surface removal, surface shading, and gloss reproduction [8–10], which are important for realistic 3D display. A CGH of a real scene can also be generated easily from multi-view images captured by a camera array. A full-parallax CGH that reproduces light rays can be calculated by applying a Fourier transform to the projection images generated by CG rendering [10]. In the reconstruction of such a CGH, the light rays from all elementary hologram cells are reproduced and the 3D image is reconstructed. However, the image far from the hologram plane is blurred due to the light-ray sampling and the diffraction at the hologram surface [11–15]; the details of this analysis are described in Section 2. These influences increase in proportion to the distance between the image and the CGH plane, so this approach is not suitable for displaying deep scenes. Although some techniques take account of the phase information in the stereogram to avoid image degradation by the diffraction effect [10,16], it is still unclear how to realize realistic reproduction of a deep scene using CG rendering techniques.

In this paper, we propose a new algorithm for calculating full-parallax CGH with the use of a virtual "light-ray sampling (RS) plane". This method can be considered a hybrid approach that integrates the advantages of light-ray-based and wavefront-based methods. The proposed method can be applied both to virtual objects rendered by computer graphics and to real objects captured as multi-view image data by a camera array. Even if the objects are located far from the CGH plane, the resolution of the reconstructed image is not degraded, since the long-distance light propagation is calculated on the basis of diffraction theory. Therefore, the method makes it possible to display deep scenes and objects far from the CGH plane. Additionally, the method can reproduce the angular reflection properties of the object surface, such as glossy or metallic characteristics, using directional light-ray information, so it will be possible to reproduce realistic images using various CG rendering techniques.

Muffoletto presented partitioned holographic computation [20], in which the objects were grouped into subsets and an intermediate hologram was generated for each subset. The final hologram was then computed by superposing the wavefronts reconstructed from the intermediate holograms. The purpose of introducing intermediate holograms was to reduce the amount of calculation. In contrast, the important point of this paper is to demonstrate that introducing an RS plane enables higher image resolution in a deep 3D image when a light-ray-based method is used for CGH calculation. The intermediate plane is employed to derive a wavefront, not a holographic fringe, which gives it greater potential for application. We can define a set of RS planes, and the wave propagation between RS planes can be used for occlusion processing, as briefly described in Subsection 3.4.

Matsushima and Nakahara demonstrated an extremely high-definition CGH in which an intermediate plane was used for occlusion processing [21]. Their method used polygon-based CGH calculation, whereas the rendering techniques of computer graphics provide a much wider variety of methods for obtaining realistic images. The occlusion processing called the "silhouette-masking technique" presented in Ref. [21] is quite simple and is applicable to the proposed method, even though the proposed method generates the wavefront from light-ray information.

In Section 2 we analyze the influence of light-ray sampling and diffraction on image resolution in ray-based 3D displays, and describe how a wavefront-based approach can improve the image resolution for objects far from the hologram plane. Section 3 introduces the new algorithm for calculating CGH using the RS plane. Section 4 shows the results of simulated and optical reconstruction of CGHs synthesized by the proposed method. Finally, discussion is given in Section 5 and a summary in Section 6.

2. Resolution limit in images of CGHs based on light-ray reconstruction

The imaging properties of 3D displays based on light-ray reconstruction have been analyzed for IP, pinhole arrays, and HS. It is known that IP and the full-parallax HS are quite similar if the image and the parallax information are recorded at high resolution [10–14], and that they can reproduce all the light rays from the objects within a certain viewing angle, which is sometimes called the "light field". However, in ray-based 3D displays the resolution of the reconstructed image is degraded when the objects are located far from the hologram plane, compared with the wavefront-based approach.

3D display by CGH is capable of reconstructing the wavefront, so the limitations of the ray-based method can be overcome. Despite this capability, the methods for calculating CGH based on HS, which enable the use of conventional CG rendering techniques [6–10,16,17], have the same limitation as the ray-based 3D displays mentioned above. Figure 1 shows a typical model of full-parallax CGH calculation based on the principle of HS. This section reviews the factors that limit the image resolution in ray-based 3D displays.

Fig. 1. The CGH calculation model based on light-ray information. In (a), a projection image is obtained using light-ray based rendering, where the center of projection is each sampling point on the hologram plane. Each projection image is Fourier transformed to derive the wavefront at each elementary hologram cell, as shown in (b). Tiling the hologram cells calculated in this manner gives the whole hologram pattern. In reconstruction, the light rays from the objects are reproduced from all the elementary hologram cells and the 3D image can be observed.

2.1 The influence of light-ray sampling on image resolution

In methods based on light-ray reconstruction, the resolution of the reproduced image is limited by the light-ray sampling, as shown in Fig. 2. The light rays are usually sampled at the display plane, which corresponds to the hologram plane in Fig. 2(a), while the light-ray direction is also sampled in angle [Fig. 2(b)]. When the distance W of the observer from the display plane is large compared with the depth z of the image, the resolution of the image is affected mainly by the sampling on the display plane. Let p_rs be the sampling pitch of the elementary cells on the display plane; then the resolution limit of the reconstructed image, δ_rs, is given by

$$\delta_{rs} = \frac{p_{rs}\,(z+W)}{W}. \qquad (1)$$

The image resolution also depends on the angular resolution of the light rays, Δθ_a, as shown in Fig. 2(b). The influence of the angular resolution becomes serious especially when the image is far from the display plane. From Fig. 2(b), the resolution limit due to the angular resolution of the light rays, δ_a, is given by

$$\delta_a = |z|\,\Delta\theta_a. \qquad (2)$$

In real situations the limitation of image resolution is a mixture of the influences given in Eqs. (1) and (2); if z << W, the angular sampling becomes critical. In any case, it is clear that the image resolution decreases with increasing image distance z from the display plane.

Fig. 2. Influence of light-ray sampling on the image resolution.

2.2 The influence of diffraction

When the light rays are reconstructed by small cells, the reconstructed rays are broadened by diffraction at the sampling plane and the reconstructed image is blurred. Let λ be the wavelength; then the resolution limit δ_d is given by

$$\delta_d \approx \frac{\lambda\,|z|}{a_{rs}}, \qquad (3)$$

where a_rs is the size of the elementary hologram cell. The resolution of the reconstructed image thus decreases in proportion to the distance z, so light-ray-based 3D displays, including HS-based CGH, cannot reproduce a high-resolution image located far from the display plane.
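As a concrete illustration of Eqs. (1)–(3), the short sketch below (Python) evaluates the three resolution limits for a ray-based display; the numerical values of p_rs, a_rs, λ, W, and Δθ_a are illustrative assumptions, not the parameters of the experiments in Section 4.

```python
def delta_rs(p_rs, z, W):
    """Eq. (1): blur due to the ray-sampling pitch p_rs on the display plane."""
    return p_rs * (z + W) / W

def delta_a(z, dtheta_a):
    """Eq. (2): blur due to the angular sampling pitch of the light rays."""
    return abs(z) * dtheta_a

def delta_d(wavelength, z, a_rs):
    """Eq. (3): diffraction blur from an elementary cell of size a_rs."""
    return wavelength * abs(z) / a_rs

# Illustrative parameters (assumed, not taken from Tables 2-4):
p_rs = 64e-6            # ray-sampling pitch on the display plane [m]
a_rs = 64e-6            # elementary hologram cell size [m]
wavelength = 532e-9     # [m]
W = 0.2                 # viewing distance [m]
dtheta_a = wavelength / a_rs   # angular pitch set equal to the diffraction spread

for z in (0.01, 0.2):   # image depths of 10 mm and 200 mm from the display plane
    print(f"z = {z * 1e3:.0f} mm: "
          f"delta_rs = {delta_rs(p_rs, z, W) * 1e3:.2f} mm, "
          f"delta_a = {delta_a(z, dtheta_a) * 1e3:.2f} mm, "
          f"delta_d = {delta_d(wavelength, z, a_rs) * 1e3:.2f} mm")
```

The printout makes the trend of Section 2 explicit: the sampling term stays close to p_rs for small z, while the angular and diffraction terms grow linearly with |z|.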

3. Method for CGH calculation

In this section, the principle and the algorithm of CGH calculation using the RS plane are described. Figure 3 illustrates the model for this method. The RS plane is defined near the object, and the light-ray information from the object is sampled at light-ray sampling points on the RS plane, as in methods based on light-ray reconstruction. The light-ray information corresponds to the projection images shown in Fig. 1; these images can be generated using conventional CG techniques such as ray tracing, or by image-based rendering (IBR) from multi-view images of a real or virtual object. Each image is then Fourier transformed, and the resulting complex amplitude distribution is centered at the corresponding light-ray sampling point; in this way the wavefront on the RS plane is obtained. This can be regarded as the conversion of the light-ray information into a wavefront on the RS plane. The wavefront propagation from the RS plane to the CGH plane is calculated by two-dimensional Fresnel diffraction. Finally, by simulating the interference between the wavefront propagated from the RS plane and the reference wave at the CGH plane, we obtain the hologram pattern.

Fig. 3. Principle of the proposed CGH calculation using the RS plane.

The proposed method can reproduce high-resolution images even for objects located far from the CGH plane, because the light rays are sampled near the object and the long-distance wavefront propagation is calculated on the basis of diffraction theory; δ_rs, δ_a, and δ_d in Eqs. (1)–(3) can therefore be kept small, with z replaced by the image distance from the RS plane. The method is thus suited to displaying deep scenes and objects far from the CGH plane. Since the images on the RS plane can be calculated by ray-based rendering techniques, it is easy to implement hidden-surface removal, surface shading, texture mapping, and gloss appearance, so the proposed method can reproduce images realistically. The details of the proposed method are described in the following subsections.

3.1 Calculation of the projection images on the RS plane

The model of the RS plane is illustrated in Fig. 4. On the RS plane, light rays are sampled at I × J points, where I and J are the numbers of light-ray sampling points in the horizontal and vertical directions. In the first step, the projection images for all light-ray sampling points are calculated. The center of projection for the projection image p_ij[m,n] (of M × N pixels) corresponds to the sampling point (x_i, y_j), where i = 0, 1, …, I−1 and j = 0, 1, …, J−1. In the following explanation, M and N are assumed to be powers of 2 for simplicity, since an FFT (fast Fourier transform) is applied to p_ij[m,n]. If the target is a virtual object, the projection image can be rendered directly using common CG rendering software, as shown in Fig. 5(a). If the target is a real object, it is often difficult to capture p_ij[m,n] directly, since the camera would have to be placed very close to the object; in this case, the projection images can be calculated from multi-view images captured by a camera array, as shown in Fig. 5(b), by applying IBR techniques.

Fig. 4. The model of the RS plane and projection images at each light-ray sampling point.

Fig. 5. Schematic of obtaining the projection images. (a) The projection images can be obtained by direct projection using a general CG rendering technique; (b) the projection images can be obtained from multi-view images by applying an IBR technique.
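As a minimal, hypothetical illustration of this step, the sketch below generates one projection image p_ij[m,n] by pinhole-projecting a small point-cloud object from a single ray-sampling point; the object data, field of view, and image size are assumptions for illustration only (in practice the images would come from a CG renderer or from IBR applied to camera-array data).

```python
import numpy as np

def projection_image(points, amplitudes, center, fov, M=32, N=32):
    """Pinhole projection of a point-cloud object onto an M x N image whose
    center of projection is one ray-sampling point `center` on the RS plane
    (the RS plane is taken as z = 0; the object lies at z > 0)."""
    img = np.zeros((N, M))
    cx, cy = center
    for (x, y, z), a in zip(points, amplitudes):
        # Ray direction (tangent angles) from the sampling point to the object point.
        tx, ty = (x - cx) / z, (y - cy) / z
        m = int(round((tx / fov + 0.5) * (M - 1)))
        n = int(round((ty / fov + 0.5) * (N - 1)))
        if 0 <= m < M and 0 <= n < N:
            img[n, m] += a          # accumulate the ray amplitude in pixel [n, m]
    return img

# Illustrative object: three point sources about 5 mm behind the RS plane.
pts = [(0.0, 0.0, 5e-3), (0.5e-3, 0.2e-3, 5e-3), (-0.3e-3, -0.4e-3, 6e-3)]
p_ij = projection_image(pts, amplitudes=[1.0, 0.8, 0.6], center=(0.0, 0.0), fov=0.3)
```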

3.2 Conversion of the light-ray information into the wavefront

The process of converting the light-ray information into the wavefront is shown in Fig. 6. The projection image p_ij[m,n] at the point (x_i, y_j) is multiplied by a discrete random phase distribution φ_ij[m,n], where φ_ij[m,n] is uniformly distributed in the range [0, 2π). It is then transformed into the complex amplitude distribution P_ij[k,l] by the FFT as follows:

$$P_{ij}[k,l] = \mathrm{FFT}\bigl\{\, p_{ij}[m,n]\,\exp\{j\varphi_{ij}[m,n]\}\,\bigr\} = P_{ij}\!\left(\Bigl(k-\tfrac{M}{2}\Bigr)\Delta k_{RS},\ \Bigl(l-\tfrac{N}{2}\Bigr)\Delta l_{RS}\right), \qquad (4)$$

where P_ij[k,l] is a discrete version of P_ij(x,y), the complex amplitude distribution of the small region around (x_i, y_j) on the RS plane, Δk_RS and Δl_RS are the sampling pitches on the RS plane, m, k = 0, 1, …, M−1, and n, l = 0, 1, …, N−1. As shown in Fig. 7, the discrete two-dimensional wavefront W_RS[k_RS, l_RS] is obtained by tiling the P_ij[k,l] as

$$W_{RS}[k_{RS}, l_{RS}] = \sum_{i=0}^{I-1}\sum_{j=0}^{J-1} P_{ij}\!\left[\,k_{RS}-\frac{x_i}{\Delta k_{RS}}+\frac{M}{2},\ \ l_{RS}-\frac{y_j}{\Delta l_{RS}}+\frac{N}{2}\,\right], \qquad (5)$$

where k_RS = 0, 1, …, IM−1 and l_RS = 0, 1, …, JN−1. Since P_ij[k_RS − x_i/Δk_RS + M/2, l_RS − y_j/Δl_RS + N/2] represents the wavefront within the small region −M/2 ≤ k_RS − x_i/Δk_RS < M/2, −N/2 ≤ l_RS − y_j/Δl_RS < N/2, the wavefront on the RS plane is obtained by tiling the P_ij[k,l] without overlap. This process corresponds to the recording process of the HS shown in Fig. 1.

Fig. 6. A diagram of converting the light-ray information into the wavefront on the RS plane.

Fig. 7. Discrete wavefront W_RS[k_RS, l_RS] on the RS plane.
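A minimal sketch of Eqs. (4) and (5), assuming the projection images are already available as a real-valued array projections[i, j, n, m]; the array shapes, normalization, and random-number generator are illustrative choices rather than the authors' implementation.

```python
import numpy as np

def rays_to_wavefront(projections, rng=None):
    """Convert ray-sampled projection images into the RS-plane wavefront.

    projections: array of shape (I, J, N, M) holding the ray amplitudes sampled
    at each of the I x J ray-sampling points. Each image is coded with a random
    phase, Fourier transformed [Eq. (4)], and the results are tiled without
    overlap into the wavefront W_RS [Eq. (5)].
    """
    rng = np.random.default_rng() if rng is None else rng
    I, J, N, M = projections.shape
    w_rs = np.zeros((J * N, I * M), dtype=complex)
    for i in range(I):
        for j in range(J):
            phi = rng.uniform(0.0, 2.0 * np.pi, size=(N, M))   # phi_ij[m, n]
            field = projections[i, j] * np.exp(1j * phi)
            # fftshift centres the spectrum, matching the (k - M/2) offset in Eq. (4).
            P_ij = np.fft.fftshift(np.fft.fft2(field))
            w_rs[j * N:(j + 1) * N, i * M:(i + 1) * M] = P_ij
    return w_rs

# Example: I = J = 4 sampling points, each with a 32 x 32 pixel projection image.
W_RS = rays_to_wavefront(np.random.rand(4, 4, 32, 32))
```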

3.3 Wavefront propagation from RS plane to CGH plane

Wavefront propagation from the RS plane to the CGH plane is calculated by two-dimensional Fresnel diffraction. Let the complex amplitude at (x, y) on the RS plane be W_RS(x, y), and the complex amplitude of the object wave at (X, Y) on the CGH plane be W_O(X, Y). When the RS plane is at a distance R from the CGH plane, the Fresnel diffraction integral is

$$W_O(X,Y) = \frac{1}{j\lambda R}\exp\!\left[\,j\frac{2\pi R}{\lambda}\right] \iint W_{RS}(x,y)\, \exp\!\left[\,j\frac{2\pi}{\lambda}\,\frac{(X-x)^2+(Y-y)^2}{2R}\right] dx\,dy, \qquad (6)$$

where λ is the wavelength. W_RS(x, y) is sampled with the pitches Δk_RS and Δl_RS mentioned above. Let the sampling pitches on the CGH plane be Δk_H and Δl_H; then the discrete expression of the Fresnel diffraction becomes

$$W_O[k_H, l_H] = \sum_{k_{RS}=0}^{IM-1}\sum_{l_{RS}=0}^{JN-1} W_{RS}[k_{RS}, l_{RS}]\, \exp\!\left[\,j\frac{2\pi}{\lambda}\,\frac{(k_H\Delta k_H - k_{RS}\Delta k_{RS})^2 + (l_H\Delta l_H - l_{RS}\Delta l_{RS})^2}{2R}\right] = A_O[k_H,l_H]\exp\{j\varphi_O[k_H,l_H]\}, \qquad (7)$$

where k_H = 0, 1, …, K_H−1 and l_H = 0, 1, …, L_H−1, with K_H and L_H the numbers of pixels in the horizontal and vertical directions on the CGH plane, and a constant coefficient is omitted. The wavefront on the CGH plane is calculated from Eq. (7), and finally, by simulating the interference with the reference wave, the hologram pattern H[k_H, l_H] is calculated as

$$H[k_H,l_H] = c + 2\,\mathrm{Re}\!\left\{ A_O[k_H,l_H]\exp\!\left[\,j\varphi_O[k_H,l_H] - j\frac{2\pi l_H\Delta l_H}{\lambda}\sin\theta_{ref}\right]\right\} = c + 2A_O\cos\!\left\{\varphi_O[k_H,l_H] - \frac{2\pi l_H\Delta l_H}{\lambda}\sin\theta_{ref}\right\}, \qquad (8)$$

where θ_ref is the angle of the reference wave from the z-axis toward the y-axis. The calculation cost can be decreased by using a look-up table or an FFT-based discrete Fresnel transform algorithm, including shifted Fresnel diffraction [18], when the two-dimensional wavefront of the RS plane is set parallel to the CGH plane.
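The following sketch implements the propagation and interference steps with one standard FFT-based Fresnel algorithm (the transfer-function, or convolution, form), assuming equal sampling pitches on the RS and CGH planes; it is not necessarily the algorithm used by the authors, and the shifted Fresnel diffraction of Ref. [18] would replace it for unequal pitches or laterally shifted planes. The bias c and the reference angle are illustrative.

```python
import numpy as np

def fresnel_propagate(u0, wavelength, pitch, distance):
    """Fresnel propagation over `distance` with the transfer-function method;
    the input and output fields share the same grid and sampling pitch."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=pitch)            # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(1j * 2 * np.pi * distance / wavelength) * \
        np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)

def hologram_pattern(w_obj, wavelength, pitch, theta_ref, bias=None):
    """Eq. (8): interference of the object wave with an off-axis plane
    reference wave tilted by theta_ref from the z-axis toward the y-axis."""
    ny, nx = w_obj.shape
    y = np.arange(ny)[:, None] * pitch                      # l_H * delta_l_H
    ref_phase = 2 * np.pi * y * np.sin(theta_ref) / wavelength
    c = 2 * np.abs(w_obj).max() if bias is None else bias   # keeps H non-negative
    return c + 2 * np.real(w_obj * np.exp(-1j * ref_phase))

# Illustrative use with the RS-plane wavefront from the Subsection 3.2 sketch:
# W_O = fresnel_propagate(W_RS, wavelength=532e-9, pitch=2e-6, distance=0.2)
# H = hologram_pattern(W_O, wavelength=532e-9, pitch=2e-6, theta_ref=np.deg2rad(10))
```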

3.4 Application to objects at different distances from the CGH plane

When there are multiple objects at different distances from the CGH plane and/or a background scene, an RS plane should be set near each object, since otherwise the image resolution is degraded for objects far from the RS plane. In such a case, as shown in Fig. 8, the wavefront propagation by Fresnel diffraction from RS plane 1 to RS plane 2 is calculated first. The wavefront propagated from RS plane 1 is masked in the region of object 2 on RS plane 2, and the wavefront of object 2 is substituted in the masked region; we thus obtain the combined wavefront of object 1 and object 2 on RS plane 2. This procedure can also be performed using the "silhouette-masking technique" presented in Ref. [21]. The same procedure is then applied on RS plane 3, and finally the diffraction from RS plane 3 to the CGH plane is calculated to obtain the wavefront on the CGH plane. The details of the method that accounts for occlusion with multiple RS planes will be reported in another paper.

Fig. 8. Calculation of light propagation for objects located at various distances.
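A minimal sketch of this masking step under the stated assumptions (a binary silhouette mask and a common grid and pitch for all RS planes); fresnel_propagate is the routine sketched in Subsection 3.3, and the distances and names in the commented flow are placeholders.

```python
import numpy as np

def combine_at_rs_plane(w_incoming, w_object, silhouette_mask):
    """Silhouette-style occlusion at an RS plane: the wavefront arriving from
    the farther RS plane is blocked inside the nearer object's silhouette and
    replaced there by that object's own wavefront.

    silhouette_mask: boolean array, True inside the object's silhouette.
    """
    return np.where(silhouette_mask, w_object, w_incoming)

# Illustrative flow for the configuration of Fig. 8 (RS planes 1 -> 2 -> 3 -> CGH):
# w2 = combine_at_rs_plane(fresnel_propagate(w_rs1, wl, pitch, d12), w_rs2, mask2)
# w3 = combine_at_rs_plane(fresnel_propagate(w2, wl, pitch, d23), w_rs3, mask3)
# w_cgh = fresnel_propagate(w3, wl, pitch, d3_to_cgh)
```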

4. Experiment

In this section, we present computer-simulated and optical reconstructions of CGHs calculated with the proposed RS plane. The experiment consists of two parts. In the first part, to confirm that the proposed method is applicable to deep scenes, we calculated CGHs with the proposed method and with the conventional method based on light-ray reproduction, which corresponds to the CGH version of HS (referred to as HSCGH hereafter), and compared the simulated reconstructed images of a deep scene. In the second part, objects with glossy surfaces were recorded in a CGH by the proposed method, and the reconstructed images were verified both numerically and optically. Glossy objects were chosen because they are especially difficult to reproduce by conventional wavefront-based methods. The optical reconstruction was carried out using the CGH printer described in Subsection 4.1.

4.1 CGH printing system

4.2 Reproduction of deep scene

Planar objects 1 and 2 were arranged at different distances from the CGH plane to demonstrate the resolution of a deep 3D image, as illustrated in Fig. 10. Table 2 shows the parameters for this CGH calculation. 128 × 128 projection images, each of 32 × 32 pixels, were Fourier transformed by FFT, so the total number of pixels on the RS plane is 4096 × 4096. The pixel pitch on the RS plane was Δk_RS = Δl_RS = 64 μm / 32 = 2 μm, and the RS plane size is IM·Δk_RS = 4096 × 2 μm = 128 × 64 μm ≈ 8.2 mm. The RS plane was placed 5 mm in front of each object, whereas in the conventional HSCGH the light rays are sampled directly at the CGH plane. The RS plane could also be located behind the object, but it was placed 5 mm in front of each object in all experiments for simplicity. Note that the best location of the RS plane is the one that minimizes the distance between the object and the RS plane. In this experiment, however, the RS plane was deliberately set apart from the planar objects; otherwise the generation of projection images would not have been needed to derive the wavefront on the RS plane. The distances of the two objects from the CGH plane were 10 mm and 200 mm, respectively, as shown in Fig. 10.

Fig. 10. The object model of the CGH calculated using the RS plane.

Table 2. Parameters of the CGH calculation.

In the simulation of image reconstruction, the wavefront propagation from the CGH plane to an imaging lens was first calculated by discrete Fresnel diffraction. The imaging lens corresponds to a human eye at 200 mm from the CGH plane. The wavefront inside the lens pupil was then multiplied by the lens phase function, and finally the wavefront propagation from the lens pupil to the image plane was calculated. The pupil diameter of the imaging lens was 7 mm, which is almost equivalent to that of the human eye.
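A minimal sketch of this reconstruction simulation, reusing the fresnel_propagate routine sketched in Subsection 3.3; the thin-lens eye model, the assumed lens-to-image distance, and the way the focal length is chosen are illustrative assumptions rather than the authors' exact simulation.

```python
import numpy as np

def simulate_eye_reconstruction(h, wavelength, pitch, d_view, focus_dist,
                                pupil_diam=7e-3, d_img=25e-3):
    """Reconstruct a hologram pattern h as seen through an eye-like imaging lens.

    d_view:     distance from the CGH plane to the lens (200 mm here)
    focus_dist: distance from the lens to the plane the eye focuses on
    d_img:      assumed lens-to-image-plane distance of the eye model
    """
    ny, nx = h.shape
    y = (np.arange(ny) - ny / 2)[:, None] * pitch
    x = (np.arange(nx) - nx / 2)[None, :] * pitch

    # 1. Propagate the hologram pattern to the lens plane.
    u_lens = fresnel_propagate(h.astype(complex), wavelength, pitch, d_view)

    # 2. Clip by the 7 mm pupil and apply a thin-lens phase whose focal length
    #    images the plane at `focus_dist` onto the plane at `d_img`.
    pupil = (x**2 + y**2) <= (pupil_diam / 2) ** 2
    f = 1.0 / (1.0 / focus_dist + 1.0 / d_img)
    u_pupil = u_lens * pupil * np.exp(-1j * np.pi * (x**2 + y**2) / (wavelength * f))

    # 3. Propagate from the pupil to the image plane and return the intensity.
    return np.abs(fresnel_propagate(u_pupil, wavelength, pitch, d_img)) ** 2
```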

The reconstructed images are shown in Fig. 11. Comparing the images of object 2, which is far from the CGH plane, in (c) and (d), the edge of the star is blurred in (c), whereas a sharp image is reproduced in (d). When the object is near the CGH plane, the difference is small, although the reconstructed image of the HSCGH in (a) is slightly blurred. Note that if the object were located 5 mm from the CGH plane, the HSCGH and the proposed method would be equivalent.

Fig. 11. Reconstructed images by numerical simulation. (a) Object 1 reproduced by HSCGH; (b) object 1 reproduced by the proposed method; (c) object 2 reproduced by HSCGH; (d) object 2 reproduced by the proposed method.

4.3 Reproduction of 3D objects with glossy surface

In the second experiment, the 3D objects shown in Fig. 12 were recorded as CGHs by the proposed method. The glossy surfaces of the objects were rendered by conventional CG software. Again the RS plane was defined 5 mm in front of the object, and the distance from the RS plane to the CGH plane was 200 mm in this case. Table 3 shows the parameters for these CGH calculations.

Fig. 12. The object models of the CGHs calculated by the proposed method.

Table 3. Parameters of the CGH calculation.

Reconstructed images were obtained both numerically and optically. The numerical simulation was performed in the same manner as in Subsection 4.2. For the optical reconstruction, the CGHs were recorded by the CGH printer and reconstructed with a plane wave of laser light (wavelength 532 nm). The results are shown in Fig. 13. The reconstructed images, including shading, gloss, and highlights, were observed, and the advantage of the proposed method was confirmed, even though the size and viewing angle were not sufficient for a realistic display. Speckle noise and defects were observed in the reconstructed images of Fig. 13. The defects were mainly due to errors in the CGH printer system, such as lens aberrations and nonuniformity of the recording light intensity; this remains an issue for future work.

Fig. 13. Reconstructed images by numerical simulation and optical reconstruction. (a) and (d) Perspective images of each object; (b) and (e) reconstructed images by simulation; (c) and (f) optically reconstructed images.

5. Discussion

5.1 Discussion on the reconstructed image resolution

Figure 11 shows that the conventional method based on light-ray reconstruction (HSCGH) cannot reproduce a high-resolution image for objects far from the hologram plane, whereas the proposed method achieves high resolution. In this subsection, we quantitatively examine the results using Eqs. (1)–(3).

The viewing distance W is 200 mm, as in Subsection 4.2. In the HSCGH, z in Eqs. (1)–(3) corresponds to the distance between the object and the CGH plane, and the resolutions for object 1 and object 2 are estimated as shown in Table 4. The dominant component that limits the resolution for object 1 is δ_rs, which is equal to 0.06 mm, and those for object 2 are δ_a and δ_d, which are equal to 0.83 mm. The reason why δ_a and δ_d take the same value is that the sampling pitch of the light rays was set equal to the spread caused by diffraction. In the experiment of Subsection 4.2, the object size was about 8 mm and the resolution limit of the HSCGH was about 1/10 of the object size, so the shape of the object located far from the CGH plane was hardly recognizable in the reconstructed image.

Table 4. Estimated resolution of the images in Subsection 4.2.
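In terms of Eqs. (2) and (3), this coincidence follows directly once the angular pitch of the light rays is set to the diffraction-limited spread of an elementary cell:

$$\Delta\theta_a = \frac{\lambda}{a_{rs}} \;\;\Rightarrow\;\; \delta_a = |z|\,\Delta\theta_a = \frac{\lambda\,|z|}{a_{rs}} = \delta_d .$$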

In the proposed method, on the other hand, the light rays are sampled at the RS plane, 5 mm from each object, so z equals 5 mm for both objects. The dominant component for both objects is then δ_rs, which is much smaller than in the case of the HSCGH. This result shows that the proposed method can reproduce significantly higher-resolution images, especially for images distant from the CGH plane.

5.2 Application of the RS plane defined on the object surface

As another application of the proposed approach, the RS plane can also be defined on the object surface instead of on a plane parallel to the CGH plane as described above. In this case, the light rays encoded at the RS plane represent the angular distribution of the light reflected from the surface (Fig. 14). For example, if we characterize the surface of an object with a bidirectional reflectance distribution function (BRDF) and define the illumination light source, we obtain the directional distribution of the reflected light intensity, which is equivalent to an image of directional light rays. This image of light rays can be transformed into the wavefront on the surface by a Fourier transform, by the same process as explained in Subsection 3.2. To obtain the wavefront propagating toward the hologram plane, the phase of the wavefront should be modified according to the inclination angle of the surface with respect to the optical axis, similar to the polygon-based method [4]. The CGH can then be calculated by Fresnel diffraction of the wavefront to the CGH plane. In this case, each sampling point on the object surface represents a point light source, and the amplitude and phase of each point source can be defined for surfaces with arbitrary angular reflection properties, such as glossy or lustrous characteristics. This method has not been implemented in this paper and remains as future work.

Fig. 14. The case of setting the RS plane on the object surface.

5.3 Discussion on the calculation cost

The computational cost of the CGH calculation is not the main subject of this paper, and the calculation algorithm and implementation should be optimized in the future. In this subsection, we tentatively discuss the calculation times of the experiments described in Section 4. All CGHs in the experiments were calculated on a PC with an Intel Westmere-EP (2.93 GHz) processor and 24 GB of shared memory.

In the first experiment, the calculation costs of the proposed method and the HSCGH were compared. The total computation times of the proposed method and the HSCGH were 682 s and 660 s, respectively. In both methods, about 655 s were consumed by the generation of the projection images at the ray-sampling points; this cost is the same for both methods because the projection images have the same resolution. About 5 s were spent on the 32 × 32 2D FFTs repeated 128 × 128 times for all the projection images; this step is also common to both methods. The proposed method therefore required about 22 s more than the HSCGH, most of which was spent on the calculation of the discrete Fresnel diffraction. In this experiment the discrete Fresnel diffraction was computed with a 4096 × 4096 2D FFT, but the algorithm could be improved for better efficiency.

In the CGH calculation of Subsection 4.3, a 3D object was used. The calculation time required for rendering the 256 × 256 projection images was 2621 s, and that for converting the light rays to a wavefront and propagating the wavefront was about 93 s. The total time for the CGH calculation was 2715 s.

In both experiments, the generation of the projection images and the calculation of the discrete Fresnel diffraction were performed without parallel processing or GPU computing; applying these techniques would reduce the calculation cost. Additionally, if the projection images can be obtained in advance as the light-ray information of the object, a further reduction in time is expected. From these results, reducing the computation time of both the projection-image generation and the Fresnel diffraction should be investigated in the future.

5.4 Distance range of objects

If the object is located very far from the hologram plane, i.e., in a case where the Fraunhofer approximation is applicable, it is not necessary to apply Fresnel diffraction; calculating Fraunhofer diffraction is appropriate instead. If we define a background object located very far from the hologram plane, similar to the case shown in Fig. 8, the wavefront propagation from the object at infinity to the hologram plane (or to RS plane 2 when occlusion processing is applied) can be calculated by taking the Fourier transform of the background object instead of computing Fresnel diffraction. This is useful when the viewing field is large, because distant objects are then easily observable. In the experiment presented in Section 4, this was not applied, since the viewing field was small owing to the limited resolution of the hologram recording system.
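A minimal sketch of this far-background case: under the Fraunhofer approximation the background image is treated as an angular (far-field) intensity distribution, so a single centered FFT, here with a diffusing random phase, replaces the Fresnel kernel. The function name, the normalization, and the omitted angular-scaling bookkeeping are illustrative assumptions.

```python
import numpy as np

def far_background_wavefront(background_intensity, rng=None):
    """Contribution of a very distant background object at the hologram plane
    (or at RS plane 2 when occlusion processing is applied): each pixel of the
    background is treated as a plane wave from one direction, so its field at
    the plane is obtained by a single centered FFT instead of Fresnel diffraction."""
    rng = np.random.default_rng() if rng is None else rng
    phase = rng.uniform(0.0, 2.0 * np.pi, size=background_intensity.shape)
    field = np.sqrt(background_intensity) * np.exp(1j * phase)
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))

# Example: a 512 x 512 background image with uniform intensity.
w_bg = far_background_wavefront(np.ones((512, 512)))
```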

On the other hand, the depth range of near objects should also be considered. If it is not significantly large, such that δ_rs, δ_a, and δ_d determined by Eqs. (1)–(3) are smaller than the size required for the target image resolution, the objects can be included in a single RS plane. In addition, when all the objects are near the hologram plane, ray-based calculation gives reasonable resolution. The proposed method is therefore meaningful when an RS plane must be placed at a depth different from that of the hologram plane. Optimizing the depth range of objects that should be included in a single RS plane is one of the remaining issues.

6. Summary

Although CGHs are capable of high-resolution 3D image reproduction, it is not easy to implement hidden-surface removal and gloss reproduction, as in traditional CG techniques, with conventional methods for calculating CGH. In this paper we proposed a new algorithm for calculating CGH using an RS plane virtually defined near the object. The proposed method can reproduce high-resolution images of deep scenes with gloss appearance, whereas conventional 3D displays based on light-ray reproduction cannot achieve high resolution for objects located far from the display plane. In the experiments, we demonstrated by simulated and optical reconstruction that the proposed CGH calculation method enables high-resolution image display with gloss appearance in a deep scene.

As future work, large-sized CGHs should be recorded and reconstructed optically to confirm the advantage of the proposed method and to demonstrate highly realistic 3D display by holographic techniques.

References and links

1. M. Lucente, “Optimization of hologram computation for real-time display,” Proc. SPIE 1667, 32–43 (1992).
2. J. P. Waters, “Holographic image synthesis utilizing theoretical methods,” Appl. Phys. Lett. 9(11), 405–406 (1966).
3. J. S. Underkoffler, “Occlusion processing and smooth surface shading for fully computed synthetic holography,” Proc. SPIE 3011, 53–60 (1997).
4. K. Matsushima, “Exact hidden-surface removal in digitally synthetic full-parallax holograms,” Proc. SPIE 5742, 25–32 (2005).
5. K. Yamaguchi and Y. Sakamoto, “Computer generated hologram with characteristics of reflection: reflectance distributions and reflected images,” Appl. Opt. 48(34), H203–H211 (2009).
6. T. Yatagai, “Stereoscopic approach to 3-D display using computer-generated holograms,” Appl. Opt. 15(11), 2722–2729 (1976).
7. T. Mishina, M. Okui, and F. Okano, “Calculation of holograms from elemental images captured by integral photography,” Appl. Opt. 45(17), 4026–4036 (2006).
8. P. W. McOwan, W. J. Hossack, and R. E. Burge, “Three-dimensional stereoscopic display using ray traced computer generated holograms,” Opt. Commun. 82(12), 6–11 (1993).
9. H. Yoshikawa and H. Kameyama, “Integral holography,” Proc. SPIE 2406, 226–234 (1995).
10. M. Yamaguchi, H. Hoshino, T. Honda, and N. Ohyama, “Phase-added stereogram: calculation of hologram using computer graphics technique,” Proc. SPIE 1914, 25–31 (1993).
11. J. T. McCrickerd, “Comparison of stereograms: pinhole, fly’s eye, and holographic types,” J. Opt. Soc. Am. 62(1), 64–70 (1972).
12. L. E. Helseth, “Optical transfer function of three-dimensional display systems,” J. Opt. Soc. Am. A 23(4), 816–820 (2006).
13. P. S. Hilaire, “Modulation transfer function and optimum sampling of holographic stereograms,” Appl. Opt. 33(5), 768–774 (1994).
14. I. Glaser and A. A. Friesem, “Imaging properties of holographic stereograms,” Proc. SPIE 120, 150–162 (1977).
15. M. Yamaguchi, N. Ohyama, and T. Honda, “Imaging characteristics of holographic stereogram,” Jpn. J. Opt. 22(11), 714–720 (1993) (in Japanese).
16. H. Kang, T. Yamaguchi, and H. Yoshikawa, “Accurate phase-added stereogram to improve the coherent stereogram,” Appl. Opt. 47(19), D44–D54 (2008).
17. W. Plesniak, M. Halle, V. M. Bove, Jr., J. Barabas, and R. Pappu, “Reconfigurable image projection holograms,” Opt. Eng. 45(11), 115801 (2006).
18. R. P. Muffoletto, J. M. Tyler, and J. E. Tohline, “Shifted Fresnel diffraction for computational holography,” Opt. Express 15(9), 5631–5640 (2007).
19. H. Yoshikawa and K. Takei, “Development of a compact direct fringe printer for computer-generated holograms,” Proc. SPIE 5290, 114–121 (2004).
20. R. P. Muffoletto, “Numerical techniques for Fresnel diffraction in computational holography,” Ph.D. thesis (Louisiana State University, 2006).
21. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009).

OCIS Codes
(090.0090) Holography : Holography
(090.1760) Holography : Computer holography
(090.2870) Holography : Holographic display

ToC Category:
Holography

History
Original Manuscript: March 4, 2011
Revised Manuscript: April 11, 2011
Manuscript Accepted: April 17, 2011
Published: April 25, 2011

Citation
Koki Wakunami and Masahiro Yamaguchi, "Calculation for computer generated hologram using ray-sampling plane," Opt. Express 19, 9086-9101 (2011)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-19-10-9086

