
Integral imaging based 3D display of holographic data

Ali Özgür Yöntem and Levent Onural


Optics Express, Vol. 20, Issue 22, pp. 24175-24195 (2012)
http://dx.doi.org/10.1364/OE.20.024175



Abstract

We propose a method that converts a diffraction pattern into an elemental image set for display on an integral imaging based setup, and we present applications of this method. We generate elemental images based on diffraction calculations as an alternative to the commonly used ray tracing methods; ray tracing does not accommodate the interference and diffraction phenomena. Our proposed method enables us to obtain elemental images from a holographic recording of a 3D object/scene. The diffraction pattern can be either numerically generated data or digitally acquired optical data. The method shows the connection between a hologram (diffraction pattern) and an elemental image set of the same 3D object. We show three examples, one of which uses the digitally captured optical diffraction tomography data of an epithelium cell. We obtained optical reconstructions with our integral imaging display setup, in which we used a digital lenslet array. For comparison, we also obtained numerical reconstructions, again using diffraction calculations. The digital and optical reconstruction results are in good agreement.

© 2012 OSA

1. Introduction

Integral imaging is a promising 3D capture and display technique. Conventional integral imaging systems are composed of two stages: a pick-up stage to obtain elemental images of a 3D object/scene and a display stage which integrates the elemental images for reconstruction [1]. Both stages are physical optical setups, and they are usually not end-to-end; that is, the two setups are separate. In the capture stage, the elemental images are formed by a series of lenses and a lenslet array and recorded on a CCD array or a digital camera. In the display setup, the obtained elemental images are displayed on an LCD and the reconstruction is observed through a lenslet array. Since the physical sizes of the devices are usually different, it is necessary to match the size of the captured elemental images on the CCD to the displayed ones on the LCD. Furthermore, the pixel size of the CCD sensor matters since the quality of the reconstruction depends on it. Finally, the LCD panel in the display setup should be able to accommodate all of the captured elemental images. To display a good quality still 3D image or a video sequence, both setups require the usual adjustments and alignments (imaging distances, magnification ratios, etc.) of optical elements. Such a system is studied rigorously in [2]. That work is an example of the case where optically captured elemental images of a physical 3D object are reconstructed optically at the display end. Such integral imaging systems consist of decoupled capture and display units, and therefore both units need careful adjustment. For applications such as 3D gaming, 3D modeling, animation, etc., the only physically needed part is the display. In those systems, the elemental images are digitally obtained for synthetic 3D objects and then displayed on an optical display setup. Digital techniques are more flexible than optical capture processes. If the elemental images are obtained by computation, optical adjustments are needed only for the display part. Ray tracing methods can be used to generate elemental images, and many studies using them for computer generated integral imaging systems have been reported [3–7]. The capture process, for computer generated integral imaging systems, is performed using computer graphics algorithms such as point retracing rendering, multiple viewpoint rendering, parallel group rendering, viewpoint vector rendering, etc. [8]. All of these algorithms are based on ray tracing.

In our work, as an alternative method to generate elemental images, we perform diffraction calculations using wave propagation methods based on the Fresnel kernel. To the best of our knowledge, such an approach has not been reported before. One can compute the scalar field distribution in space using the Fresnel propagation model [9, 10]. We can generate elemental images by first modeling the optical system with image processing tools and then applying optical wave propagation principles [11]. Wave propagation models accommodate the diffraction and interference phenomena whereas ray models do not [12, 13]. Wave propagation models are especially useful for cases where we have holographic data of a 3D object/scene. This is in fact the inverse of the problem of generating a hologram from elemental images [5, 14, 15]; that is, we obtain elemental images from a holographic recording, as in [16].

There are certain problems with direct optical reconstruction from holographic data by holographic means, such as speckle noise due to coherent illumination. Thus, certain image processing techniques (filtering and averaging) are usually performed to remove the noise and to reconstruct the data digitally [17–19]. This way, the visibility in digital reconstructions can be improved. In holographic optical reconstructions, however, speckle noise is still present due to the coherent illumination. In our case, at least on the display side, we do not have an additional speckle noise problem since we use incoherent illumination for the reconstructions.

Using lasers for the reconstruction is also undesirable due to potential hazards to the eye. It may be possible to use LED illumination to avoid laser hazards while observing holographic reconstructions [20, 21]. However, the reconstruction quality would be lower due to the spectral properties of such a light source.

On the other hand, integral imaging works primarily with incoherent illumination, so it may be desirable to reconstruct holographic data on an integral imaging display. A conversion from holographic data to elemental image data is needed to reconstruct the 3D image using incoherent light and integral imaging techniques. Such an idea is studied in [16]. In that work, a series of images is first reconstructed at different depths, creating a set of slices of the 3D data. The elemental images are then generated by another process which maps each slice to the elemental image plane. Instead of such an approach, we directly use the holographic data to display 3D images on an integral imaging setup. For this purpose, we designed a direct pick-up integral imaging capture system [6]. This digital pick-up system is realized solely by a computer program that simulates wave propagation. The lenslet arrays that we used in the design are composed of digital synthetic Fresnel thin lenslets [11]. We processed the input holographic data with this simulator to obtain computer generated elemental images; this way, we generate the elemental images in one step. We used these computer generated elemental images in a physical display setup to optically reconstruct 3D images. In our proposed display, we used a modified version of the setup given in [11], where we replaced the analog lenslet array with a digitally controlled synthetic Fresnel lenslet array written on a phase-only LCoS SLM. By this procedure, we can generate elemental images digitally from recorded holographic input data and optically reconstruct a 3D image from them on our integral imaging display. For example, our method can be used to generate elemental images from holograms captured within a diffraction tomography setup [22].

In some cases, diffraction calculations might be slower than ray tracing calculations. There are several fast algorithms which implement diffraction calculations based on the Fresnel kernel [23], and even real-time diffraction calculations are possible [24]. Indeed, one of the implementations uses the graphics processing unit to further increase the computation speed [25]. Our elemental image generation method is quite similar to the techniques used in digital hologram generation procedures. We calculated the diffraction fields using the DFT, which we computed with an FFT algorithm. It is possible to apply the other, abovementioned faster algorithms to our case as well. However, comparing the effects of such different computational procedures on performance is not a part of this study.

The presented numerical and optical results show that elemental images computationally generated from synthetic or real objects using wave propagation principles can be used to successfully reconstruct 3D images. Furthermore, a digitally controlled synthetic lenslet array can be used at the display stage of an integral imaging system [11, 26].

In Section 2, we describe the proposed system: we explain the method to obtain elemental images and present the optical setup that we use to reconstruct the 3D objects. In Section 3, we show the optical display experiment results of the proposed system together with the computer simulations. Finally, we draw conclusions in the last section.

2. Proposed system

In this section, we present the method for elemental image generation from holographic data and an integral imaging optical setup to reconstruct 3D images from the computer generated elemental images. The holographic data may be acquired either by optical means or computed using digital techniques. We present our method in the first subsection. In the second subsection, we present the algorithm, and in the third subsection we present three examples. In the first example, we obtain the elemental images of two letters at different depths. We first generate the diffraction patterns (computer generated holograms) of the letters. The complex diffraction pattern is then used as the input to our algorithm, and the output of the algorithm gives the elemental image set of these letters at the imaging distance. For the second example, we obtain the elemental images of a 3D pyramid shaped object. In the last example, we obtain the set of elemental images from digitally captured optical holographic data of an epithelium cell, obtained using a diffraction tomography technique [22]. In the last subsection, we describe the optical setup which we use to reconstruct the 3D image from the elemental images; thus we show that the obtained elemental images can be used for optical reconstruction. The object sizes and display distances should match the optical setup requirements, so the holographic data should be further processed if the object sizes and the distances do not match the display system. This processing is especially needed for optically captured holographic data.

2.1. The method

Fig. 1 (a) A generic sketch of holographic recording. The diffraction pattern at z = z0 is captured. (b) A generic sketch of 3D image reconstruction from the captured hologram.
Fig. 2 (a) A generic integral imaging data capture setup. The diffraction pattern in Fig. 1(a) is also depicted. For the same object with the same physical dimensions, the diffraction patterns in both systems are the same. (b) A generic integral imaging display setup. The reconstruction is pseudoscopic due to the employed direct pick-up method. (c) Designed model to calculate elemental images from diffraction (hologram) data.

During the process, we first back-propagate the holographic data to a location which we call the “origin”. The origin is defined as the effective depth of the nearest point of the object to the lenslet array.

In case of a mismatch between the physical parameters of the holographic recording step and our display, the matching process is equivalent to equating the corresponding discrete Fresnel kernels. To find the relation between the kernels, let us assume that $h_{\alpha_1}[\mathbf{n}]$ represents the propagation associated with the holographic input setup parameters and $h_{\alpha_2}[\mathbf{n}]$ represents the propagation with the integral imaging setup parameters. If we equate the quadratic phases in $h_{\alpha_1}[\mathbf{n}]$ and $h_{\alpha_2}[\mathbf{n}]$, we can find the relation that matches the physical parameters. Let $\exp(j\alpha_1 \mathbf{n}^T\mathbf{n})$ be the quadratic phase in the Fresnel kernel representing the 2D diffraction field of the holographic setup, where $\alpha_1 = \pi X_1^2/(\lambda_1 z_1)$, $\lambda_1$ is the wavelength, $z_1$ is the propagation distance, and $X_1$ is the sampling period of the field in both directions; $\mathbf{n} = [n_1\ n_2]^T$ where $n_1, n_2$ are integers. Let $\exp(j\alpha_2 \mathbf{n}^T\mathbf{n})$ be the quadratic phase in the Fresnel kernel representing the 2D diffraction field of the integral imaging setup, where $\alpha_2 = \pi X_2^2/(\lambda_2 z_2)$. If we equate the parameters of these functions $\forall \mathbf{n}$, we get $\alpha_1 = \alpha_2$, thus $\pi X_1^2/(\lambda_1 z_1) = \pi X_2^2/(\lambda_2 z_2)$, and we find that $z_2 = z_1 \frac{\lambda_1}{\lambda_2}\frac{X_2^2}{X_1^2}$. Back-propagating the input data by $z_2$ is then equivalent to placing the 3D object effectively at the origin, as in Fig. 2(a).
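A minimal numerical sketch of this distance matching follows; the function name and the example numbers (a 632.8 nm/3.45 μm recording side and a 532 nm/8 μm display side) are illustrative assumptions, not values from the paper:

```python
def matched_distance(z1, lam1, lam2, X1, X2):
    """Display-side propagation distance z2 equivalent to the recording-side
    distance z1, from equating the discrete Fresnel kernels:
    z2 = z1 * (lam1 / lam2) * (X2 / X1)**2."""
    return z1 * (lam1 / lam2) * (X2 / X1) ** 2

# Hypothetical example: hologram recorded at 632.8 nm with 3.45 um pixels,
# displayed at 532 nm on an SLM with 8 um pixels.
z2 = matched_distance(z1=0.10, lam1=632.8e-9, lam2=532e-9, X1=3.45e-6, X2=8e-6)
print(f"equivalent back-propagation distance: {z2 * 1e3:.1f} mm")
```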

2.2. The algorithm

The algorithm is given by the flowchart shown in Fig. 3. The input of the algorithm is diffraction data. Additional preprocessing steps may be needed depending on the nature of the input data and the desired quality of the output display. For example, if the input is not from an object with a diffusing surface, we may need to multiply the associated field with a random phase to improve the visibility at the output. We may also need to pre-process the data when the recording physical parameters do not match the display system parameters, or when the object size is small compared to the display size. The procedures for such cases will be discussed in detail later in this section. Here, however, we should mention that for all these cases we first find the complex object field at the origin and then apply the specified processes. This step is not strictly necessary; we could generate the elemental images directly from the given diffraction pattern. But to cover all cases by a single uniform procedure, we first back-propagate all input to the origin and then apply the fixed process described in Fig. 2(c). This directly gives the elemental images regardless of the properties of the original data.
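The following Python outline mirrors the flowchart; it is only a sketch, where `propagate(field, dist)` stands for any discrete Fresnel propagation routine (such as the FFT-based one sketched later in this section) and all function and argument names are our own illustrations:

```python
import numpy as np

def elemental_images_from_hologram(hologram, z_back, d, g, lenslet_phase,
                                   propagate, diffuse=False):
    """Sketch of the flowchart in Fig. 3: hologram in, elemental images out."""
    # 1. Back-propagate the input diffraction data to the "origin".
    field = propagate(hologram, -z_back)
    # 2. Optional preprocessing: multiply with a random phase if the input
    #    does not come from a diffusing surface, to improve output visibility.
    if diffuse:
        field = field * np.exp(2j * np.pi * np.random.rand(*field.shape))
    # 3. Propagate by d to the lenslet array plane and apply the array phase.
    field = propagate(field, d) * lenslet_phase
    # 4. Propagate by g to the imaging plane; the intensity there is the
    #    computer-generated elemental image set.
    return np.abs(propagate(field, g)) ** 2
```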

Fig. 3 The algorithm to generate elemental images from a diffraction pattern.

We use the DFT method to compute the convolutions that give the outputs of the discrete systems. However, our discretized signals have a support that spans both sides of the axes; i.e., $n_1, n_2$ can take zero, positive, or negative values. Therefore, we must modify the commonly used DFT definition to operate on such signals. Suppose that for a finite length signal $x[\mathbf{n}]$, $n_1, n_2 = -N/2, \dots, N/2-1$, we define the modified finite length DF̂T, $X[\mathbf{k}] = \mathrm{D\hat{F}T}\{x[\mathbf{n}]\}$, $k_1, k_2 = -N/2, \dots, N/2-1$, as follows: let the periodic $\tilde{X}[\mathbf{k}]$ be given by
$$\tilde{X}[\mathbf{k}] = \sum_{n_1=0}^{N-1} \sum_{n_2=0}^{N-1} \tilde{x}[\mathbf{n}]\, e^{-j\frac{2\pi}{N}\mathbf{k}^T\mathbf{n}}, \qquad k_1, k_2 \in (-\infty, \infty).$$
(2)
Here, $\tilde{X}[\mathbf{k}]$ and $\tilde{x}[\mathbf{n}]$ are the periodic extensions of the finite length $X[\mathbf{k}]$ and $x[\mathbf{n}]$, respectively; that is, $\tilde{X}[k_1 - N/2,\, k_2 - N/2] = X[(k_1)_{\bmod N} - N/2,\, (k_2)_{\bmod N} - N/2]$ and $\tilde{x}[n_1 - N/2,\, n_2 - N/2] = x[(n_1)_{\bmod N} - N/2,\, (n_2)_{\bmod N} - N/2]$, where $n_1, n_2$ are integers in $(-\infty, \infty)$. Consequently, $X[\mathbf{k}]$ is one period of $\tilde{X}[\mathbf{k}]$ over $k_1, k_2 = -N/2, \dots, N/2-1$, and $x[\mathbf{n}]$ is one period of $\tilde{x}[\mathbf{n}]$ over $n_1, n_2 = -N/2, \dots, N/2-1$. In order to avoid the aliasing that might be caused by the periodicity associated with the DFT, the computation window size should be selected sufficiently larger than the signal window in both directions. Outside the signal window, we pad the computation window with zeros (opaque borders). Thus, we compute the linear convolution of the signals by approximating it with a circular convolution over the zero-padded computation array. Moreover, this way we simulate the case where the points on the object surface are the only possible source points. In our examples, the signal window sizes are smaller than 1920 × 1920 pixels while the computation window sizes are 3840 × 3840 pixels. The Fresnel diffraction kernel is used to model wave propagation. Let us denote the signal window by $t[\mathbf{n}]$. The diffraction pattern $t_d[\mathbf{n}]$ of the signal is calculated by
$$t_d[\mathbf{n}] = \mathrm{ID\hat{F}T}\{\mathrm{D\hat{F}T}\{w_t[\mathbf{n}]\}\, H_\theta[\mathbf{k}]\}$$
(3)
where $\mathbf{n} = [n_1\ n_2]^T$ and $\mathbf{k} = [k_1\ k_2]^T$ represent the discrete spatial domain variables and the discrete spatial frequency domain variables, respectively, and $n_1, n_2, k_1, k_2$ are integers; we choose the range of $n_1, n_2, k_1$, and $k_2$ as $[-1920, 1919]$ in our examples. $w_t[\mathbf{n}]$ is the computation window, with $t[\mathbf{n}]$ centered inside it. The 2D DF̂T and 2D IDF̂T of the arrays are computed using 2D FFT and 2D IFFT algorithms, respectively.
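The modified, centered DF̂T above can be computed with standard FFT routines combined with index shifts. A minimal sketch in Python/NumPy (our own illustration, not code from the paper):

```python
import numpy as np

def cdft2(x):
    """Centered 2D DFT with n1, n2, k1, k2 = -N/2, ..., N/2-1: ifftshift
    moves the origin to sample 0 before the FFT, and fftshift moves the
    zero frequency back to the array center afterwards."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(x)))

def icdft2(X):
    """Inverse of cdft2."""
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(X)))

# Zero-pad the signal window into a larger computation window (opaque
# borders) so that the circular convolution approximates the linear one.
N_sig, N_comp = 1920, 3840
w_t = np.zeros((N_comp, N_comp), dtype=complex)
t = np.ones((N_sig, N_sig))            # placeholder signal window t[n]
lo = (N_comp - N_sig) // 2
w_t[lo:lo + N_sig, lo:lo + N_sig] = t  # t[n] centered inside w_t[n]
```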

In order to speed up the computations, we use the Fresnel kernel in the spatial frequency domain. The Fourier transform of the continuous Fresnel kernel is
$$H(\mathbf{f}) = \exp\!\left(j\frac{2\pi}{\lambda}z\right) \exp\!\left(j\pi\lambda z\, \mathbf{f}^T\mathbf{f}\right)$$
(4)
where $\mathbf{f} = [f_x\ f_y]^T$, and $f_x$ and $f_y$ are the spatial frequency domain variables in cycles per unit distance in $(-\infty, \infty)$. To compute the discrete Fresnel kernel, we discretize Eq. (4) by substituting $\mathbf{f}$ with $U\mathbf{k}$ and obtain
$$H_\theta[\mathbf{k}] = \exp\!\left(j\theta\, \mathbf{k}^T U^T U \mathbf{k}\right)$$
(5)
where $\mathbf{k} = [k_1\ k_2]^T$ with $k_1, k_2 = -N/2, \dots, N/2-1$, $U = \begin{bmatrix} 1/(NX) & 0 \\ 0 & 1/(NX) \end{bmatrix}$ is the 2D rectangular sampling matrix in the spatial frequency domain, and $\theta = \pi\lambda d$. $N$ is the total number of pixels along one side of the discrete 2D computation array, $X$ is the spatial sampling period, and $d$ is the propagation distance. We omit the constant phase factor of Eq. (4) in order not to clutter the computations.
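A sketch of Eqs. (3) and (5) using the centered DFT helpers above; the names `fresnel_tf` and `propagate` are our own, and the default wavelength and pixel pitch are the 532 nm and 8 μm values quoted later in the paper:

```python
import numpy as np

def fresnel_tf(N, X, lam, dist):
    """Discrete Fresnel transfer function of Eq. (5):
    H_theta[k] = exp(j * theta * k^T U^T U k), theta = pi * lam * dist,
    U = diag(1/(N X)), with k1, k2 = -N/2, ..., N/2-1. The constant phase
    factor of Eq. (4) is omitted, as in the text."""
    k = np.arange(-N // 2, N // 2) / (N * X)   # sampled spatial frequencies
    k1, k2 = np.meshgrid(k, k, indexing="ij")
    return np.exp(1j * np.pi * lam * dist * (k1 ** 2 + k2 ** 2))

def propagate(field, dist, lam=532e-9, X=8e-6):
    """Fresnel propagation of Eq. (3): IDFT{ DFT{field} * H_theta }."""
    return icdft2(cdft2(field) * fresnel_tf(field.shape[0], X, lam, dist))
```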

On the lenslet array plane, we generate the lenslet array complex phase pattern as given in [11]. A single lenslet of the array is given by
$$l[\mathbf{n}] = \exp\!\left(j\gamma\, \mathbf{n}^T V^T V \mathbf{n}\right)$$
(6)
which is obtained by discretizing
$$l(\mathbf{x}) = \exp\!\left(j\frac{\pi}{\lambda f}\, \mathbf{x}^T\mathbf{x}\right)$$
(7)
and substituting $\mathbf{x}$ by $V\mathbf{n}$, where $n_1, n_2$ are in the interval $[-M/2, M/2-1]$, $\gamma = \pi/(\lambda f)$, and $V = \begin{bmatrix} X & 0 \\ 0 & X \end{bmatrix}$ is the 2D rectangular sampling matrix in the spatial domain. We chose the focal length as $f = MX^2/\lambda$ to cover the entire normalized frequency range in the interval $[-\pi, \pi)$ radians, where $M$ is the number of pixels along one side of a lenslet. A 2D array of lenslets, $LA[\mathbf{n}]$, is generated by replicating $l[\mathbf{n}]$ in both directions in a rectangular fashion. $LA[\mathbf{n}]$ is centered within the computation window $w_{LA}[\mathbf{n}]$, and the lenslet array is large enough to image most of the light scattered from the object. $w_{LA}[\mathbf{n}]$ is multiplied with the diffraction pattern $t_d[\mathbf{n}]$ of the object. The focal length of the lenslets is chosen such that it satisfies the imaging equation $1/f = 1/g + 1/d$ and proper magnification ratios are obtained at the imaging plane. As numerical examples, we chose $f = 10.8\,\mathrm{mm}$ and $d = 7f$. Finally, we calculate the diffraction pattern due to the resulting complex field of the multiplication $w_{LA}[\mathbf{n}]\, t_d[\mathbf{n}]$ at the imaging depth $g$.
The resultant complex diffraction pattern is given by
$$p[\mathbf{n}] = \mathrm{ID\hat{F}T}\{\mathrm{D\hat{F}T}\{w_{LA}[\mathbf{n}]\, t_d[\mathbf{n}]\}\, H_\sigma[\mathbf{k}]\}$$
(8)
where $\sigma = \pi\lambda g$. Taking the squared magnitude of this pattern simulates the discrete intensity recording,
$$I[\mathbf{n}] = |p[\mathbf{n}]|^2.$$
(9)
As a result, we obtain computer generated elemental images of the 3D object.
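Putting the pieces together, a sketch of Eqs. (8) and (9) in the same notation; here `w_LA` and `t_d` are assumed to be the zero-padded lenslet-array window and the object's diffraction pattern computed as above:

```python
d = 7 * f                        # object-to-lenslet distance, as in the text
g = d * f / (d - f)              # imaging distance from 1/f = 1/g + 1/d
p = propagate(w_LA * t_d, g, lam, X)   # Eq. (8)
I = np.abs(p) ** 2               # Eq. (9): the elemental image set
```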

2.3. The examples

As a proof of concept, we now proceed with three examples of different input diffraction patterns. In the reconstructions, we qualitatively demonstrate the depth of focus, viewing angle, and parallax of our display. The first example is a set of two planar letters at different depths. Such an example is extensively used in the literature [4, 5, 7]. This example helps us to understand whether we are able to distinguish different depths in the reconstructions; it also gives an idea about the depth of focus of the lenslets. Our second example is an extension of the first one. We sliced a pyramid object into several planar objects. This time our aim is to show the parallax that can be obtained using our display. Since we have a depth variation in the object, it is easier to observe the parallax effect. The last example presents the most important aspect of our method: we used digitally obtained optical diffraction tomography data as the input. We can generate elemental images even from such physical data.

Fig. 4 Computed and recorded elemental images of two letters at different depths and positions. (We enhanced the brightness of the figure for visual purposes by stretching the contrast. The figure is also used as-is on the LCD of the integral imaging display setup. A similar enhancement procedure is used in Figs. 6, 8, and 14–17. In Figs. 14–17, we enhanced only the computer simulation results.)
Fig. 5 A sketch of the pyramid object. A square pyramid is sampled (sliced) along the z-axis. The base part is a square frame while the edges and the tip of the pyramid are small square patches. For display purposes we show six slices of the object, whereas in the simulations we used nine slices.
Fig. 6 Computed and recorded elemental images of the pyramid object. (We enhanced the brightness of the figure for visual purposes.)
Fig. 7 (a) The amplitude picture of the diffraction pattern of the epithelium cell. (b) The upsampled (interpolated and low pass filtered) version of (a).
Fig. 8 Computed and recorded elemental images of the epithelium cell. (We enhanced the brightness of the figure for visual purposes.)

The intensity distribution on the focused plane is given by
$$r[\mathbf{n}] = \left|\mathrm{ID\hat{F}T}\{\mathrm{D\hat{F}T}\{w_{LA}[\mathbf{n}]\, t_d[\mathbf{n}]\}\, H_\chi[\mathbf{k}]\}\right|^2$$
(14)
where $\chi = \pi\lambda(d + \Delta d)$. The simulation results for the reconstructions are given in Section 3.
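In the sketch notation used above, Eq. (14) amounts to one more propagation, with `delta_d` selecting which depth slice is brought into focus (again assuming the illustrative `propagate`, `w_LA`, and `t_d` from the earlier sketches):

```python
delta_d = 0.0                    # 0 focuses on the plane at distance d
r = np.abs(propagate(w_LA * t_d, d + delta_d, lam, X)) ** 2   # Eq. (14)
```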

2.4. Optical setup

The optical setup is depicted in Fig. 9. We display the elemental images on a Samsung T240 monitor with a resolution of 1920 × 1200 pixels. Our elemental image set size is 1920 × 1080 pixels, so we fit the image by leaving 60-pixel blank bands at the top and bottom. The pixel size of the monitor is 0.27 mm, and the dimensions of the active area that we used were 518 mm × 292 mm. The lenslet array is written on a Holoeye HEO 1080P phase-only LCoS SLM, which is a high definition 1920 × 1080 pixel reflective SLM. We write 20 × 12 lenslets on the SLM. Each lenslet has a size of 90 × 90 pixels with a focal length f = 10.8 mm. The pixel size of the SLM is 8 μm, thus each lenslet is 0.72 mm × 0.72 mm. With that many lenslets, we can only fit an array of full-size lenslets into an active area of 1800 × 1080 pixels on the SLM; the unused parts (60 pixels each) are left blank equally on the left and right sides. Thus, the active area for the lenslet array is 14.4 mm × 8.64 mm. The lenslet array is shown in Fig. 10. Our setup is a typical integral imaging display setup. However, due to the size difference between the lenslet array and the LCD screen, we need to scale the elemental images on the LCD screen with the help of a projector objective; we used one disassembled from an Epson EMP-TW520 projector. Since the SLM is of reflective type, we placed a non-polarizing beam splitter (NPBS) in the path to illuminate the SLM and observe the reconstructed image. However, the NPBS changes the focal point of the lenslets [11]. Thus, for fine tuning, we searched for a focused reconstruction while changing the position of the projector objective. The reconstructions are observed at the expected distances. The entire system, its close-up view, and the view from the viewing zone are shown in Fig. 11, Fig. 12, and Fig. 13, respectively.

Fig. 9 The optical setup.
Fig. 10 A Fresnel lenslet array pattern with 12 × 20 lenslets. Each lenslet has a focal length of 10.8 mm. We excluded the lenslets on either side of the array since they would have been cropped had we included them. Instead, we left 60 pixels blank on either side of the array, which is written on the 1920 × 1080 pixel phase-only LCoS SLM.
Fig. 11 Picture of the entire optical setup.
Fig. 12 Top view of the optical setup. There is a wireframe pyramid object next to the reconstruction zone. It is used for comparison with the reconstructed 3D images of the pyramid object.
Fig. 13 The viewing zone of the optical setup. We placed cards labeled “Bilkent University” at different distances in order to check the reconstruction distances.

3. Results

We compared the computer simulation results and the optical reconstructions. Here we present the results for each example given in Section 2.3.

Our first example was the two letters at different depths and positions. To determine the focused planes, we placed two cards labeled “Bilkent University” as shown in Fig. 13. The card with the horizontally aligned label is located 8.4f away from the SLM surface; the one with the vertically aligned label is located approximately 13f away. When we display the elemental images in Fig. 4, we observe the reconstructions shown in Fig. 14. In this figure, the top images are computer simulation results while the bottom images are the optical reconstructions. The images on the left show the reconstructed object at 8.4f while the images on the right show the reconstruction of the object at 13f. The letter “A” is seen sharper than the letter “Z”; this is due to the depth of focus of the lenslets. We exaggerated the distances to show that the system works; for a closer capture distance, the reconstruction of the letter “Z” would be sharper. As we explained in Sec. 2.4, the NPBS shifts the focal distance of the lenslets. We also confirmed these shifted locations by computer simulations.

Fig. 14 3D reconstruction from the elemental images of Fig. 4. At the top, digital reconstructions are shown while at the bottom we observe the optical counterparts. On the left side, the camera, which took this picture, was focused to a distance 8.4 f and on the right side, it was at 13 f. (We enhanced the brightness of the computer simulation results for visual purposes.)

For the second object, the pyramid, we performed two experiments: the first shows the depth of the object and the second shows the parallax. In Fig. 12, we show how we modified the setup. In Fig. 15, the left images are the computer simulation results and the right images are the optical reconstructions together with a physical wireframe pyramid object of the same size as the reconstruction. The top two images show focusing on the tip of the pyramid. The depth of the object is 24 mm, as mentioned in Section 2.2. The base of the pyramid, which is located 8.4f away from the SLM surface, is shown in focus in the bottom part of Fig. 15. For the parallax experiment, we shot photos from three different viewing angles, from left to right, focusing on the tip in order to show the parallax better. In Fig. 16, the top three images are computer simulations of the parallax, while the bottom pictures are the optical reconstructions. The effect is seen better in the optical reconstructions. However, the viewing angle of the system is limited by the maximum diffraction angle of the SLM device, $\omega = \lambda/X = 532\,\mathrm{nm}/8\,\mu\mathrm{m} \approx 0.067\,\mathrm{rad} \approx 4°$ [6, 14]. Aliased components appear when we move to higher angles to observe the reconstruction; this is seen both in the optical reconstructions and in the computer simulations. The viewing angle of this system can be improved by decreasing the pixel period X of the SLM device or by introducing multiple-SLM circular configurations [29–31].

Fig. 15 3D reconstruction from the elemental images of Fig. 6. Images at the left are digital reconstructions. Images at the right are optical reconstructions. The top images are focused to the tip of the pyramid object and the images at the bottom are focused to the base of the object. It is clearly seen that the physical (wire) object and the reconstructed 3D images match. (We enhanced the brightness of the computer simulation results for visual purposes.)
Fig. 16 The pictures of the pyramid image taken from three different angles. (All are focused to the tip of the pyramid.) The pictures at the top are the digital reconstructions and the bottom ones are the optical reconstructions. The pictures show the parallax and the viewing angle. (We enhanced the brightness of the computer simulation results for visual purposes.)

The last example was the epithelium cell. The top image in Fig. 17 shows the computer simulation result, and the bottom image shows the optical reconstruction at 8.4f. Since the object has a small depth, it is not possible to observe a 3D effect or parallax. However, this last example shows that it is possible to convert a holographic recording, regardless of the acquisition method (numerical data generation or digital recording of optical data), to elemental images and to reconstruct them successfully by numerical or optical means.

Fig. 17 Reconstruction from the elemental images of Fig. 8. Top picture is the digital reconstruction whereas the bottom one shows the optical reconstruction. Since the object thickness is small relative to the reconstruction distance, a 3D depth is not perceived. However, the planar looking thin object still floats in 3D space. (We enhanced the brightness of the computer simulation results for visual purposes.)

4. Conclusion

We demonstrated a method that converts digitally computed (synthetic) or digitally recorded (physical) holographic data to elemental images of the original 3D objects. The proposed method is based on diffraction calculations, instead of the commonly used ray tracing methods, to generate elemental images from digitally available 3D data. We showed three examples: two letters at different depths, a pyramid object, and a hologram of a real epithelium cell obtained by diffraction tomography. Digitally simulated reconstructions (obtained using diffraction calculations) and optical reconstructions are compared for these three examples. The optical reconstructions are obtained from an integral imaging display setup whose lenslet array consists of a phase-only SLM with a Fresnel lenslet array pattern written on it. The optical reconstructions provide satisfactory results. The first example gives an idea about the depth of focus of the digital lenslet array. With the synthetic 3D pyramid object, we showed that the display provides good parallax, limited by the maximum diffraction angle of the SLM. We also compared the reconstruction with a physical wireframe pyramid object, and the comparison confirmed that our system works well within its physical limitations. Finally, we showed that we can use digitally captured optical diffraction data to computationally generate a set of elemental images and reconstruct the 3D image of the cell from these elemental images on a physical integral imaging display. Our proposed display system and the method for obtaining elemental images make it possible to display holographic recordings on an integral imaging display setup.

Acknowledgments

This work is supported by the European Commission within FP7 under grant 216105 with the acronym Real 3D. We thank Isabelle Bergoënd and Christian Depeursinge from EPFL for the epithelium cell diffraction data they provided. A. Ö. Yöntem thanks TÜBİTAK for the scholarship he received during his doctoral studies.

References and links


  1. G. Lippmann, “La photographie intégrale,” C.R. Hebd. Seances Acad. Sci. 146, 446–451 (1908).
  2. F. Okano, H. Hoshino, H. A. Jun, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on the integral photography,” Appl. Opt. 36, 1–14 (1997). [CrossRef]
  3. S. S. Athineos, N. P. Sgouros, P. G. Papageorgas, D. E. Maroulis, M. S. Sangriotis, and N. G. Theofanous, “Photorealistic integral photography using a ray-traced model of capturing optics,” J. Electron. Imaging 15, 0430071–0430078 (2006). [CrossRef]
  4. S.-W. Min, K. S. Park, B. Lee, Y. Cho, and M. Hahn, “Enhanced image mapping algorithm for computer-generated integral imaging system,” Jpn. J. Appl. Phys. 45, L744–L747 (2006). [CrossRef]
  5. S.-H. Lee, S.-C. Kim, and E.-S. Kim, “Reconstruction of digital hologram generated by sub-image of integral imaging,” Proc. of SPIE 6912, 69121F1–69121F10 (2008).
  6. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Three-dimensional display system based on computer-generated integral photography,” Proc. of SPIE 4297, 187–195 (2001). [CrossRef]
  7. J.-K. Lee, S.-C. Kim, and E.-S. Kim, “Reconstruction of three-dimensional object and system analysis using ray tracing in practical integral imaging system,” Proc. of SPIE 6695, 6695191–66951912 (2007).
  8. B.-N.-R. Lee, Y. Cho, K. S. Park, S.-W. Min, J.-S. Lim, M. C. Whang, and K. R. Park, “Design and implementation of a fast integral image rendering method,” Lect. Notes Comput. Sci. 4161, 135–140 (2006). [CrossRef]
  9. U. Schnars and W. P. O. Jüptner, “Digital recording and numerical reconstruction of holograms,” Meas. Sci. Technol. 13, R85–R110 (2002). [CrossRef]
  10. L. Onural, “Sampling of the diffraction field,” Appl. Opt. 39, 5929–5935 (2000). [CrossRef]
  11. A. Ö. Yöntem and L. Onural, “Integral imaging using phase-only LCoS spatial light modulators as Fresnel lenslet arrays,” J. Opt. Soc. Am. A 28, 2359–2375 (2011). [CrossRef]
  12. B. E. A. Saleh and M. C. Teich, Fundamentals of Photonics (John Wiley and Sons, Inc., 1991). [CrossRef]
  13. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, 1996).
  14. T. Mishina, M. Okui, and F. Okano, “Generation of holograms using integral photography,” Proc. of SPIE 5599, 114–122 (2004). [CrossRef]
  15. R. V. Pole, “3-D imagery and holograms of objects illuminated in white light,” Appl. Phys. Lett. 10, 20–22 (1967). [CrossRef]
  16. B. Javidi and S.-H. Hong, “Three-dimensional holographic image sensing and integral imaging display,” J. Disp. Technol. 1, 341–346 (2005). [CrossRef]
  17. C. Quan, X. Kang, and C. J. Tay, “Speckle noise reduction in digital holography by multiple holograms,” Opt. Eng. 46, 1158011–1158016 (2007). [CrossRef]
  18. J. G.-Sucerquia, J. A. H. Ramírez, and D. V. Prieto, “Reduction of speckle noise in digital holography by using digital image processing,” Optik 116, 44–48 (2005). [CrossRef]
  19. T. Baumbach, E. Kolenović, V. Kebbel, and W. Jüptner, “Improvement of accuracy in digital holography by use of multiple holograms,” Appl. Opt. 45, 6077–6085 (2006). [CrossRef] [PubMed]
  20. T. Ito and K. Okano, “Color electroholography by three colored reference lights simultaneously incident upon one hologram panel,” Opt. Express 12, 4320–4325 (2004). [CrossRef] [PubMed]
  21. F. Yaraş and L. Onural, “Color holographic reconstruction using multiple SLMs and LED illumination,” Proc. of SPIE 7237, 72370O1–72370O5 (2010).
  22. I. Bergoënd, C. Arfire, N. Pavillon, and C. Depeursinge, “Diffraction tomography for biological cells imaging using digital holographic microscopy,” Proc. of SPIE 7376, 7376131–7376138 (2010).
  23. D. Mas, J. Garcia, C. Ferreira, L. M. Bernardo, and F. Marinho, “Fast algorithms for free-space diffraction patterns calculation,” Opt. Commun. 164, 233–245 (1999). [CrossRef]
  24. H. Kang, T. Fujii, T. Yamaguchi, and H. Yoshikawa, “Compensated phase-added stereogram for real-time holographic display,” Opt. Eng. 46, 0958021–09580211 (2007). [CrossRef]
  25. T. Shimobaba, T. Ito, N. Masuda, Y. Abe, Y. Ichihashi, H. Nakayama, N. Takada, A. Shiraki, and T. Sugie, “Numerical calculation library for diffraction integrals using the graphic processing unit: the GPU-based wave optics library,” J. Opt. A: Pure Appl. Opt. 10, 0753081–0753085 (2009).
  26. J.-S. Jang and B. Javidi, “Three-dimensional integral imaging with electronically synthesized lenslet arrays,” Opt. Lett. 27, 1767–1769 (2002). [CrossRef]
  27. M. Kovachev, R. Ilieva, P. Benzie, G. B. Esmer, L. Onural, J. Watson, and T. Reyhan, “Holographic 3DTV displays using spatial light modulators,” in Three-Dimensional Television: Capture, Transmission, Display, H. Ozaktas and L. Onural, eds. (Springer, 2008), pp. 529–555.
  28. L. Onural, F. Yaraş, and H. Kang, “Digital holographic three-dimensional video displays,” Proc. IEEE 99, 576–589 (2011). [CrossRef]
  29. F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19, 9147–9156 (2011). [CrossRef]
  30. S.-W. Min, S. Jung, H. Choi, Y. Kim, J.-H. Park, and B. Lee, “Wide-viewing-angle integral three-dimensional imaging system by curving a screen and a lens array,” Appl. Opt. 44, 546–552 (2005). [CrossRef] [PubMed]
  31. D.-H. Shin, B.-G. Lee, J. Hyun, D.-C. Hwang, and E.-S. Kim, “Curved projection integral imaging using an additional large-aperture convex lens for viewing angle improvement,” ETRI J. 31, 105–110 (2009). [CrossRef]
