OSA's Digital Library

Virtual Journal for Biomedical Optics

| EXPLORING THE INTERFACE OF LIGHT AND BIOMEDICINE

  • Editor: Gregory W. Faris
  • Vol. 3, Iss. 10 — Sep. 22, 2008

Three-dimensional visualization of objects in scattering medium by use of computational integral imaging

Inkyu Moon and Bahram Javidi  »View Author Affiliations


Optics Express, Vol. 16, Issue 17, pp. 13080-13089 (2008)
http://dx.doi.org/10.1364/OE.16.013080



Abstract

In this paper, we propose a method to three-dimensionally visualize objects in a scattering medium using integral imaging. Our approach exploits the interference between the ballistic photons that pass through the scattering medium and the photons scattered by the medium. For three-dimensional (3D) sensing of the scattered objects, a synthetic aperture integral imaging system under coherent illumination records the scattered elemental images of the objects. A computational geometrical ray propagation algorithm is then applied to the scattered elemental images in order to eliminate the interference patterns between the scattered and object beams. The original 3D information of the scattered objects is recovered from multiple imaging channels, each with a unique perspective of the object. We present both simulation and experimental results with virtual and real objects to demonstrate the proposed concepts.

© 2008 Optical Society of America

1. Introduction

Integral imaging (II) and digital holography (DH) techniques [1–24] have been studied for real-time three-dimensional (3D) sensing, visualization, and recognition of real-world objects. II [1–14] is a passive 3D imaging technique based on multi-perspective information to extract the depth information of a 3D object. In this system, a micro-lenslet array or an imaging device with a synthetic aperture captures a set of 2D elemental images from slightly different perspectives that together contain the 3D information of an object. Computational modeling of integral imaging for 3D visualization of the object can be performed with a virtual ray propagation algorithm. In DH [15–24], a digital hologram, i.e., the diffraction pattern of the object illuminated by coherent light, is recorded on an image sensor. The original 3D field of the object is computationally reconstructed from the digital hologram using a virtual Fresnel propagation algorithm. These 3D optical imaging systems have found a variety of applications, including 3D image recognition [17], occluded 3D object visualization [14], automatic analysis of 3D microscopic image data [10, 18], holographic tomography [21–24], and 3D display [25–29].

There are many benefits to developing optical imaging systems for visualization of objects in a scattering medium [30]. Such systems can be used in various applications such as medical diagnostics, biomedical imaging, and security and defense. The method can also lead to fast, automated, non-invasive, and relatively inexpensive visualization of scattered objects.

In this paper, we present methods and apparatus for three-dimensional visualization of objects in a scattering medium under coherent illumination using computational synthetic aperture integral imaging (SAII) algorithms. The proposed method can overcome the difficulty of imaging through a scattering medium because we take into account the difference in perspective between elemental images and exploit it for 3D reconstruction of the object of interest. Computer simulations and optical experiments illustrate and analyze the proposed method's concepts for 3D visualization of scattered objects. In the computer simulations, we obtain elemental images with different perspectives of the 3D object using the SAII technique, where the elemental images are generated by a computer program that three-dimensionally models objects in virtual space. The elemental images of the object and the scattered beams are then interfered, or modulated, by assuming that the original beam passes through a scattering medium with a uniform random phase distribution. Next, the original object is three-dimensionally restored from the scattered elemental images on a number of image planes over the reconstruction depths by using the virtual ray propagation algorithm. For the optical experiments, we optically obtain the scattered elemental images of an object by moving an image sensor array such as a CCD camera, where the object is positioned between two conventional optical diffusers. We then calculate the virtual ray propagation of the scattered elemental images over the reconstruction depths for 3D visualization of the scattered object.

This paper is organized as follows. The principle of synthetic aperture integral imaging (SAII) recording and reconstruction is described in Section 2. In Section 3, we explain the basic concepts of the proposed synthetic aperture coherent integral imaging (SACII) system for 3D visualization of objects in scattering media. Experimental results are illustrated in Section 4. Conclusions follow in Section 5.

2. Principle of synthetic aperture integral imaging (SAII) recording and reconstruction

In SAII, image sensor channels capture light rays emanating from 3D objects [11, 28–29]. Each channel therefore generates a 2D elemental image containing directional information about the 3D object. The captured elemental images have different perspectives of the 3D object according to the locations of the multiple image channels.

In conventional integral imaging (II), the resolution of each elemental image may be very low when a lenslet array is used to capture the 3D scene, because the total number of pixels of the image acquisition system, such as a charge-coupled device (CCD), is divided among the perspectives.

Synthetic aperture integral imaging (SAII) can be used to obtain elemental images with a large number of pixels. Since SAII captures each elemental image by moving a CCD to the position corresponding to each lenslet, the resolution of each elemental image in SAII can equal the resolution of the sensor.

Fig. 1. Computational reconstruction in integral imaging (II).

For computational reconstruction of a 3D object [7–9] in SAII, the reverse of the pickup process is computationally simulated using geometrical ray optics. In this method, a 2D sectional image of the 3D object located at a particular distance from the sensor is reconstructed by back-propagating the elemental images through virtual pinhole arrays, as shown in Fig. 1. The ray back-propagation algorithm magnifies each elemental image according to the desired reconstruction distance. The magnified elemental images overlap on the reconstruction image plane such that the original object placed at that distance is reconstructed. The algorithm consists of two parts: shifting each magnified elemental image, and averaging the overlapped pixels in the integral image at the reconstruction distance. The 3D image reconstruction in SAII can be described as follows:

I(x,y,z_0) = \frac{1}{N_s} \sum_{i=0}^{N_x-1} \sum_{j=0}^{N_y-1} ME_{ij}\!\left(x + \frac{z_0\,p_x}{f}\,i,\; y + \frac{z_0\,p_y}{f}\,j\right),
(1)

where i and j are the indices of each elemental image, N_x and N_y are the numbers of elemental images in the x and y directions, ME_ij(·) is a magnified elemental image, p_x and p_y are the shifts of the sensor channels in the x and y directions, and N_s is the number of overlapped pixels for the magnified elemental images, as shown in Fig. 1. The total image size projected by the magnified elemental images at a reconstruction II plane is given by [p_x·(z_0/f + N_x − 1)] × [p_y·(z_0/f + N_y − 1)]. In computational SAII, a 3D image reconstruction can be obtained because the overlap between the elemental images changes with the reconstruction distance.
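The shift-and-average reconstruction of Eq. (1) can be sketched in NumPy. This is a minimal illustration under our own simplifying assumptions (nearest-pixel shifts, magnification folded into the input images, and an array layout of our choosing), not the authors' implementation:

```python
import numpy as np

def reconstruct_plane(elemental, z0, f, px, py):
    """Shift-and-average back-projection of Eq. (1) at depth z0.

    elemental: 4D array (Ni, Nj, H, W) of (already magnified) elemental images.
    z0: reconstruction distance, f: pinhole-to-sensor distance,
    px, py: sensor-channel pitch (simplified pixel units).
    """
    Ni, Nj, H, W = elemental.shape
    # Per-channel shift of each elemental image at depth z0 (argument of Eq. 1).
    sx = z0 * px / f
    sy = z0 * py / f
    out_h = H + int(round(sy * (Nj - 1)))
    out_w = W + int(round(sx * (Ni - 1)))
    acc = np.zeros((out_h, out_w))
    cnt = np.zeros((out_h, out_w))   # N_s: overlap count per output pixel
    for i in range(Ni):
        for j in range(Nj):
            y0 = int(round(sy * j))
            x0 = int(round(sx * i))
            acc[y0:y0 + H, x0:x0 + W] += elemental[i, j]
            cnt[y0:y0 + H, x0:x0 + W] += 1
    # Average the overlapped pixels (the 1/N_s factor of Eq. 1).
    return acc / np.maximum(cnt, 1)
```

Because the per-channel shift sx, sy grows with z0, only objects actually located at the chosen depth add up coherently, which is how the depth selectivity of the reconstruction arises.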

3. Synthetic aperture coherent integral imaging (SACII) system for 3D visualization of objects in scattering media

The synthetic aperture coherent integral imaging (SACII) system is proposed to sense and reconstruct objects in scattering media, as shown in Fig. 2. In the proposed SACII system, the object embedded between two scattering layers is illuminated by a coherent beam in order to three-dimensionally visualize objects in the scattering medium. The scattered elemental image set, which contains depth information from different perspectives of the 3D object, is recorded under coherent illumination. Each elemental image contains shifted interference patterns between the scattered and original object beams. In the recording plane, the scattered beam S interferes at each CCD pixel with the object field E. Therefore, the irradiance image recorded by the CCD has the following form as a function of the phase modulation:

I(\vec{r}_p)_n \propto \left( |S|^2 + |E|^2 + 2|S||E|\cos\!\left[ (\vec{k}_S - \vec{k}_E)\cdot\vec{r}_p \right] \right)_n,
(2)

where |S|² and |E|² are the scattered and object beam intensities, n is the elemental image number, r⃗p is a position vector in the elemental image, and k⃗ is the wave vector. In general, the fluctuation of |S|² is slow compared with that of |E|² due to scattering. The second term in Eq. (2) contains the perspective information of the 3D object. In this paper, we assume that the image of the 3D object between the two scattering layers is distorted. The original 3D object is recovered from its distorted perspective images by using multiple imaging channels based on integral imaging. The term 2|S||E|cos[(k⃗S−k⃗E)·r⃗p] in Eq. (2) denotes the interference pattern between the original object and scattered beams. We consider this term the primary cause of the distortion of the original object in the proposed SACII system. The set of corresponding pixels in different imaging channels can be modeled as samples of a random intensity distribution due to the random variation of the cosine term in Eq. (2). In other words, each pixel receives a scattering contribution from scattered waves with random k⃗ vectors; thus, by adding up the pixel values, the effect of the scattered waves diminishes whereas the effect of the ballistic wave adds constructively. Therefore, it can be assumed that the object distortion results from the interference of many scattered waves with different phases.

The SACII system captures N measurements through multiple imaging channels, so that the image of the scattered object at the p-th pixel position, corresponding to one point in object space, can be described as follows:

I_p^{s}(i) = I_p^{o} + w_p(i) \quad \text{for } i = 1, \ldots, N,
(3)

where I_p^s(i) and I_p^o are the scattered and original object beam intensities, respectively, and w_p(i) is a random variable following an independent and identically distributed (IID) statistical model [31]. Because w_p(i) is IID, the recorded samples I_p^s(i) are also statistically independent. To recover the original intensity of one point of the object, one can use the statistical independence of I_p^s(i) by adding the corresponding N samples of a single object point captured by N different imaging channels [see Fig. 1], so that the expectation of the cosine term in Eq. (2) diminishes to zero, given that the argument of the cosine follows a uniform distribution from −π to π [32–33]. Therefore, three-dimensional visualization of the object in the scattering medium can be obtained by shifting and summing the scattered elemental images using the computational ray back-propagation algorithm [7–9]. The original 3D information can then be presented over a number of reconstruction depths. Finally, the superimposed, or integrated, image can be written as:

I = \sum_{n=1}^{N} I(\vec{r}_p)_n \approx N\left( |S|^2 + |E|^2 \right),
(4)

where N is the total number of elemental images. According to Eq. (4), a sufficient number of elemental images allows optical imaging through the scattering medium even if the unscattered beam information is weak, because the distortion term in Eq. (2) averages out to zero, leaving the original unscattered beam.
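The averaging argument behind Eqs. (2)–(4) can be checked numerically. In this sketch the intensity values, channel count, and random seed are illustrative choices of ours; it shows that the channel average of the Eq. (2) irradiance approaches |S|² + |E|² as the cosine cross term, whose expectation is zero, averages out:

```python
import numpy as np

rng = np.random.default_rng(0)
S2, E2 = 5.0, 1.0   # illustrative |S|^2 and |E|^2 (scattering ratio of 5)
N = 10_000          # number of imaging channels (illustrative)

# Eq. (2): per-channel irradiance with a uniformly random phase argument.
phase = rng.uniform(-np.pi, np.pi, N)
I_n = S2 + E2 + 2 * np.sqrt(S2 * E2) * np.cos(phase)

# Eq. (4): the channel average approaches |S|^2 + |E|^2; the residual is
# small compared with the 2*sqrt(S2*E2) swing of a single channel.
I_avg = I_n.mean()
print(abs(I_avg - (S2 + E2)))
```

The residual of the average shrinks as 1/√N, which is why a sufficient number of elemental images is needed even when the ballistic (object) component is weak relative to the scattered background.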

Fig. 2. A schematic setup of the proposed SACII system for 3D visualization of scattered objects.

4. Experimental results

Both computer simulations and optical experiments are used to illustrate the proposed approach.

A. Computer simulations for 3D visualization of scattered objects

Computer simulation results are presented in this section, showing that the SACII system may be used to three-dimensionally visualize objects in a scattering medium. In one experiment, the object in the scattering medium was computationally recorded as an elemental image set by the SACII technique in the setup of Fig. 2. The 3D-Max computer simulation program recorded a 100×100 elemental image set, as shown in Figs. 3 and 4. The UCONN characters were located at distances z0 = 156 cm, 168 cm, 180 cm, 192 cm, and 204 cm (see Fig. 3), where z0 is the distance between the virtual pinhole array and the image plane.

Fig. 3. Experimental setup for SAII recording and reconstruction.

For scattering, the elemental images were modulated by a computer-generated scattered beam with a random phase uniformly distributed in [0, 2π], where we set the scattering ratio (the scattered beam intensity relative to the maximum of the normalized object beam intensity) to 5. The interference pattern according to Eq. (2) was generated. Then the virtual ray propagation of the scattered elemental images was calculated over the distances z0 = 156 cm, 168 cm, 180 cm, 192 cm, and 204 cm for the 3D reconstruction.
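The scattering modulation described above can be sketched as follows; the function name and interface are our own, and the per-pixel uniform random phase mirrors the simulation procedure described in the text:

```python
import numpy as np

def scatter_modulate(obj, ratio=5.0, rng=None):
    """Modulate an object-beam intensity image with a simulated scattered
    beam per Eq. (2): a uniform random phase in [0, 2*pi] per pixel, with
    the scattered intensity set to `ratio` times the object-beam maximum."""
    rng = np.random.default_rng() if rng is None else rng
    S2 = ratio * obj.max()                       # scattered-beam intensity |S|^2
    phase = rng.uniform(0.0, 2 * np.pi, obj.shape)
    # Eq. (2): |S|^2 + |E|^2 + 2|S||E|cos(phase), with obj playing |E|^2.
    return S2 + obj + 2 * np.sqrt(S2 * obj) * np.cos(phase)
```

Each elemental image in the set would receive an independent phase draw, so the cross terms decorrelate across imaging channels and average out in the reconstruction.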

Fig. 4. The elemental image set generated by computer simulation program. (a) non-scattered elemental images and (b) scattered elemental images.

Figure 4(a) shows the elemental image set generated by the computer simulation program. Figure 4(b) shows the scattered elemental image set, whose patterns appear as white noise. The mean-square error, \mathrm{MSE} = \frac{1}{N_x N_y} \sum_{x=1}^{N_x} \sum_{y=1}^{N_y} \left[ R_o(x,y,z=z_0) - R_s(x,y,z=z_0) \right]^2, between the first original and scattered elemental image data was approximately 0.113. Figure 5(a) shows the sectional images reconstructed from the elemental image set of Fig. 4(a) by using the computational SACII algorithm. Figure 5(b) shows the sectional images reconstructed from the scattered elemental image set of Fig. 4(b) by using the computational SACII algorithm. The MSE between the original (non-scattered) and restored (scattered) II data was approximately 3.8×10⁻⁶, where we calculated the MSE between the 3D images reconstructed at 180 cm. The experimental results show that the proposed SACII method can reduce the interference due to scattering in Eq. (2), so that objects in the scattering medium can be three-dimensionally restored.
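The mean-square-error figure of merit used throughout this section can be computed directly; this small helper is our own, written for illustration:

```python
import numpy as np

def mse(a, b):
    """Mean-square error between two equally sized image arrays:
    the pixel-wise squared difference averaged over Nx*Ny pixels."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.mean((a - b) ** 2)
```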

Fig. 5. The sectional images reconstructed from the elemental image set of Fig. 4 by using the computational SACII algorithm. (a) non-scattered object and (b) scattered object.
Fig. 6. The 3D images reconstructed at different distances by using the computational SACII algorithm. The first elemental images of the (a) non-scattered and (b) scattered volumetric objects were recorded by the Autodesk 3DMax© computer program. The sectional images were reconstructed at distances of (c) 550 mm and (e) 600 mm from the non-scattered elemental image set, and (d) 550 mm and (f) 600 mm from the scattered elemental image set.

We have also conducted computer simulation experiments for 3D visualization of a solid volumetric object in a scattering medium. Similarly, the 3D-Max computer simulation program recorded a 100×100 scattered elemental image set, with the scattering ratio set to 5. Figure 6 shows the experimental results of 3D image reconstruction of the scattered volumetric object. The first elemental 2D images of the non-scattered and scattered volumetric objects are shown in Figs. 6(a) and (b), respectively. Figures 6(c), (d), (e), and (f) show the sectional images reconstructed at different distances from the non-scattered and scattered elemental image sets by the computational SACII algorithm. As is evident from the simulation results, 3D visualization of a solid volumetric object behind a scattering medium is possible with the proposed SACII method even if the original object beam information is weak, i.e., at a high scattering ratio. The quality of the reconstructed image can be improved by additional image processing.

B. Optical experiments for 3D visualization of scattered objects

In this section, optical experiments for 3D visualization of scattered objects by the proposed SACII method are presented. To evaluate the SACII system, we recorded the object's elemental image set with a 2028×2044 image sensor array with a pixel size of 9 µm × 9 µm, moving the image sensor array at an interval of 0.1 mm. A transparent film, shown in Fig. 7(a), was used as the test object; the round features in the film are about 4 mm in diameter. The test object was sandwiched between two conventional optical diffusers to obtain the scattered image, with a gap of approximately 5 mm between the diffusers. An imaging lens was used in place of the virtual pinholes in the experiments. The distance between the test object and the CCD was approximately 220 mm. Each imaging channel in the SACII system captured a 2D scattered elemental image containing directional information about the object, and 441 (21×21) elemental images were generated. An argon laser with a central wavelength of 514 nm was used to illuminate the object located between the two scattering layers.

Fig. 7. The first (a) non-scattered and (b) scattered elemental images.
Fig. 8. The data distributions of the first (a) non-scattered and (b) scattered elemental images in the area marked in white in Fig. 7, where 500 pixel points were randomly selected over the area.

Figures 7(a) and (b) show the first non-scattered and scattered elemental images of the 21×21 elemental image sets recorded in the optical experiments. The MSE between the original and scattered elemental image data in the area marked in white in Fig. 7 was approximately 0.104. Figure 8 shows the data distributions of the first non-scattered and scattered elemental images in the area marked in white in Fig. 7, where 500 pixels were randomly selected over the area. The standard deviations of the non-scattered and scattered elemental image data were approximately 9.073 and 86.086, respectively.

Fig. 9. The sectional images reconstructed at different distances from the (a) non-scattered and (b) scattered elemental image sets by using the SACII algorithm.
Fig. 10. The data distributions of the (a) original II and (b) restored II data in the area marked in white in Fig. 9, where 500 pixel points were randomly selected over the area.

For 3D visualization of the scattered object, multiple layers of images were reconstructed at depths z0 = 130, 150, and 170 mm by using the virtual ray propagation algorithm, where z0 is the distance between the virtual pinhole array and the reconstructed image plane. Figure 9 shows the sectional images reconstructed from the non-scattered and scattered elemental image sets by the SACII algorithm. The MSE between the original and restored II data in the area marked in white in Fig. 9 was approximately 0.017, where we calculated the MSE between the sectional images reconstructed at z0 = 150 mm. Figure 10 shows the data distributions of the original II and restored II data in the area marked in white in Fig. 9, where 500 pixel points were randomly selected over the area. The distortion indexes (ratio of standard deviation to mean) for the original and restored II data were approximately 0.0618 and 0.0398, respectively. The standard deviation of the restored II data is much less than that of the scattered elemental image in Fig. 8(b), which illustrates that the proposed SACII method averages out the modulation terms due to scattering in Eq. (2) and can visualize 3D objects in the scattering medium. We believe that the experimental results support the concept proposed in this paper.
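The distortion index quoted above (ratio of standard deviation to mean over the sampled region) can be computed in a few lines; this helper and its name are our own, for illustration:

```python
import numpy as np

def distortion_index(region):
    """Ratio of standard deviation to mean over a sampled image region,
    the figure of merit quoted for the original and restored II data."""
    region = np.asarray(region, dtype=float)
    return region.std() / region.mean()
```

A smaller index indicates a flatter, less noisy intensity distribution, consistent with the drop from 0.0618 to 0.0398 reported for the restored data.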

5. Conclusions

In this paper, we have proposed 3D visualization of objects in a scattering medium by use of computational integral imaging. Using SAII techniques, the scattered elemental images of a 3D object are captured from different perspectives; this is accomplished by moving an image sensor array. The sectional images of the original object were numerically reconstructed from the scattered elemental image set by back-propagating rays to an arbitrary plane with a virtual wave of a given wavelength. The proposed concept allows 3D imaging and visualization in a scattering medium by averaging out the distortion patterns induced by the cross-interference of the scattering and object beams.

We have presented computer simulations and optical experiments to verify the proposed method, and analyzed the reconstructed 3D image data by measuring the statistical characteristics of the original and restored II data. It was shown in the experiments that the object in the scattering medium can be three-dimensionally visualized by the proposed SACII method if a sufficient number of elemental images is obtained.

Acknowledgments

This work has been supported in part by Defense Advanced Research Projects Agency (DARPA). The authors wish to thank the anonymous reviewers for their comments and suggestions and Mr. Myungjin Cho for his assistance with experiments.

References and links

1. G. Lippmann, "La photographie intégrale," Compte-Rendus 146, 446–451 (1908).
2. A. P. Sokolov, ed., Autostereoscopy and Integral Photography by Professor Lippmann's Method (Moscow State Univ. Press, 1911).
3. H. E. Ives, "Optical properties of a Lippmann lenticulated sheet," J. Opt. Soc. Am. 21, 171–176 (1931). [CrossRef]
4. Y. Igarishi, H. Murata, and M. Ueda, "3D display system using a computer-generated integral photograph," Jpn. J. Appl. Phys. 17, 1683–1684 (1978). [CrossRef]
5. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, "Analysis of resolution limitation of integral photography," J. Opt. Soc. Am. A 15, 2059–2065 (1998). [CrossRef]
6. R. Martinez, A. Pons, G. Saavedra, M. Martinez-Corral, and B. Javidi, "Optically-corrected elemental images for undistorted integral image display," Opt. Express 14, 9657–9663 (2006). [CrossRef]
7. H. Arimoto and B. Javidi, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett. 26, 157–159 (2001). [CrossRef]
8. A. Stern and B. Javidi, "3-D computational synthetic aperture integral imaging (COMPSAII)," Opt. Express 11, 2446–2451 (2003). [CrossRef] [PubMed]
9. B. Tavakoli, B. Javidi, and E. Watson, "Three dimensional visualization by photon counting computational integral imaging," Opt. Express 16, 4426–4436 (2008). [CrossRef] [PubMed]
10. B. Javidi, I. Moon, and S. Yeom, "Three-dimensional identification of biological microorganism using integral imaging," Opt. Express 14, 12096–12108 (2006). [CrossRef] [PubMed]
11. A. Stern and B. Javidi, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE 94, 591–607 (2006). [CrossRef]
12. A. Castro, Y. Frauel, and B. Javidi, "Integral imaging with large depth of field using an asymmetric phase mask," Opt. Express 15, 10266–10273 (2007). [CrossRef] [PubMed]
13. O. Matoba, E. Tajahuerce, and B. Javidi, "Real-time three-dimensional object recognition with multiple perspectives imaging," Appl. Opt. 40, 3318–3325 (2001). [CrossRef]
14. Y. S. Hwang, S.-H. Hong, and B. Javidi, "Free view 3-D visualization of occluded objects by using computational synthetic aperture integral imaging," J. Display Technol. 3, 64–70 (2007). [CrossRef]
15. J. W. Goodman and R. W. Lawrence, "Digital image formation from electronically detected holograms," Appl. Phys. Lett. 11, 77–79 (1967). [CrossRef]
16. L. Martínez-León and B. Javidi, "Synthetic aperture single-exposure on-axis digital holography," Opt. Express 16, 161–169 (2008). [CrossRef] [PubMed]
17. Y. Frauel, T. Naughton, O. Matoba, E. Tajahuerce, and B. Javidi, "Three dimensional imaging and display using computational holographic imaging," Proc. IEEE 94, 636–654 (2006). [CrossRef]
18. I. Moon and B. Javidi, "Volumetric 3D recognition of biological microorganisms using multivariate statistical method and digital holography," J. Biomed. Opt. 11, 064004 (2006). [CrossRef]
19. T. Kreis, ed., Handbook of Holographic Interferometry (Wiley, 2005).
20. W. Osten, T. Baumbach, and W. Juptner, "Comparative digital holography," Opt. Lett. 27, 1764–1766 (2002). [CrossRef]
21. L. Yu and Z. Chen, "Improved tomographic imaging of wavelength scanning digital holographic microscopy by use of digital spectral shaping," Opt. Express 15, 878–886 (2007). [CrossRef] [PubMed]
22. L. Yu and Z. Chen, "Digital holographic tomography based on spectral interferometry," Opt. Lett. 32, 3005–3007 (2007). [CrossRef] [PubMed]
23. J. H. Massig, "Digital off-axis holography with a synthetic aperture," Opt. Lett. 27, 2179–2181 (2002). [CrossRef]
24. L. Yu and M. K. Kim, "Wavelength-scanning digital interference holography for tomographic 3D imaging using the angular spectrum method," Opt. Lett. 30, 2092–2094 (2005). [CrossRef] [PubMed]
25. R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, "Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools," J. Display Technol. 1, 321–327 (2005). [CrossRef]
26. B. Javidi, S. H. Hong, and O. Matoba, "Multidimensional optical sensor and imaging system," Appl. Opt. 45, 2986–2994 (2006). [CrossRef] [PubMed]
27. T. Okoshi, ed., Three-Dimensional Imaging Techniques (Academic, 1976).
28. M. Levoy, "Light fields and computational imaging," IEEE Computer 39, 46–55 (2006). [CrossRef]
29. B. Javidi and F. Okano, eds., Three-Dimensional Television, Video, and Display Technologies (Springer, 2002).
30. J. Rosen and D. Abookasis, "Seeing through biological tissues using the fly eye principle," Opt. Express 11, 3605–3611 (2003). [CrossRef] [PubMed]
31. N. Mukhopadhyay, ed., Probability and Statistical Inference (Marcel Dekker, 2000).
32. N. C. Gallagher, "Optimum quantization in digital holography," Appl. Opt. 17, 109–115 (1978). [CrossRef] [PubMed]
33. P. Réfrégier, ed., Noise Theory and Application to Physics: From Fluctuations to Information (Springer, 2004).

OCIS Codes
(110.6150) Imaging systems : Speckle imaging
(110.6880) Imaging systems : Three-dimensional image acquisition
(170.3880) Medical optics and biotechnology : Medical and biological imaging

ToC Category:
Imaging Systems

History
Original Manuscript: March 19, 2008
Revised Manuscript: May 29, 2008
Manuscript Accepted: July 18, 2008
Published: August 12, 2008

Virtual Issues
Vol. 3, Iss. 10 Virtual Journal for Biomedical Optics

Citation
Inkyu Moon and Bahram Javidi, "Three-dimensional visualization of objects in scattering medium by use of computational integral imaging," Opt. Express 16, 13080-13089 (2008)
http://www.opticsinfobase.org/vjbo/abstract.cfm?URI=oe-16-17-13080


