Virtual Journal for Biomedical Optics

  • Editor: Gregory W. Faris
  • Vol. 3, Iss. 4 — Apr. 23, 2008

Three dimensional visualization by photon counting computational Integral Imaging

Behnoosh Tavakoli, Bahram Javidi, and Edward Watson


Optics Express, Vol. 16, Issue 7, pp. 4426-4436 (2008)
http://dx.doi.org/10.1364/OE.16.004426


Abstract

In this paper, we present three dimensional (3D) object reconstruction using photon-counted elemental images acquired by a passive 3D Integral Imaging (II) system. The maximum likelihood (ML) estimator is derived to reconstruct the irradiance of the 3D scene pixels, and the reliability of the estimator is described by confidence intervals. For applications in photon-starved environments, our proposed technique provides 3D reconstruction for better visualization as well as a significant reduction in the computational burden and the bandwidth required for transmission of integral images. The performance of the reconstruction is illustrated qualitatively and compared quantitatively with the Peak Signal-to-Noise Ratio (PSNR) criterion.

© 2008 Optical Society of America

1. Introduction

There has been growing interest in three-dimensional (3D) imaging systems recently [1–4], and one of the promising methods for 3D sensing and visualization is Integral Imaging (II) [5–8], based on Integral Photography [8,9]. In this technique, in addition to irradiance, the directional information of the rays is recorded by acquiring two-dimensional images from different perspectives of the scene. In particular, in Synthetic Aperture Integral Imaging (SAII) systems [11], an imaging device scans a planar grid in order to capture high resolution 2D images. These elemental images are used for reconstruction, which can be performed optically [12,13] or computationally by applying the inverse procedure of the recording [6,7,11]. The sensitivity of SAII reconstruction results to pickup position uncertainty is studied in [14]. The application of the II system has been extended to object recognition, depth estimation, occlusion removal and multiple viewing point generation [15–19]. Multiple image acquisition is also applied to retrieve high spatial-resolution information from low resolution elemental images [20,21].

In some applications, low illumination levels can lead to small irradiances at the image plane. Photon-counting imaging has various applications, such as night vision [22–24], laser radar imaging [27,28], single photon emission tomography [29] and astronomical imaging [30]. Photon-counting imaging systems in general require less received power than conventional imaging systems that generate gray-scale irradiance images. The computational burden for processing photon-counted images with a very low number of photons is much less than for processing regular gray scale images, and such images can potentially be compressed with a higher compression ratio, consequently requiring less bandwidth for transmission.

There are numerous techniques for estimating an irradiance image from the counts of photon detectors in the two dimensional case. The approaches are based on the statistical model of light fluctuation measurements, which states that given practical assumptions on the imaging conditions, the photo-count statistics follow a Poisson distribution [31]. Therefore the problem of reconstructing photon-counted images is equivalent to a Poisson inverse problem [32].

Object recognition and classification for 3D objects using photon counting integral imaging is explored in [33]. In this paper, we study the Maximum Likelihood (ML) irradiance estimation of 3D objects using photon-counted elemental images captured by an SAII system, and the results are compared with the computational reconstruction using gray scale irradiance elemental images. The sections of this paper are as follows. Section 2 is an overview of SAII and computational reconstruction. In Section 3 a model for photon-counted images is described, and Section 4 presents maximum likelihood estimation of 3D objects' irradiance, where the uncertainty surrounding the estimate is expressed with confidence intervals. Section 5 shows the experimental results and Section 6 is the conclusion.

2. Three dimensional imaging and computational reconstruction

Synthetic Aperture Integral Imaging (SAII) is a three dimensional passive imaging technique in which, in addition to the irradiance, the directional information of the rays is acquired. For this purpose, an imaging device such as a digital camera is moved across a synthetic aperture to capture 2D images from different perspectives of the scene. The synthetic aperture is a planar grid with defined separations in the x-y directions, while the normal to the grid is parallel to the optical axis of the camera lens. The output is a series of high resolution two dimensional images called elemental images, which are used for reconstruction. The pickup process of SAII is illustrated in Fig. 1, in which a simple model of the 3D scene is chosen, consisting of two planar objects located at two different distances from the pick up plane.

Fig. 1. Pickup process of three dimensional synthetic aperture integral imaging. The imaging device (Lens and sensor) translates on the pick up plane to capture elemental images.

The lens, along with its sensor, translates on the grid with separations Sx and Sy in the x and y directions, respectively, in order to capture the elemental images. An array of these elemental images and the reconstruction at the two planes where the objects are located is shown in Fig. 2.

One possible approach to computational 3D II reconstruction is to simulate the reverse of the pick up process using geometrical optics. In this method, a 2D plane of the 3D scene located at a particular distance is reconstructed by back propagating the elemental images to that distance through simulated pinhole arrays. The back projection process consists of magnifying each elemental image with respect to the distance of the desired reconstruction plane and shifting it according to the location of its associated imaging device on the pick up plane. The magnified elemental images overlap on the desired plane such that the objects originally located at that distance are reconstructed properly and appear in focus, while other objects become smeared. The full 3D scene is the collection of the reconstructions at all of the 2D planes.

Magnifying a large number of high resolution elemental images and handling them is a computationally intensive process. For an object located at distance z=z0 from the lens of the imaging device, the magnification factor is M0=z0/g, where g is the distance between the pick up grid and the image plane [see Fig. 1]. Thus an alternative approach to computational reconstruction that avoids explicit magnification can be applied. It is illustrated in Fig. 2 that if the camera shifts by Sx, the image of the object located at z=z0 shifts by Sx/M0. Thus, the objects are reconstructed by shifting the elemental images opposite to the direction of the image shift due to camera motion. This is equivalent to shrinking the pickup grid by the factors Sx/M0 and Sy/M0 in the x and y directions, respectively.

Fig. 2. Arrangement of elemental images at the pickup plane (left); schematic of the II reconstruction process at two different distances (right).

The final reconstruction plane consists of the partial overlap of the shifted elemental images, expressed as follows:

$$I(x,y,z_0)=\frac{1}{KL}\sum_{k=0}^{K-1}\sum_{l=0}^{L-1} I_{kl}\!\left(x+\frac{1}{M_0}S_x k,\; y+\frac{1}{M_0}S_y l\right),$$
(1)

in which subscripts k and l indicate the location of the elemental image, Ikl, in the pickup grid. Notice that the size of objects reconstructed with this method is 1/M0 of their actual size; if the actual size is required, the reconstructed plane should be magnified.
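As a concrete illustration, the shift-and-average reconstruction of Eq. (1) can be sketched in a few lines of NumPy. This is a simplified sketch, not the authors' code: it assumes the elemental images are stored in a dict keyed by grid position (k, l), and it uses a circular shift for brevity where a real implementation would zero-pad and track the per-pixel overlap count.

```python
import numpy as np

def reconstruct_plane(elemental, z0, g, sx_px, sy_px):
    """Shift-and-average reconstruction of one depth plane, per Eq. (1).

    elemental     : dict mapping (k, l) -> 2D array (all the same shape)
    z0, g         : object distance and grid-to-sensor distance (same units)
    sx_px, sy_px  : camera pitch Sx, Sy expressed in sensor pixels
    """
    m0 = z0 / g  # magnification factor M0 = z0 / g
    shape = next(iter(elemental.values())).shape
    acc = np.zeros(shape)
    for (k, l), img in elemental.items():
        # each elemental image is shifted by (Sx*k/M0, Sy*l/M0);
        # np.roll applies a circular shift, which keeps the sketch short
        dy = int(round(sy_px * l / m0))
        dx = int(round(sx_px * k / m0))
        acc += np.roll(img, (-dy, -dx), axis=(0, 1))
    return acc / len(elemental)
```

Objects at distance z0 add coherently under these shifts and stay in focus, while objects at other depths are smeared by the averaging.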

3. Photon-counting detection model

The association of the irradiance image with the photon-counted image is studied in order to estimate the irradiance from photon counts. Each photon of the light carries energy hν, where h is Planck's constant (6.6262×10⁻³⁴ J·s) and ν is the mean frequency of the quasi-monochromatic light source. Suppose that the energy incident on one pixel of the photosurface during the time ΔT is Ex; then the mean number of photons detected during this time interval can be expressed as (ηEx)/(hν), where η≤1 is the sensor quantum efficiency and represents the average number of photoelectrons generated by each incident photon [31].

The incident energy is the integrated irradiance over the time interval ΔT, i.e. Ex=IxΔT. Consequently the irradiance is proportional to the mean number of photoevents. On the other hand, the statistical properties of the photoevents show that the number of photons detected by the photosurface follows a Poisson density function when the time interval ΔT is smaller than the coherence time of the light and the pixel area is smaller than the coherence area of the incident light [31]. However, in the practical cases of our interest, neither of the above conditions holds, i.e. the pixel area and exposure time are significantly larger than the coherence area and coherence time of the passive illumination. Nevertheless, for polarized thermal illumination with high degrees of coherence freedom, the degeneracy parameter approaches zero, so the probability of detecting Cx photons at pixel x during the exposure time, given the irradiance Ix, follows the Poisson distribution expressed in Eq. (2) [31].

Since Eq. (2) is a realistic model for passive integral imaging, we simulate the photon limited images from the irradiance images according to this probability density function with a constraint on the total number of photons detected by each sensor.

$$\Pr(C_x\,|\,I_x)=\frac{[I_x]^{C_x}e^{-I_x}}{C_x!},\qquad C_x=0,1,2,\ldots$$
(2)

Assume that Ix is the normalized irradiance at pixel x such that $\sum_{x=1}^{N_T} I_x = 1$, where NT is the total number of pixels of the image. In order to simulate a photon-counted image that has Np photons on average, a Poisson random number Cx with mean parameter NpIx is generated, Cx|Ix ~ Poisson(NpIx). Equation (3) confirms that the expected number of photons in the generated image is Np, so the normalization is required to meet the constraint on the number of photons per image.

$$N_p=E\!\left[\sum_{x=1}^{N_T}C_x\right]=\sum_{x=1}^{N_T}N_pI_x=N_p\sum_{x=1}^{N_T}I_x$$
(3)
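The simulation step described above can be sketched directly; the function below is an illustrative sketch (not the authors' code) that normalizes a gray-scale image and draws one Poisson count per pixel with mean Np·Ix, so the expected total count equals Np as in Eq. (3).

```python
import numpy as np

def photon_count_image(irradiance, n_p, rng=None):
    """Simulate a photon-counted image from a gray-scale irradiance image.

    The irradiance is normalized so its pixels sum to 1, then each pixel's
    count is drawn as Poisson(n_p * I_x); by Eq. (3) the expected total
    number of photons in the generated image is n_p.
    """
    rng = np.random.default_rng() if rng is None else rng
    ix = irradiance / irradiance.sum()  # normalization: sum_x I_x = 1
    return rng.poisson(n_p * ix)
```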

4. Three Dimensional reconstruction using photon-counted elemental images

Assume that an object is being imaged with an SAII system and that the image of a single pixel of that object, located at distance z0 from the pick up grid, is captured at pixel p≡(x,y) of the first sensor. As discussed in Section 2, the image of this object pixel appears in the elemental images at the following positions:

$$\{p+\Delta p_{kl}\}\equiv\left\{\left(x+\frac{S_xg}{z_0}k,\;y+\frac{S_yg}{z_0}l\right)\right\}\quad\text{for }k=0,\ldots,K-1,\;l=0,\ldots,L-1$$
(4)

If the intensities falling on all such pixels of the irradiance elemental images are assumed to be equal, i.e. $I_{kl}(p+\Delta p_{kl})=I_p^{z_0}$, then the values of these pixels in the photon-counted elemental images obtained according to Eq. (2) represent an ensemble of realizations of Poisson random variables with mean $N_pI_p^{z_0}$, denoted by $\{C_{kl}(p+\Delta p_{kl})\}$, where Np is constant and equal to the expected number of photons per elemental image. Our purpose is to estimate the irradiance of each pixel of the 3D object based on this set of photon counts. The likelihood for the hypothesis $I_p^{z_0}$ can be calculated as follows:

$$L\!\left(I_p^{z_0}\,|\,\{C_{kl}(p+\Delta p_{kl})\}\right)=\prod_{k=0}^{K-1}\prod_{l=0}^{L-1}\Pr\!\left(C_{kl}(p+\Delta p_{kl})\,|\,I_p^{z_0}\right)$$
(5)

Then the log likelihood is:

$$l(I_p^{z_0})=\sum_{k=0}^{K-1}\sum_{l=0}^{L-1}\log\!\left[\Pr\!\left(C_{kl}(p+\Delta p_{kl})\,|\,I_p^{z_0}\right)\right]$$
$$C_{kl}\,|\,I_p^{z_0}\sim\mathrm{Poisson}(N_pI_p^{z_0})$$
$$l(I_p^{z_0})=\sum_{k=0}^{K-1}\sum_{l=0}^{L-1}\left(-N_pI_p^{z_0}+C_{kl}(p+\Delta p_{kl})\log(N_pI_p^{z_0})-\log\!\left(C_{kl}(p+\Delta p_{kl})!\right)\right)$$
(6)

The Maximum Likelihood (ML) estimate is obtained by setting $\partial l(I_p^{z_0})/\partial I_p^{z_0}=0$, which leads to

$$\mathrm{MLE}\{I_p^{z_0}\}=\tilde I_p^{z_0}=\frac{1}{N_pKL}\sum_{k=0}^{K-1}\sum_{l=0}^{L-1}C_{kl}(p+\Delta p_{kl})$$
(7)

So the ML estimate of the irradiance, $\tilde I_p^{z_0}$, is proportional to the average of the corresponding observed samples in the elemental images. This average is the minimal sufficient statistic of independent identically distributed Poisson random variables, and it is a standard statistical result that it is the Uniformly Minimum Variance Unbiased Estimator (UMVUE) for the mean of Poisson data [34].

Alternatively, the computational reconstruction of SAII can be explained point by point: in order to reconstruct one point of the object located at a specific distance, the elemental images are shifted so that all the pixels containing the image of that object point overlap, and the average of the intensities of these overlapped pixels is the reconstructed point. Using the notation in Eq. (4), the computational reconstruction of Eq. (1) can be written as $I_p^{z_0}=\frac{1}{KL}\sum_{k=0}^{K-1}\sum_{l=0}^{L-1}I_{kl}(p+\Delta p_{kl})$, and it can be seen that the ML estimate of the irradiance for a single point has the same form as the computational reconstruction of that point. This means that the computational reconstruction of one plane of the scene at a specific distance, using photon-counted elemental images, is also the ML irradiance estimate of the scene at that plane. Thus, from photon-counted elemental images, the irradiance of the three dimensional objects can be estimated.

It should be noted that in practical cases, the sensors see the scene from different perspectives, and a particular object pixel may not appear inside the field of view of all sensors or might be covered by other objects in some perspectives. As a result, the number of samples of the Poisson random variable may not be exactly equal to the number of elemental images for all object pixels. However, as long as the number of samples remains above 30, it can be considered within the large sample inference domain [34]. In our experiments we assume that this condition holds for all object pixels.

The reliability of an estimator is indicated through its confidence interval, which bounds the estimation error [34]. The estimation error is defined as the absolute difference between an estimated parameter and its actual value. Smaller confidence intervals mean smaller estimation error and therefore a more reliable estimator. The probability that the estimation error is within the bounds determined by the confidence interval is assigned to be 1−α, where α∈(0,1) is chosen according to the desired accuracy. Appendix A shows how one obtains the following expression for the confidence interval of the irradiance $I_p^{z_0}$:

$$\tilde I_p^{z_0}\pm z_{\alpha/2}\,(KLN_p)^{-1/2}\sqrt{\tilde I_p^{z_0}},$$
(8)

where KL is the total number of elemental images, and z_{α/2} is the upper 100(α/2)% point of the standard normal distribution, known for each desired α as shown in Appendix A.

Equation (8) indicates that increasing the number of photons, Np, or the number of elemental images, KL, shrinks the confidence interval, which is equivalent to a decrease in the estimation uncertainty. Conversely, a decrease in the number of photons can be compensated by increasing the number of elemental images.
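The interval of Eq. (8) is easy to evaluate numerically; the sketch below (illustrative, not the authors' code) uses the standard normal inverse CDF from Python's standard library to obtain z_{α/2}.

```python
from statistics import NormalDist

def confidence_interval(i_hat, kl, n_p, alpha=0.05):
    """Large-sample confidence interval for the irradiance estimate, Eq. (8).

    i_hat : ML irradiance estimate for the pixel
    kl    : total number of elemental images (K*L)
    n_p   : expected photons per elemental image
    alpha : 1 - coverage level (0.05 gives a 95% interval)
    """
    z = NormalDist().inv_cdf(1 - alpha / 2)       # upper alpha/2 point
    half_width = z * (i_hat / (kl * n_p)) ** 0.5  # z * sqrt(I~ / (KL*Np))
    return i_hat - half_width, i_hat + half_width
```

Doubling either KL or Np shrinks the half-width by a factor of √2, which matches the compensation argument above.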

Another significant factor affecting the irradiance estimation is the number of object pixels, or the total number of pixels per elemental image, NT. Suppose two objects that are identical except in size are imaged separately while the number of photons per elemental image is the same for both. Intuitively, we expect more error in the irradiance estimation of the larger object. This fact is explained by the confidence intervals as follows. The larger object has more pixels, and consequently, to meet the normalization condition of Section 3, i.e. $\sum_{x=1}^{N_T}I_x=1$, the normalized pixels of the larger object have smaller values compared with the smaller object. Assume that the normalized irradiance of a single pixel of the large object is $I_p^{z_0}/\gamma$; then the associated pixels in the photon-counted elemental images follow a Poisson distribution with mean $N_pI_p^{z_0}/\gamma$, so the confidence interval of the irradiance of that pixel is computed as:

$$\tilde I_p^{z_0}\pm z_{\alpha/2}\,(KLN_p)^{-1/2}\sqrt{\gamma\,\tilde I_p^{z_0}}$$
(9)

As the number of object pixels increases, γ increases, and according to Eq. (9) the confidence interval expands and the irradiance estimation error increases. As a result, to obtain the same accuracy in the irradiance estimation of the large object as of the small object, we need to increase the number of photons per elemental image or, equivalently, the number of captured elemental images.

5. Experimental results

We compare the 3D computational reconstruction using the irradiance elemental images of the SAII experiment with the reconstruction using the corresponding photon-counted elemental images, both qualitatively and quantitatively.

The experimental scene is composed of two toy cars and a model helicopter located at different distances from the sensor plane. The closest and farthest objects are located 24 cm and 40 cm away from the center of the pickup grid. The scene is illuminated with diffused incoherent light. The imaging device is a digital camera with a focal plane array size of 22.7×15.6 mm and a 10 µm pixel pitch. The effective focal length of the camera lens is about 20 mm. The camera is shifted with equal separations of 5 mm in both the x and y directions on a planar grid. The size of the synthetic aperture is 80×80 mm², and 16×16 elemental images are captured.

Figure 3 shows the 3D reconstruction of the scene at the two distances of the objects according to Eq. (1) using the 256 irradiance elemental images. As is clear, at each distance one of the objects is in focus while the others appear washed out.

According to Section 3, we generate photon-counted elemental images from the grayscale irradiance images captured experimentally, with a constraint on the total number of photons per elemental image. Consistent with the ML estimator of Section 4, the irradiance of the 3D scene pixels is reconstructed at different distances using these photon-counted elemental images.

Fig. 3. Reconstruction using irradiance elemental images, (a) reconstruction at z=240 mm, (b) reconstruction at z=360 mm.

In Fig. 4, two samples of the photon-counted elemental images are presented, with expected numbers of photons Np=10³ and Np=10⁵ in parts (a) and (d) respectively, while the corresponding irradiance image is captured from the center of the pick up grid. Clearly, recognizing the objects visually is not trivial in the 2D image with Np=10³. The figure illustrates that the objects become recognizable after 3D computational reconstruction using all the gathered 2D photon-counted elemental images, and the results can be compared qualitatively with the reconstruction using irradiance elemental images presented in Fig. 3. The movies of reconstruction from z=24 cm to z=40 cm are presented in Fig. 5, which shows that the objects come into focus at their corresponding distances even when using elemental images with a very low number of photons.

Fig. 4. Reconstruction using photon-counted elemental images, (a) central elemental image with Np=10³, (b) corresponding reconstruction at z=240 mm, (c) reconstruction at z=360 mm, (d) central elemental image with Np=10⁵, (e) corresponding reconstruction at z=240 mm, (f) reconstruction at z=360 mm.
Fig. 5. Movie of reconstruction from z=240 mm to z=360 mm using photon-counted elemental images with (a) Np=10³ (movie file size: 442 KB) and (b) Np=10⁵ (movie file size: 613 KB). [Media 1][Media 2]

We use the Peak Signal-to-Noise Ratio (PSNR) to quantitatively compare the computational reconstruction using the original grayscale elemental images with the reconstructions using photon-counted elemental images, as given in Eq. (10), where MSE is the Mean Square Error, which provides an estimate of the average error per reconstructed pixel of the 3D scene, and Imax is the maximum irradiance of the gray scale elemental images.

$$\mathrm{PSNR}=10\log_{10}\!\left(\frac{I_{\max}}{\mathrm{MSE}(I,\tilde I)}\right),$$
(10)
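The metric of Eq. (10) can be computed in a couple of lines; the sketch below (illustrative, not the authors' code) follows the equation as printed, taking Imax as the maximum of the reference gray-scale image.

```python
import numpy as np

def psnr(reference, estimate):
    """PSNR between a gray-scale reconstruction and a photon-counting one.

    Follows Eq. (10): 10*log10(Imax / MSE), with Imax the maximum
    irradiance of the reference (gray-scale) reconstruction.
    """
    mse = np.mean((reference - estimate) ** 2)
    return 10.0 * np.log10(reference.max() / mse)
```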

Figure 6 shows that, as anticipated, with an increasing expected number of photons per elemental image, Np, the error decreases and as a result the PSNR increases. The increase of the PSNR is consistent with the confidence intervals of Eq. (8), which show that the estimation error is proportional to $1/\sqrt{N_p}$; consequently, we expect the PSNR to be proportional to $\log(N_p)$.

Fig. 6. PSNR of the reconstruction with photon-counted elemental images versus the total number of photons per elemental image

In the photon-counted image generation, the pixel values of the normalized irradiance images are in the range [0, 2.5×10⁻⁵], so with Np=10³ the probability of counting more than one photon per pixel is less than 0.03%, i.e. Pr(C≥2|Imax)<3×10⁻⁴, as calculated from Eq. (2). Therefore, we can assume to a good approximation that the count for each pixel is either zero or one. Thus, with little error, we treat photon-counted images with a very low number of photons as binary images.
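The 0.03% figure can be checked directly from the Poisson model. The snippet below is a small check (our own illustration, not part of the paper's code): for a Poisson count with mean λ = Np·Imax, the probability of seeing two or more photons is 1 − e^{−λ}(1 + λ).

```python
import math

def prob_two_or_more(n_p, i_max):
    """Probability of counting 2+ photons at the brightest pixel, per Eq. (2)."""
    lam = n_p * i_max                       # Poisson mean at that pixel
    return 1.0 - math.exp(-lam) * (1.0 + lam)
```

With Np=10³ and Imax=2.5×10⁻⁵, λ=0.025 and the probability is about 3×10⁻⁴, i.e. under 0.03%, supporting the binary-image approximation.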

Accordingly, a possible performance criterion is the comparison of the reconstructions using binarized photon-counted images with the reconstructions using elemental images binarized by Otsu's thresholding method [35]. This method finds the threshold such that the intraclass variance of the black and white pixels approaches a minimum. The reconstruction using Otsu's method at z=240 mm is shown in Fig. 7, in which it can be seen that the details of the image are lost. This result is compared with the reconstruction using photon-counted and irradiance elemental images, respectively, in Fig. 8. For quantitative comparison we have computed the PSNR at different planes of the reconstructed image resulting from these binarized elemental images. We find the PSNR is approximately 5 dB smaller than the PSNR using photon-counted elemental images with Np=10³, shown in Fig. 6. It is interesting to note that using a 16×16 array of elemental images, 1000 photo-counts per elemental image (on average), and a mean illumination wavelength of 500 nm, the total received energy would only be approximately 10⁻¹⁶ J.

Fig. 7. (a) Binarized central irradiance elemental image obtained by thresholding, (b) corresponding reconstruction at z=240 mm.
Fig. 8. Cropped image of the reconstruction at z=240 mm, (a) using binary elemental images, (b) photon-counted elemental images with Np=10³, (c) grayscale irradiance elemental images.

6. Conclusion

In this paper, we have proposed irradiance estimation of three dimensional objects using photon-counted elemental images obtained by an SAII system. The irradiance at different distances from the pick up plane is estimated using the Maximum Likelihood estimator, where the photon counts follow a Poisson distribution with mean parameter proportional to the desired irradiance. Confidence intervals are used to investigate the reliability of the estimator; they show that the parameters of the imaging system can be modified to reach the desired accuracy of the irradiance estimation. It is shown that the computational reconstruction of 3D objects in integral imaging systems coincides with Maximum Likelihood irradiance estimation using photon-counted elemental images. The qualitative results of the irradiance estimation illustrate that visualization improves using multi-perspective photon-counted images. The performance is studied quantitatively using the PSNR, which increases with the number of photons per elemental image. Photon-counted images are the result of reduced input light, or can be simulated from the irradiance values using the Poisson model for photon events. In either case, we may take advantage of the low number of photons generated by the photon-counting detector to improve the computational efficiency of data processing.

Appendix A

The ensemble of realizations of Poisson random variables with mean $N_pI_p^{z_0}$ is denoted by $\{C_{kl}(p+\Delta p_{kl})\}$, where $I_p^{z_0}$ is the irradiance of one pixel of the 3D object located at z0.

$$C_{kl}\,|\,I_p^{z_0}\sim\mathrm{Poisson}(N_pI_p^{z_0})\quad\text{for }k=0,\ldots,K-1;\;l=0,\ldots,L-1$$
(A.1)

The size of this ensemble is equal to the number of elemental images, KL. By applying the Central Limit Theorem to these KL independent identically distributed random variables we have:

$$\frac{\sqrt{KL}\left(N_p\tilde I_p^{z_0}-N_pI_p^{z_0}\right)}{\sqrt{N_p\tilde I_p^{z_0}}}\rightarrow N(0,1)\quad\text{as }KL\rightarrow\infty$$
(A.2)

As a result, for the pre-assigned α∈(0,1), we claim that:

$$\Pr\!\left\{\frac{\sqrt{KLN_p}\,\left|\tilde I_p^{z_0}-I_p^{z_0}\right|}{\sqrt{\tilde I_p^{z_0}}}<z_{\alpha/2}\right\}\approx 1-\alpha$$
(A.3)

which leads to the following confidence interval for Ipz0 :

$$\tilde I_p^{z_0}\pm z_{\alpha/2}\,(KLN_p)^{-1/2}\sqrt{\tilde I_p^{z_0}},$$
(A.4)

where z_{α/2} is the upper 100(α/2)% point of the standard normal distribution, obtained for each desired α as follows:

$$\int_{z_{\alpha/2}}^{\infty}\frac{1}{\sqrt{2\pi}}\exp\!\left(-\frac{x^2}{2}\right)dx=\frac{\alpha}{2}$$
(A.5)
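Numerically, z_{α/2} need not be read from a table; as an illustration (our own snippet, not part of the paper), Python's standard library exposes the standard normal inverse CDF, which solves Eq. (A.5) directly.

```python
from statistics import NormalDist

def z_upper(alpha):
    """Upper 100(alpha/2)% point of N(0,1), i.e. the z in Eq. (A.5)."""
    return NormalDist().inv_cdf(1 - alpha / 2)
```

For example, α=0.05 gives the familiar z_{α/2}≈1.96 used for 95% confidence intervals.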

References and links

1. Y. Frauel, T. Naughton, O. Matoba, E. Tahajuerce, and B. Javidi, “Three Dimensional Imaging and Display Using Computational Holographic Imaging,” Proc. IEEE 94, 636–654 (2006). [CrossRef]

2. B. Javidi and F. Okano, eds., Three Dimensional Television, Video, and Display Technologies (Springer, Berlin, 2002).

3. M. Levoy and P. Hanrahan, “Light field rendering,” Proc. ACM Siggraph, ACM Press, 31–42 (1996).

4. B. Javidi, S.-H. Hong, and O. Matoba, “Multi dimensional optical sensors and imaging systems,” Appl. Opt. 45, 2986–2994 (2006). [CrossRef] [PubMed]

5. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94, 591–607 (2006). [CrossRef]

6. H. Arimoto and B. Javidi, “Integral three-dimensional imaging with digital reconstruction,” Opt. Lett. 26, 157–159 (2001). [CrossRef]

7. A. Stern and B. Javidi, “3-D computational synthetic aperture integral imaging (COMPSAII),” Opt. Express 11, 2446–2451 (2003). [CrossRef] [PubMed]

8. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A 15, 2059–2065 (1998). [CrossRef]

9. M. G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” J. Phys. 7, 821–825 (1908).

10. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. 21, 171–176 (1931). [CrossRef]

11. J.-S. Jang and B. Javidi, “Three-dimensional synthetic aperture integral imaging,” Opt. Lett. 27, 1144–1146 (2002). [CrossRef]

12. T. Okoshi, “Three-dimensional displays,” Proc. IEEE 68, 548–564 (1980). [CrossRef]

13. Y. Igarishi, H. Murata, and M. Ueda, “3D display system using a computer-generated integral photograph,” Jpn. J. Appl. Phys. 17, 1683–1684 (1978). [CrossRef]

14. B. Tavakoli, M. Danesh Panah, B. Javidi, and E. Watson, “Performance of 3D integral imaging with position uncertainty,” Opt. Express 15, 11889–11902 (2007). [CrossRef] [PubMed]

15. B. Wilburn, N. Joshi, V. Vaish, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005).

16. M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, “Integral imaging with improved depth of field by use of amplitude modulated microlens array,” Appl. Opt. 43, 5806–5813 (2004). [CrossRef] [PubMed]

17. O. Matoba, E. Tajahuerce, and B. Javidi, “Real-time three-dimensional object recognition with multiple perspectives imaging,” Appl. Opt. 40, 3318–3325 (2001). [CrossRef]

18. Y. Frauel and B. Javidi, “Digital three-dimensional image correlation by use of computer-reconstructed integral imaging,” Appl. Opt. 41, 5488–5496 (2002). [CrossRef] [PubMed]

19. F. A. Sadjadi and A. Mahalanobis, “Target-adaptive polarimetric synthetic aperture radar target discrimination using maximum average correlation height filters,” Appl. Opt. 45, 3063–3070 (2006). [CrossRef] [PubMed]

20. L. Erdmann and K. J. Gabriel, “High resolution digital photography by use of a scanning microlens array,” Appl. Opt. 40, 5592–5599 (2001). [CrossRef]

21. K. Nitta, R. Shogenji, S. Miyatake, and J. Tanida, “Image reconstruction for thin observation module by bound optics by using the iterative backprojection method,” Appl. Opt. 45, 2893–2900 (2006). [CrossRef] [PubMed]

22. G. M. Morris, “Scene matching using photon-limited images,” J. Opt. Soc. Am. A 1, 482–488 (1984). [CrossRef]

23. G. M. Morris, “Image correlation at low light levels: a computer simulation,” Appl. Opt. 23, 3152–3159 (1984). [CrossRef] [PubMed]

24. E. Watson and G. M. Morris, “Comparison of infrared up conversion methods for photon-limited imaging,” J. Appl. Phys. 67, 6075–6084 (1990). [CrossRef]

25. E. Watson and G. M. Morris, “Imaging thermal objects with photon-counting detector,” Appl. Opt. 31, 4751–4757 (1992). [CrossRef] [PubMed]

26. D. Stucki, G. Ribordy, A. Stefanov, H. Zbinden, J. G. Rarity, and T. Wall, “Photon counting for quantum key distribution with Peltier cooled InGaAs/InP APDs,” J. Mod. Opt. 48, 1967–1981 (2001). [CrossRef]

27. P. A. Hiskett, G. S. Buller, A. Y. Loudon, J. M. Smith, I. Gontijo, A. C. Walker, P. D. Townsend, and M. J. Robertson, “Performance and design of InGaAs/InP photodiodes for single-photon counting at 1.55 µm,” Appl. Opt. 39, 6818–6829 (2000). [CrossRef]

28. L. Duraffourg, J.-M. Merolla, J.-P. Goedgebuer, N. Butterlin, and W. Rhods, “Photon Counting in the 1540-nm Wavelength Region with a Germanium Avalanche photodiode,” IEEE J. Quantum Electron. 37, 75–79 (2001). [CrossRef]

29. K. Lange and R. Carson, “EM reconstruction algorithms for emission and transmission tomography,” J. Comput. Assist. Tomogr. 8, 306–316 (1984).

30. M. Guillaume, P. Melon, and P. Refregier, “Maximum-likelihood estimation of an astronomical image from a sequence at low photon levels,” J. Opt. Soc. Am. A 15, 2841–2848 (1998). [CrossRef]

31. J. W. Goodman, Statistical Optics (John Wiley & Sons, Inc., 1985), Chap. 9.

32. E. Kolaczyk, “Bayesian multi-scale models for Poisson processes,” J. Amer. Stat. Assoc. 94, 920–933 (1999). [CrossRef]

33. S. Yeom, B. Javidi, and E. Watson, “Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging,” Opt. Express 15, 1513–1533 (2007). [CrossRef] [PubMed]

34. N. Mukhopadhyay, Probability and Statistical Inference (Marcel Dekker, Inc., New York, 2000).

35. N. Otsu, “A Threshold Selection Method from Gray-Level Histograms,” IEEE Trans. Syst. Man Cybern. 9, 62–66 (1979). [CrossRef]

33.

S. Yeom, B. Javidi, and E. Watson, “Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging,” Opt. Express 15, 1513–1533 (2007). [CrossRef] [PubMed]

34.

N. Mukhopadhyay, Probability and Statistical Inference (Marcel Dekker, Inc. New York, 2000).

35.

N. Otsu, “A Threshold Selection Method from Gray-Level Histograms,” IEEE Trans. Syst. Man. Cybern 9, 62–66 (1979). [CrossRef]

OCIS Codes
(030.5260) Coherence and statistical optics : Photon counting
(100.3010) Image processing : Image reconstruction techniques
(100.6890) Image processing : Three-dimensional image processing
(110.6880) Imaging systems : Three-dimensional image acquisition

ToC Category:
Image Processing

History
Original Manuscript: January 3, 2008
Revised Manuscript: March 8, 2008
Manuscript Accepted: March 9, 2008
Published: March 17, 2008

Virtual Issues
Vol. 3, Iss. 4 Virtual Journal for Biomedical Optics

Citation
Behnoosh Tavakoli, Bahram Javidi, and Edward Watson, "Three dimensional visualization by photon counting computational Integral Imaging," Opt. Express 16, 4426-4436 (2008)
http://www.opticsinfobase.org/vjbo/abstract.cfm?URI=oe-16-7-4426


References

  1. Y. Frauel, T. Naughton, O. Matoba, E. Tajahuerce, and B. Javidi, "Three Dimensional Imaging and Display Using Computational Holographic Imaging," Proc. IEEE 94, 636-654 (2006). [CrossRef]
  2. B. Javidi and F. Okano, eds., Three Dimensional Television, Video, and Display Technologies (Springer, Berlin, 2002).
  3. M. Levoy and P. Hanrahan, "Light field rendering," Proc. ACM SIGGRAPH, ACM Press, 31-42 (1996).
  4. B. Javidi, S.-H. Hong, and O. Matoba, "Multi dimensional optical sensors and imaging systems," Appl. Opt. 45, 2986-2994 (2006). [CrossRef] [PubMed]
  5. A. Stern and B. Javidi, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE 94, 591-607 (2006). [CrossRef]
  6. H. Arimoto and B. Javidi, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett. 26, 157-159 (2001). [CrossRef]
  7. A. Stern and B. Javidi, "3-D computational synthetic aperture integral imaging (COMPSAII)," Opt. Express 11, 2446-2451 (2003). [CrossRef] [PubMed]
  8. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, "Analysis of resolution limitation of integral photography," J. Opt. Soc. Am. A 15, 2059-2065 (1998). [CrossRef]
  9. M. G. Lippmann, "Epreuves reversibles donnant la sensation du relief," J. Phys. 7, 821-825 (1908).
  10. H. E. Ives, "Optical properties of a Lippmann lenticuled sheet," J. Opt. Soc. Am. 21, 171-176 (1931). [CrossRef]
  11. J.-S. Jang and B. Javidi, "Three-dimensional synthetic aperture integral imaging," Opt. Lett. 27, 1144-1146 (2002). [CrossRef]
  12. T. Okoshi, "Three-dimensional displays," Proc. IEEE 68, 548-564 (1980). [CrossRef]
  13. Y. Igarashi, H. Murata, and M. Ueda, "3D display system using a computer-generated integral photograph," Jpn. J. Appl. Phys. 17, 1683-1684 (1978). [CrossRef]
  14. B. Tavakoli, M. DaneshPanah, B. Javidi, and E. Watson, "Performance of 3D integral imaging with position uncertainty," Opt. Express 15, 11889-11902 (2007). [CrossRef] [PubMed]
  15. B. Wilburn, N. Joshi, V. Vaish, A. Barth, A. Adams, M. Horowitz, and M. Levoy, "High performance imaging using large camera arrays," ACM Trans. Graph. 24, 765-776 (2005).
  16. M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, "Integral imaging with improved depth of field by use of amplitude modulated microlens array," Appl. Opt. 43, 5806-5813 (2004). [CrossRef] [PubMed]
  17. O. Matoba, E. Tajahuerce, and B. Javidi, "Real-time three-dimensional object recognition with multiple perspectives imaging," Appl. Opt. 40, 3318-3325 (2001). [CrossRef]
  18. Y. Frauel and B. Javidi, "Digital three-dimensional image correlation by use of computer-reconstructed integral imaging," Appl. Opt. 41, 5488-5496 (2002). [CrossRef] [PubMed]
  19. F. A. Sadjadi and A. Mahalanobis, "Target-adaptive polarimetric synthetic aperture radar target discrimination using maximum average correlation height filters," Appl. Opt. 45, 3063-3070 (2006). [CrossRef] [PubMed]
  20. L. Erdmann and K. J. Gabriel, "High resolution digital photography by use of a scanning microlens array," Appl. Opt. 40, 5592-5599 (2001). [CrossRef]
  21. K. Nitta, R. Shogenji, S. Miyatake, and J. Tanida, "Image reconstruction for thin observation module by bound optics by using the iterative backprojection method," Appl. Opt. 45, 2893-2900 (2006). [CrossRef] [PubMed]
  22. G. M. Morris, "Scene matching using photon-limited images," J. Opt. Soc. Am. A 1, 482-488 (1984). [CrossRef]
  23. G. M. Morris, "Image correlation at low light levels: a computer simulation," Appl. Opt. 23, 3152-3159 (1984). [CrossRef] [PubMed]
  24. E. Watson and G. M. Morris, "Comparison of infrared upconversion methods for photon-limited imaging," J. Appl. Phys. 67, 6075-6084 (1990). [CrossRef]
  25. E. Watson and G. M. Morris, "Imaging thermal objects with photon-counting detectors," Appl. Opt. 31, 4751-4757 (1992). [CrossRef] [PubMed]
  26. D. Stucki, G. Ribordy, A. Stefanov, H. Zbinden, J. G. Rarity, and T. Wall, "Photon counting for quantum key distribution with Peltier cooled InGaAs/InP APDs," J. Mod. Opt. 48, 1967-1981 (2001). [CrossRef]
  27. P. A. Hiskett, G. S. Buller, A. Y. Loudon, J. M. Smith, I. Gontijo, A. C. Walker, P. D. Townsend, and M. J. Robertson, "Performance and design of InGaAs/InP photodiodes for single-photon counting at 1.55 μm," Appl. Opt. 39, 6818-6829 (2000). [CrossRef]
  28. L. Duraffourg, J.-M. Merolla, J.-P. Goedgebuer, N. Butterlin, and W. Rhodes, "Photon counting in the 1540-nm wavelength region with a germanium avalanche photodiode," IEEE J. Quantum Electron. 37, 75-79 (2001). [CrossRef]
  29. K. Lange and R. Carson, "EM reconstruction algorithms for emission and transmission tomography," J. Comput. Assist. Tomogr. 8, 306-316 (1984).
  30. M. Guillaume, P. Melon, and P. Refregier, "Maximum-likelihood estimation of an astronomical image from a sequence at low photon levels," J. Opt. Soc. Am. A 15, 2841-2848 (1998). [CrossRef]
  31. J. W. Goodman, Statistical Optics (John Wiley & Sons, Inc., 1985), Chap. 9.
  32. E. Kolaczyk, "Bayesian multi-scale models for Poisson processes," J. Am. Stat. Assoc. 94, 920-933 (1999). [CrossRef]
  33. S. Yeom, B. Javidi, and E. Watson, "Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging," Opt. Express 15, 1513-1533 (2007). [CrossRef] [PubMed]
  34. N. Mukhopadhyay, Probability and Statistical Inference (Marcel Dekker, Inc., New York, 2000).
  35. N. Otsu, "A Threshold Selection Method from Gray-Level Histograms," IEEE Trans. Syst. Man Cybern. 9, 62-66 (1979). [CrossRef]

Supplementary Material


» Media 1: AVI (442 KB)
» Media 2: AVI (613 KB)
