## Three-dimensional imaging and recognition of microorganism using single-exposure on-line (SEOL) digital holography

Optics Express, Vol. 13, Issue 12, pp. 4492-4506 (2005)

http://dx.doi.org/10.1364/OPEX.13.004492


### Abstract

We address three-dimensional (3D) visualization and recognition of microorganisms using single-exposure on-line (SEOL) digital holography. A coherent 3D microscope-based Mach-Zehnder interferometer records a single on-line Fresnel digital hologram of microorganisms. Three-dimensional microscopic images are reconstructed numerically at different depths by an inverse Fresnel transformation. For recognition, microbiological objects are segmented by processing the background diffraction field. Gabor-based wavelets extract feature vectors with multi-oriented and multi-scaled Gabor kernels. We apply a rigid graph matching (RGM) algorithm to localize predefined shape features of biological samples. Preliminary experimental and simulation results using sphacelaria alga and tribonema aequale alga microorganisms are presented. To the best of our knowledge, this is the first report on 3D visualization and recognition of microorganisms using on-line digital holography with single-exposure.

© 2005 Optical Society of America

## 1. Introduction


## 2. Single exposure on-line (SEOL) digital holography

The laser beam is expanded by use of a spatial filter and a collimating lens to provide spatial coherence. A beam splitter divides the expanded beam into object and reference beams. The object beam illuminates the microorganism sample, and the microscope objective produces a magnified image positioned at the image plane of the microscope [see Fig. 3]. The reference beam forms an on-axis interference pattern together with the light diffracted by the microorganism sample, which is recorded by the CCD camera. Our system uses no optical components for phase retardation in the reference beam, which the phase-shifting digital holography technique requires, and only a single exposure is recorded. In the following, we describe both on-axis phase-shifting digital holography and SEOL.


In on-axis phase-shifting digital holography, the intensity of the interference pattern recorded at the CCD is

I_p(x, y) = |A_H(x, y)|² + |A_R|² + 2A_R A_H(x, y)cos[Φ_H(x, y) − φ_R − Δφ_p],  (1)

where *A*_H(*x*, *y*) and Φ_H(*x*, *y*) are the amplitude and phase, respectively, of the Fresnel complex-amplitude distribution of the micro objects at the recording plane generated by the object beam; *A*_R is the amplitude of the reference distribution; *φ*_R denotes the constant phase of the reference beam; and Δ*φ*_p, where the subscript *p* is an integer from 1 to 4, denotes the four possible phase shifts required for on-axis phase-shifting digital holography. The desired biological object Fresnel wave function, with amplitude *A*_H(*x*, *y*) and phase Φ_H(*x*, *y*), can be obtained by use of the four interference patterns with the phase shifts Δ*φ*_p = 0, *π*/2, *π*, and 3*π*/2.

The double-exposure on-line technique uses 1) two interferograms with a *π*/2 phase difference, 2) the information about a reference beam, and 3) information about the diffracted biological object beam intensity. The complex amplitude of the microscopic 3D biological object wave at the hologram plane from the double-exposure method is represented by:

U_h(x, y) = A_H(x, y)exp[jΦ_H(x, y)],  (2)

where the two interferograms *H*_1(*x*, *y*) and *H*_2(*x*, *y*) can be obtained from Eq. (1). We assume that the recording between the two holograms is uniform and that the reference beam is a plane wave. The former assumption requires a stable recording environment and stationary objects.


In SEOL digital holography, only a single interferogram *H*_1(*x*, *y*) can be obtained from Eq. (1). The biological object wave together with its conjugate component is then

U_h′(x, y) = H_1(x, y) − |A_R|² − |A_H(x, y)|² = 2A_R A_H(x, y)cos[Φ_H(x, y) − φ_R].  (3)

To remove the DC terms in Eq. (3), the reference beam intensity |*A*_R|² is removed by only a one-time measurement in the experiment, and the object beam intensity |*A*_H(*x*, *y*)|² can be considerably reduced by use of signal processing (for example, an averaging technique). Even though SEOL digital holography inherently contains a conjugate image, we can utilize the conjugate image in the interferogram in recognition experiments since it carries information about the biological object. Thus, the 3D biological object wave function *U*_h′(*x*, *y*), including a conjugate component as in Eq. (3), can be obtained by use of SEOL digital holography. In this paper, we show that *U*_h′(*x*, *y*) in Eq. (3), obtained from a SEOL hologram, can be used for 3D biological object recognition and 3D image formation. The results will be compared with those of *U*_h(*x*, *y*) in Eq. (2), obtained by on-line phase-shifting holography, which requires multiple recordings.

The microscopic 3D biological object can be restored by Fresnel propagation of *U*_h′(*x*, *y*), which is the biological object wave information in the hologram plane. We can numerically reconstruct 3D section images on any plane perpendicular to the optical axis by computing the following Fresnel transformation with a 2D FFT algorithm:

U_o′(m′, n′) ∝ exp[jπ(m′²ΔX² + n′²ΔY²)/(λd)] FFT{U_h′(m, n)exp[jπ(m²Δx² + n²Δy²)/(λd)]},  (4)

where *U*_o′(*m′*, *n′*) and (Δ*X*, Δ*Y*) are the reconstructed complex amplitude distribution and the resolution at the reconstruction plane in the biological object beam, respectively; *U*_h′(*m*, *n*) and (Δ*x*, Δ*y*) are the object wave function including a conjugate component and the resolution at the hologram plane, respectively; *λ* is the wavelength; and *d* represents the distance between the image plane and the hologram plane.
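As a concrete illustration, the single-FFT Fresnel transformation above can be sketched in a few lines of numpy. The function name and the overall normalization are illustrative (not from the paper); the reconstruction-plane pitch follows the FFT sampling relation ΔX = λd/(N_mΔx).

```python
import numpy as np

def fresnel_reconstruct(u_h, wavelength, d, dx, dy):
    """Propagate the hologram-plane field u_h by distance d using the
    single-FFT discrete Fresnel transform (up to a constant factor)."""
    n_m, n_n = u_h.shape
    m = np.arange(n_m) - n_m // 2
    n = np.arange(n_n) - n_n // 2
    mm, nn = np.meshgrid(m, n, indexing="ij")
    # Quadratic phase factor applied at the hologram plane
    chirp_in = np.exp(1j * np.pi / (wavelength * d)
                      * ((mm * dx) ** 2 + (nn * dy) ** 2))
    # Reconstruction-plane pixel pitch implied by the FFT sampling relation
    dX = wavelength * d / (n_m * dx)
    dY = wavelength * d / (n_n * dy)
    chirp_out = np.exp(1j * np.pi / (wavelength * d)
                       * ((mm * dX) ** 2 + (nn * dY) ** 2))
    field = chirp_out * np.fft.fftshift(
        np.fft.fft2(np.fft.ifftshift(u_h * chirp_in)))
    return field, (dX, dY)
```

Refocusing to a different depth amounts to calling the function again with another value of `d`, which is how the section images at 180 mm and 190 mm in Section 6 are obtained.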

## 3. Segmentation

The segmented image (denoted by *o*) is defined as:

o(m, n) = 1 if o′(m, n) ≤ I_s, and o(m, n) = 0 otherwise,  (5)

where *o′*(*m*, *n*) is the intensity of the holographic image; and *m* and *n* are 2D discrete coordinates in the *x* and *y* directions, respectively. The threshold *I*_s is decided from the histogram analysis and the maximum intensity rate:

I_s = min{τ_κmin, r_max max[o′(m, n)]},  (6)

where *r*_max is the maximum intensity rate of coherent light after scattering by the microorganisms, and *κ*_min is the smallest level index *κ* that satisfies

Σ_{i=1}^{κ} h(τ_i) ≥ P_s N_T,  (7)

where *P*_s is a predetermined probability; *N*_T is the number of pixels; *h*(*τ*_i) is the histogram, i.e., the number of pixels whose intensity is between *τ*_{i−1} and *τ*_i; and *τ*_i is the *i*-th quantized intensity level. For the experiments, the total number of intensity levels is set at 256.

*P*_s and *r*_max can be decided according to prior knowledge of the spatial distribution and transmittance of the microorganisms.
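A minimal numpy sketch of this histogram rule follows. The function and parameter names are ours, and the binary convention assumes the microorganisms occupy the lower-intensity pixels, as stated in Section 6.2.

```python
import numpy as np

def segment(o_prime, p_s=0.25, r_max=0.45, levels=256):
    """Binary segmentation of a reconstructed intensity image o_prime:
    the threshold is the smaller of (i) the intensity level below which
    a fraction p_s of the pixels lies and (ii) r_max times the maximum
    intensity; pixels at or below the threshold are labeled 1."""
    n_t = o_prime.size
    hist, edges = np.histogram(o_prime, bins=levels,
                               range=(o_prime.min(), o_prime.max()))
    # kappa_min: smallest level index whose cumulative count reaches p_s * N_T
    kappa_min = np.searchsorted(np.cumsum(hist), p_s * n_t)
    i_s = min(edges[kappa_min + 1], r_max * o_prime.max())
    return (o_prime <= i_s).astype(np.uint8)
```

With `p_s=0.25` and `r_max=0.45` this reproduces the parameter choice used in the experiments of Section 6.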

## 4. Gabor-based wavelets and feature vector extraction

### 4.1 Gabor-based wavelets

The Gabor-based wavelet is defined by the kernel

g(x) = (‖k‖²/σ²)exp(−‖k‖²‖x‖²/2σ²)[exp(j k·x) − exp(−σ²/2)],  (8)

where **x** is a position vector, **k** is a wave number vector, and *σ* is the standard deviation of the Gaussian envelope; the term exp(−*σ*²/2) removes the DC response of the kernel. By changing the magnitude and direction of the vector **k**, we can scale and rotate the Gabor kernel to make self-similar forms.

We denote by *g*_uv(*m*, *n*) the discretized Gabor kernel at **k** = **k**_uv and **x** = (*m*, *n*), where *m* and *n* are discrete coordinates in 2D space in the *x* and *y* directions, respectively. Sampling of **k** is done as

k_uv = k_0u[cos ϕ_v, sin ϕ_v]^t,  k_0u = k_0/δ^{u−1},  ϕ_v = [(v − 1)/V]π,  (9)

for *u* = 1,…,*U* and *v* = 1,…,*V*, where *k*_0u is the magnitude of the wave number vector; *ϕ*_v is the azimuth angle of the wave number vector; *k*_0 is the maximum carrier frequency of the Gabor kernels; *δ* is the spacing factor in the frequency domain; *u* and *v* are the indexes of the Gabor kernels; *U* and *V* are the total numbers of decompositions along the radial and tangential axes, respectively; and *t* stands for the matrix transpose.


The Gabor-based wavelets extract local features with high-frequency-bandwidth (small *u*) kernels and global features with low-frequency-bandwidth (large *u*) kernels. It is noted that the Gabor-based wavelet has a strong response to edges when the wave number vector **k** is perpendicular to the direction of the edges.
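To make the sampling of Eq. (9) concrete, the following numpy sketch builds one kernel *g*_uv in a common DC-free Gabor-wavelet form; the window size and exact normalization are our choices, not the paper's.

```python
import numpy as np

def gabor_kernel(u, v, size=32, k0=np.pi / 2, delta=2 * np.sqrt(2),
                 sigma=np.pi, V=6):
    """Multi-scale, multi-orientation Gabor kernel g_uv with the sampling
    k_uv = (k0/delta**(u-1)) * [cos(phi_v), sin(phi_v)],
    phi_v = (v-1)*pi/V."""
    k = k0 / delta ** (u - 1)
    phi = (v - 1) * np.pi / V
    kx, ky = k * np.cos(phi), k * np.sin(phi)
    half = size // 2
    x, y = np.meshgrid(np.arange(-half, half), np.arange(-half, half),
                       indexing="ij")
    envelope = (k ** 2 / sigma ** 2) * np.exp(
        -k ** 2 * (x ** 2 + y ** 2) / (2 * sigma ** 2))
    # Subtracting exp(-sigma^2/2) makes the kernel (approximately) DC-free
    return envelope * (np.exp(1j * (kx * x + ky * y))
                       - np.exp(-sigma ** 2 / 2))
```

Increasing *u* shrinks the carrier frequency by the factor *δ*, so small-*u* kernels respond to fine, local structure and large-*u* kernels to coarse, global structure, as described above.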

### 4.2 Feature vector extraction

Let *h*_uv(*m*, *n*) be the filtered output of the image *o*(*m*, *n*) after it is convolved with the Gabor kernel *g*_uv(*m*, *n*):

h_uv(m, n) = Σ_{m′=1}^{N_m} Σ_{n′=1}^{N_n} o(m′, n′)g_uv(m − m′, n − n′),  (10)

where *o*(*m*, *n*) is the normalized image between 0 and 1 after the segmentation; and *N*_m and *N*_n are the sizes of the reconstructed images in the *x* and *y* directions, respectively. *h*_uv(*m*, *n*) is also called the “Gabor coefficient.”

The feature vector at position (*m*, *n*) is composed of a set of the Gabor coefficients and the segmented image. The rotation-invariant property can be achieved simply by adding up all the Gabor coefficients along the tangential axes in the frequency domain. Therefore, we define a rotation-invariant feature vector **v** as:

v(m, n) = [o(m, n), Σ_{v=1}^{V} h_1v(m, n), …, Σ_{v=1}^{V} h_Uv(m, n)]^t,  (11)

so the dimension of **v** is *U* + 1. In the experiments, we use only the real parts of the feature vector since they are more suitable for recognizing filamentous structures. There is no optimal way to choose the parameters of the Gabor kernels, but several values are widely used heuristically depending on the application. The parameters are set at *σ* = *π*, *k*_0 = *π*/2, *δ* = 2√2, *U* = 3, and *V* = 6 in this paper.
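Putting the convolution and the tangential summation together, a numpy sketch of the rotation-invariant feature map follows; the helper names are ours, and the convolution is done with zero-padded FFTs.

```python
import numpy as np

def _gabor(u, v, size, k0, delta, sigma, V):
    # Discretized Gabor kernel g_uv, sampled as in Eq. (9)
    k = k0 / delta ** (u - 1)
    phi = (v - 1) * np.pi / V
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax, indexing="ij")
    env = (k ** 2 / sigma ** 2) * np.exp(
        -k ** 2 * (x ** 2 + y ** 2) / (2 * sigma ** 2))
    return env * (np.exp(1j * k * (np.cos(phi) * x + np.sin(phi) * y))
                  - np.exp(-sigma ** 2 / 2))

def _conv_same(img, ker):
    # Linear convolution via zero-padded FFTs, cropped to img.shape
    s = (img.shape[0] + ker.shape[0] - 1, img.shape[1] + ker.shape[1] - 1)
    out = np.fft.ifft2(np.fft.fft2(img, s) * np.fft.fft2(ker, s))
    r0, r1 = ker.shape[0] // 2, ker.shape[1] // 2
    return out[r0:r0 + img.shape[0], r1:r1 + img.shape[1]]

def feature_map(o, U=3, V=6, k0=np.pi / 2, delta=2 * np.sqrt(2),
                sigma=np.pi, size=32):
    """Per-pixel feature vectors: segmented image value plus, for each
    scale u, the real Gabor coefficients summed over all orientations v
    (rotation invariance); dimension U + 1."""
    feats = [o.astype(float)]
    for u in range(1, U + 1):
        feats.append(sum(
            _conv_same(o, _gabor(u, v, size, k0, delta, sigma, V)).real
            for v in range(1, V + 1)))
    return np.stack(feats, axis=-1)
```

The summation over *v* is what removes the orientation dependence: rotating the input permutes (approximately) the responses across the *V* orientations, leaving their sum unchanged.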

## 5. Rigid Graph Matching (RGM)


Let *R* and *S* be two identical and rigid graphs placed on the reference object (o_r) and the unknown input image (o_s), respectively. The location of the reference graph *R* is pre-determined by the translation vector **p**_r and the clockwise rotation angle *θ*_r. A position vector of the node *k* in the graph *R* is:

x_r^k = p_r + A(θ_r)x_0^k,  k = 1, …, K,  (12)

where **x**_0^k is the vector between the node *k* and the center of the graph, which is located at the origin without rotation; *K* is the total number of nodes in the graph; and **A** is a rotation matrix.

While the graph *R* covers a designated shape representing a characteristic of the reference microorganism, we search for the similar local shape by translating and rotating the graph *S* on unknown input images. We describe any rigid motion of the graph *S* by a translation vector **p** and a clockwise rotation angle *θ*:

x_s^k(θ, p) = p + A(θ)x_0^k,  k = 1, …, K,  (13)

where **x**_s^k(*θ*, **p**) is the position vector of the node *k* in the graph *S*. The transformation in Eq. (13) allows robustness in the detection of rotated and shifted reference objects.

The similarity function between the graphs *R* and *S* is defined as:

Γ(θ, p) = (1/K) Σ_{k=1}^{K} 〈v[x_r^k], v[x_s^k(θ, p)]〉 / (‖v[x_r^k]‖ ‖v[x_s^k(θ, p)]‖),  (14)

where **v**[**x**_r^k] and **v**[**x**_s^k(*θ*, **p**)] are the feature vectors defined at the node positions **x**_r^k and **x**_s^k(*θ*, **p**), respectively.

Similarly, we define the difference cost between the graphs *R* and *S* as:

C(θ, p) = (1/K) Σ_{k=1}^{K} ‖v[x_r^k] − v[x_s^k(θ, p)]‖,  (15)

with the graph *R* placed on the reference image o_r and the graph *S* placed on the input image o_s. The graph *R* covers a fixed region in the reference images, and the local shape covered by the graph *S* is identified with the reference shape covered by the graph *R* if the following two conditions are satisfied:

Γ_ĵ(θ̂_ĵ, p) ≥ α_Γ and C_ĵ(θ̂_ĵ, p) ≤ α_C,  (16)

where *ĵ* is the index of the reference image which produces the maximum similarity between the graph *R* and the graph *S* with the translation vector **p** and the rotation angle θ̂_j; *α*_Γ and *α*_C are thresholds for the similarity function and the difference cost, respectively; and θ̂_j is obtained by searching for the best matching angle that maximizes the similarity function:

θ̂_j(p) = arg max_θ Γ_j(θ, p).  (17)
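The graph-matching search described above can be sketched as follows. This is a simplified single-reference version; the function names, the clockwise rotation convention, and the nearest-neighbor sampling of node positions are our assumptions.

```python
import numpy as np

def rgm_search(feat_ref, feat_in, offsets, p_r, theta_r, step=3,
               angles=np.deg2rad(np.arange(0.0, 180.0, 7.5))):
    """Rigid graph matching: slide and rotate a rigid node graph over the
    input feature map, returning the pose that maximizes the mean
    normalized inner product of node feature vectors (the similarity)."""
    def rot(t):  # clockwise rotation matrix A(theta)
        c, s = np.cos(t), np.sin(t)
        return np.array([[c, s], [-s, c]])

    def node_feats(feat, p, theta):
        # Rigid node positions with nearest-neighbor rounding
        pos = np.rint(np.asarray(p, float) + offsets @ rot(theta).T).astype(int)
        pos[:, 0] = np.clip(pos[:, 0], 0, feat.shape[0] - 1)
        pos[:, 1] = np.clip(pos[:, 1], 0, feat.shape[1] - 1)
        return feat[pos[:, 0], pos[:, 1]]

    v_r = node_feats(feat_ref, p_r, theta_r)
    best = (-np.inf, None, None)
    for i in range(0, feat_in.shape[0], step):
        for j in range(0, feat_in.shape[1], step):
            for th in angles:
                v_s = node_feats(feat_in, (i, j), th)
                num = (v_r * v_s).sum(axis=1)
                den = (np.linalg.norm(v_r, axis=1)
                       * np.linalg.norm(v_s, axis=1) + 1e-12)
                gamma = (num / den).mean()  # similarity over the K nodes
                if gamma > best[0]:
                    best = (gamma, (i, j), th)
    return best  # (similarity, translation p, angle theta)
```

The `step=3` translation stride and the 7.5° angular step mirror the search parameters used in the experiments of Section 6.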

## 6. Experiments and simulation results

### 6.1 3D imaging with SEOL digital holography

The hologram is recorded by a CCD camera with a pixel size of 9 *µm* × 9 *µm*. The microorganisms are sandwiched between two transparent cover slips; the diameter of the samples is around 10–50 *µm*. We generate two holograms for the alga samples. The microscopic 3D biological object was placed at a distance of 500 *mm* from the CCD array as shown in Fig. 2. The results of the reconstructed images from the holograms of the alga samples are shown in Fig. 4. Figures 4(a) and (b) show sphacelaria’s 2D image and the digital hologram recorded by the SEOL digital holography technique, respectively. Figures 4(c) and (d) are sphacelaria’s reconstructed images from the blurred digital hologram at distances of *d* = 180 *mm* and 190 *mm*, respectively, using SEOL digital holography. Figure 4(e) shows sphacelaria’s reconstructed image at distance *d* = 180 *mm* using phase-shifting on-line digital holography with two interferograms, and Fig. 4(f) is tribonema aequale’s reconstructed image at distance *d* = 180 *mm* using SEOL digital holography. In the experiments, we use a weak reference beam, so the conjugate image overlaps the original image. As shown in Fig. 4, we obtained the sharpest reconstructions at distances *d* between 180 *mm* and 190 *mm* for both holographic methods. The reconstruction results indicate that we obtain a focused image by use of SEOL digital holography as well as by phase-shifting digital holography. We will show that SEOL digital holography may be a useful method for 3D biological object recognition, because the conjugate image in the hologram contains information about the 3D biological object. In addition, SEOL digital holography can be performed without stringent environmental stability requirements.

### 6.2 3D Microorganism reconstruction and feature extraction

The sphacelaria samples are reconstructed at the depths giving the sharpest images: A4–A6 are reconstructed at 200 *mm*, and A7 and A8 are reconstructed at 300 *mm*; all samples of tribonema aequale (B1–B8) are reconstructed at 180 *mm* for the sharpest images.

The predetermined probability *P*_s and the maximum intensity rate *r*_max for the segmentation are set at 0.25 and 0.45, respectively; that is, we assume that microorganisms occupy less than 25% of the lower-intensity region and that the intensity of the microorganisms is less than 45% of the background diffraction field. Figures 5(a) and (b) show the reconstructed and segmented images of a sphacelaria sample (A1), respectively. Figures 5(c)–(e) show the real parts of the Gabor coefficients of Section 4.2 for *u* = 1, 2, and 3.

### 6.3 Recognition of sphacelaria alga

The nodes of the reference graph are sampled in the *x* and *y* directions so that the total number of nodes in the graph is 75. The reference graph *R* is located in sample A1 with **p**_r = [81, 75]^t and *θ*_r = 135°, as shown in Fig. 6(a). To utilize the depth information, 4 reference images are used; they are reconstructed at *d* = 170, 180, 190, and 200 *mm*, respectively. The thresholds *α*_Γ and *α*_C are set at 0.65 and 1, respectively; they are selected heuristically to produce better results.

The graph *S* is translated by every 3 pixels in the *x* and *y* directions for measuring its similarity to and difference from the graph *R*. To search for the best matching angles, the graph *S* is rotated by 7.5° from 0° to 180° at every translated location. When the positions of rotated nodes are not integers, they are replaced with the nearest neighbor nodes.

### 6.4 Recognition of tribonema aequale alga

The nodes of the reference graph are spaced 8 pixels apart in the *y* direction, and the total number of nodes in the graph is 60. The reference graph *R* is located in sample B1 with **p**_r = [142, 171]^t and *θ*_r = 90°, as shown in Fig. 7(a). Four reference images are used, reconstructed at *d* = 170, 180, 190, and 200 *mm*, respectively. The thresholds *α*_Γ and *α*_C are set at 0.8 and 0.65, respectively.

The computational complexity of the numerical reconstruction by the 2D FFT is O(*N* log₂ *N*), where *N* is the total number of pixels in the holographic image. For the graph matching, the computational time depends on the shape and size of the graph, the dimension of the feature vector, and the search steps for the translation vector and the rotation angle. Since the largest cost is incurred by searching over the translation vector, which scales as *N*², the proposed system requires quadratic computational complexity; real-time processing can therefore be achieved by developing parallel processing. Real-time operation is also favored because SEOL holography requires only a single exposure; thus, with high-speed electronics, real-time detection is possible. This would not be possible with phase-shifting holography, which requires multiple exposures.

## 7. Conclusion

## References and links

1. A. Mahalanobis, R. R. Muise, S. R. Stanfill, and A. V. Nevel, “Design and application of quadratic correlation filters for target detection,” IEEE Trans. Aerosp. Electron. Syst. **40**, 837–850 (2004).
2. F. A. Sadjadi, “Infrared target detection with probability density functions of wavelet transform subbands,” Appl. Opt. **43**, 315–323 (2004).
3. H. Sjoberg, F. Goudail, and P. Refregier, “Optimal algorithms for target location in nonhomogeneous binary images,” J. Opt. Soc. Am. A **15**, 2976–2985 (1998).
4. B. Javidi, ed., *Image Recognition and Classification: Algorithms, Systems, and Applications* (Marcel Dekker, New York, 2002).
5. B. Javidi and E. Tajahuerce, “Three dimensional object recognition using digital holography,” Opt. Lett. **25**, 610–612 (2000).
6. O. Matoba, T. J. Naughton, Y. Frauel, N. Bertaux, and B. Javidi, “Real-time three-dimensional object reconstruction by use of a phase-encoded digital hologram,” Appl. Opt. **41**, 6187–6192 (2002).
7. F. Sadjadi, “Improved target classification using optimum polarimetric SAR signatures,” IEEE Trans. Aerosp. Electron. Syst. **38**, 38–49 (2002).
8. B. Javidi and F. Okano, eds., *Three-Dimensional Television, Video, and Display Technologies* (Springer, New York, 2002).
9. J. W. Goodman and R. W. Lawrence, “Digital image formation from electronically detected holograms,” Appl. Phys. Lett. **11**, 77–79 (1967).
10. T. M. Kreis and W. P. O. Juptner, “Suppression of the dc term in digital holography,” Opt. Eng. **36**, 2357–2360 (1997).
11. G. Pedrini and H. J. Tiziani, “Short-coherence digital microscopy by use of a lensless holographic imaging system,” Appl. Opt. **41**, 4489–4496 (2002).
12. T. Zhang and I. Yamaguchi, “Three-dimensional microscopy with phase-shifting digital holography,” Opt. Lett. **23**, 1221–1223 (1998).
13. A. Stadelmaier and J. H. Massig, “Compensation of lens aberrations in digital holography,” Opt. Lett. **25**, 1630–1632 (2000).
14. J. W. Lengeler, G. Drews, and H. G. Schlegel, *Biology of the Prokaryotes* (Blackwell Science, New York, 1999).
15. M. G. Forero, F. Sroubek, and G. Cristobal, “Identification of tuberculosis bacteria based on shape and color,” Real-Time Imaging **10**, 251–262 (2004).
16. J. Alvarez-Borrego, R. R. Mourino-Perez, G. Cristobal-Perez, and J. L. Pech-Pacheco, “Invariant recognition of polychromatic images of Vibrio cholerae 01,” Opt. Eng. **41**, 827–833 (2002).
17. A. L. Amaral, M. da Motta, M. N. Pons, H. Vivier, N. Roche, M. Mota, and E. C. Ferreira, “Survey of protozoa and metazoa populations in wastewater treatment plants by image analysis and discriminant analysis,” Environmetrics **15**, 381–390 (2004).
18. S.-K. Treskatis, V. Orgeldinger, H. Wolf, and E. D. Gilles, “Morphological characterization of filamentous microorganisms in submerged cultures by on-line digital image analysis and pattern recognition,” Biotechnol. Bioeng. **53**, 191–201 (1997).
19. T. Luo, K. Kramer, D. B. Goldgof, L. O. Hall, S. Samson, A. Remsen, and T. Hopkins, “Recognizing plankton images from the shadow image particle profiling evaluation recorder,” IEEE Trans. Syst., Man, Cybern. B **34**, 1753–1762 (2004).
20. J. M. S. Cabral, M. Mota, and J. Tramper, eds., *Multiphase Bioreactor Design* (Taylor & Francis, London, 2001), Chap. 2.
21. B. Javidi and D. Kim, “Three-dimensional-object recognition by use of single-exposure on-axis digital holography,” Opt. Lett. **30**, 236–238 (2005).
22. D. Kim and B. Javidi, “Distortion-tolerant 3-D object recognition by using single exposure on-axis digital holography,” Opt. Express **12**, 5539–5548 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-22-5539.
23. J. G. Daugman, “Uncertainty relation for resolution in space, spatial frequency, and orientation optimized by two-dimensional visual cortical filters,” J. Opt. Soc. Am. A **2**, 1160–1169 (1985).
24. T. S. Lee, “Image representation using 2D Gabor wavelets,” IEEE Trans. Pattern Anal. Mach. Intell. **18**, 959–971 (1996).
25. M. Lades, J. C. Vorbruggen, J. Buhmann, J. Lange, C. v.d. Malsburg, R. P. Wurtz, and W. Konen, “Distortion invariant object recognition in the dynamic link architecture,” IEEE Trans. Comput. **42**, 300–311 (1993).
26. S. Yeom and B. Javidi, “Three-dimensional object feature extraction and classification with computational holographic imaging,” Appl. Opt. **43**, 442–451 (2004).

**OCIS Codes**

(070.2580) Fourier optics and signal processing : Paraxial wave optics

(090.2880) Holography : Holographic interferometry

(100.5010) Image processing : Pattern recognition

(100.6890) Image processing : Three-dimensional image processing

**ToC Category:**

Research Papers

**History**

Original Manuscript: April 21, 2005

Revised Manuscript: May 23, 2005

Published: June 13, 2005

**Citation**

Bahram Javidi, Inkyu Moon, Seokwon Yeom, and Edward Carapezza, "Three-dimensional imaging and recognition of microorganism using single-exposure on-line (SEOL) digital holography," Opt. Express **13**, 4492-4506 (2005)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-13-12-4492


