## Size and shape recognition using measurement statistics and random 3D reference structures

Optics Express, Vol. 11, Issue 20, pp. 2606-2618 (2003)

http://dx.doi.org/10.1364/OE.11.002606


### Abstract

Three-dimensional (3D) reference structures segment source spaces based on whether particular source locations are visible or invisible to the sensor. A lensless 3D reference structure based imaging system measures projections of this source space on a sensor array. We derive and experimentally verify a model to predict the statistics of the measured projections for a simple 2D object. We show that the statistics of the measurement can yield an accurate estimate of the size of the object without ever forming a physical image. Further, we conjecture that the measured statistics can be used to determine the shape of 3D objects and present preliminary experimental measurements for 3D shape recognition.

© 2003 Optical Society of America

## 1. Introduction


## 2. Size recognition using measurement statistics

Consider a source point located at **r** and a measurement point located at **r**_*m*. For a given reference structure, one can associate a visibility function *v*(**r**, **r**_*m*). For opaque obscurants, the visibility function is binary valued, depending on whether the source located at **r** is visible or invisible to the measurement point **r**_*m*. Thus, the visibility function imposed by the reference structure modulates the measurement by segmenting the source. For a measurement point **r**_*m*, the value of the measurement *m* can be obtained by integrating over this segmented source space:

$$ m = \int s(\mathbf{r})\, v(\mathbf{r}, \mathbf{r}_m)\, d\mathbf{r} \tag{1} $$

where *s*(**r**) is the density function of sources over the source space. We can discretize (1) by assuming that the source space is segmented into non-overlapping cells [14]:

$$ m = \sum_i v_{m,i}\, s_i \tag{2} $$

where *i* is a dummy variable that spans all the possible source cells and *v*_{m,i} represents the visibility of source *s*_i to the measurement point **r**_*m*. If there are multiple measurement points *j* = 1…*M* available, (2) can be written as

$$ m_j = \sum_i v_{j,i}\, s_i, \qquad j = 1 \ldots M \tag{3} $$

where *v*_{j,i} represents the visibility of the *i*th source cell to the *j*th measurement point. The above analysis applies to a deterministic reference structure for which the exact locations of the obscurants are known [13]. For a random reference structure, however, it is impractical to determine *v* for every source–sensor pair. Instead, we can efficiently determine the nature of the object from the statistics of the measurements. We associate with each source cell and measurement pair a probability *p*_{m,i} that the *i*th source is visible to the sensor located at **r**_*m*. The expected value of the measurement, ⟨*m*⟩, is then given by

$$ \langle m \rangle = \sum_i p_{m,i}\, s_i \tag{4} $$

The probability *p*_{m,i} can be determined from the nature of the reference structure. We will assume that the obscurants are uniformly and randomly distributed within the reference structure. Consider a reference structure that occupies a volume *V*, with small obscurants of average volume *v̄* dispersed throughout the entire reference structure volume. The maximum number of obscurants *N* that can be accommodated within the reference structure volume is therefore

$$ N = \frac{V}{\bar{v}} \tag{5} $$

Only a fraction *ψ* of the entire volume is actually filled with obscurants, so the number of obscurants present in the reference structure is *ψN*. For the purposes of this paper, we use *ψ* = 0.1 for both theory and experiment. Now consider Fig. 2. The space enclosed by the lines joining the edges of source cell (1) to the sensor point intercepts a certain volume *σ*_{m,(1)} of the reference structure. The number of obscurants that could possibly be accommodated within this volume is

$$ K = \frac{\sigma_{m,(1)}}{\bar{v}} \tag{6} $$

The probability *p*_{m,(1)} can be determined as the answer to the question: given that only *ψN* of the *N* available locations within the reference structure actually contain obscurants, what is the probability that *K* randomly selected locations contain no obscurants? This probability can easily be determined for each source cell and sensor pair, in a manner similar to determining the probability in a Bernoulli distribution. However, it is not necessary to do so. From the question posed above, it is clear that the probability changes only if *ψ* or *K* changes. *ψ* is fixed for an imaging system, and from (6) we know that *K* depends only on *σ*. Thus, if *σ* remains constant, the probability also stays unchanged. This observation implies that closely spaced source cells have approximately equal probabilities of being visible to the same sensor. For a random reference structure, we define the uniform field of view (UFOV) of a single sensor as the angular range over which *σ* changes by less than 1%. On this basis, we calculate the UFOV of a single sensor to be *ϕ* ≈ 16.14° for *ψ* = 0.1. For an object located within the UFOV of the sensor, (4) can again be simplified by assuming that *p*_{m,i} ≡ *p*_m remains constant over the object, yielding

$$ \langle m \rangle = p_m \sum_i s_i \tag{7} $$

Similarly, the *n*th-order statistics of the measurement are

$$ \langle m^n \rangle = p_m \Bigl( \sum_i s_i \Bigr)^{n} \tag{8} $$

where ⟨*m*^n⟩ is the *n*th moment of the measurement located at **r**_*m*. Consider a continuous, self-luminous 2D object: if the luminous flux Φ of the object is constant, the intensity emitted by the object is a function of only the object's area *A*, i.e. (Σ_i *s*_i)^n = Φ^n *A*^n, yielding

$$ \langle m^n \rangle = c_n A^n \tag{9} $$

where *c*_n is a proportionality constant. In other words, the *n*th-order statistic of the measurement varies directly as the *n*th power of the object area for a simple 2D object. This analysis also applies to a system in which the visibility of the individual sensors is unobscured (i.e., without a reference structure between the source and the sensors); in that case, however, *p*_m = 1 and the variance ⟨*m*²⟩ − ⟨*m*⟩² = 0. This scaling of ⟨*m*^n⟩ as *A*^n differs substantially from focal systems: for a focal sensor, ⟨*m*^n⟩ is linear in *A*, or can be derived for specific illumination cases [18]. By determining *p*_m, a reference structure imposes particular statistics on the image intensity. This report experimentally demonstrates the dependence of the moments of *m*, and of its variance, on *A* for reference-structure-based sensors.

## 3. Experimental verification of size recognition using reference structure tomography

The analysis in section 2 associated with each source cell–sensor pair an intercepted reference-structure volume *σ*. However, instead of looking back at the source space from the sensor, we can look forward from the source space into the measurement space. In a method similar to that described in section 2, we can calculate the UFOV of each source cell, over which the probability of visibility does not change. If we place multiple sensors within this UFOV, we can acquire multiple measurements in one shot without sampling over an ensemble of many random reference structures. In other words, the measurement process is ergodic: an expectation over an ensemble of reference structures is the same as an expectation over an ensemble of many sensors, provided the sensors lie within the UFOV of the source object. Based on the above discussion, we used an experimental setup similar to the schematic shown in Fig. 3. The object was the output of an incandescent fiber light source. An iris was placed in front of the end of the fiber to alter the diameter of the object and thus change its size. The object was placed at a distance of 61 cm from the CCD camera, a 1320×1040-pixel cooled scientific camera manufactured by Roper Scientific Instruments. The pixels of the camera provided the sensors for acquiring the multiple measurements. The random reference structure was created by printing opaque dots on a slide transparency with a 10% fill factor; it was made 3D by folding several layers of transparency on top of each other. The experimental reference structure had opaque dots of average diameter 100 *µ*m, with 16 layers, each 100 *µ*m thick, folded on top of each other. The reference structure was located 2 cm from the pixel array of the camera. We took measurements for objects of different sizes, both with and without a reference structure present. Figure 4 shows the measurements on the CCD with and without a reference structure. Notice that the measurement with the reference structure has regions of high and low brightness corresponding to the visibility imposed by the obscurants, whereas the measurement without the reference structure shows no such structure.

The histograms of the measured pixel values differ for objects of different sizes, *i.e.* the shape of the histogram changes as the object size changes. Figure 6 shows the higher moments of the distributions as a function of object area. From the figure, we see that each moment fits well to a polynomial of the corresponding order. Further, Fig. 7 plots the normalized measurement variance versus 2D object area for the RST system; we see that the variance increases quadratically. The higher central moments (skewness, kurtosis, etc.) show similar behavior, and hence can be used to obtain an accurate representation of the object size, consistent with the theoretical predictions.
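The predicted scaling, first moment linear in *A* and variance quadratic in *A*, can be checked with a toy Monte Carlo of the constant-visibility model, in which every sensor inside the UFOV sees either the whole object flux or nothing. The values of *p*, Φ, and the sensor count below are illustrative choices, not the experimental parameters:

```python
import random

def simulate_measurements(area, p=0.5, flux=1.0, sensors=200_000, seed=0):
    """Constant-visibility model inside the UFOV: each sensor sees the
    full object intensity flux*area with probability p, else nothing."""
    rng = random.Random(seed)
    return [flux * area if rng.random() < p else 0.0 for _ in range(sensors)]

def moment(xs, n):
    """n-th raw moment <m^n> estimated over the sensor ensemble."""
    return sum(x ** n for x in xs) / len(xs)

for area in (1.0, 2.0, 4.0):
    m = simulate_measurements(area)
    mean = moment(m, 1)
    var = moment(m, 2) - mean ** 2
    print(f"A={area}: <m>={mean:.3f}, var={var:.3f}")
# Doubling A doubles <m> and quadruples the variance, i.e. var ~ A^2.
```

Because the same seed is reused, each run differs only in the overall area factor, so the moment ratios between object sizes come out exactly as *A*^n, mirroring Eq. (9) without sampling noise.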

## 4. Shape recognition using measurement statistics

Suppose that, in addition, we measure the statistical size signature of an object from many directions, *i.e.* we obtain a function *A* = *A*(*θ*) corresponding to the extent of the object in the direction specified by *θ*. Based on this information *A*(*θ*), it is possible to obtain the convex hull of the object shape by using the shadow-backprojection algorithm described below. For simplicity we discuss the reconstruction of a 2D convex hull based on line projections; the approach is identical for reconstructing a 3D shape using area projections. The 2D statistical RST system returns a function *l*(*θ*) that describes the length of the projection of the object for a particular direction *θ*. To reconstruct the 2D object, we start with any two projections, say *l*(*θ* = 0) and *l*(*θ* = *π*/2). Based on the geometry of the projections, the object is constrained to lie inside the region these two projections specify. More generally, for each direction *θ* with projection length *l*(*θ*), we can show that the innermost allowable edges of the 2D object inclined at *θ* are the two supporting lines of a strip of width *l*(*θ*) oriented along the projection direction; intersecting these strips over all measured directions produces the convex hull. We can thus tabulate *l*(*θ*) together with the corresponding convex hull of the shape, and identify the shape's convex hull based on the statistical signature *l*(*θ*) alone. The statistical signature *l*(*θ*) therefore identifies the convex hull of a particular object, and this can be used for RST-based shape recognition. We note that the convex hull does not uniquely determine the object shape. Consequently, we plan to use correlations between the measured pixel values of the projections and to relate these correlations to a particular object shape; research in this area is ongoing. Object analysis in sensor systems has conventionally been implemented in image post-processing, although optical processing based on field correlation has also been implemented [19–22].
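The shadow-backprojection idea can be sketched numerically: each projection angle contributes a strip that must contain the object, and intersecting the strips gives an outer estimate of the convex hull that tightens as angles are added. The sketch below assumes the strip endpoints, not just the length *l*(*θ*), are known (i.e., the shadows are registered to a common origin); the triangle is a hypothetical test object:

```python
import math

# Hypothetical test object: a triangle given by its vertices.
verts = [(0.0, 0.0), (2.0, 0.0), (0.5, 1.5)]

def projection_interval(points, theta):
    """Min and max of the shadow of `points` along direction theta."""
    u = (math.cos(theta), math.sin(theta))
    dots = [p[0] * u[0] + p[1] * u[1] for p in points]
    return min(dots), max(dots)

# 18 projection directions, 10 degrees apart.
thetas = [k * math.pi / 18 for k in range(18)]
strips = [(t, *projection_interval(verts, t)) for t in thetas]

def in_backprojection(x, y):
    """Point lies in the intersection of all back-projected strips --
    an outer estimate of the convex hull that tightens with more angles."""
    return all(lo - 1e-9 <= x * math.cos(t) + y * math.sin(t) <= hi + 1e-9
               for t, lo, hi in strips)

print(in_backprojection(0.5, 0.5))   # True: inside the triangle
print(in_backprojection(1.8, 1.4))   # False: outside the convex hull
```

A real RST measurement supplies only the width hi − lo = *l*(*θ*); recovering registered endpoints would require an additional assumption such as a known object centroid, which is why this sketch is an idealization of the shadow-backprojection step.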


## 5. Conclusions and Future work


## Acknowledgment

## References and links

1. D.J. Brady and Z.U. Rahman, "Integrated analysis and design of analog and digital processing in imaging systems: introduction to feature issue," Appl. Opt. **41**, 6049 (2002).

2. W.T. Cathey and E.R. Dowski, "New paradigm for imaging systems," Appl. Opt. **41**, 6080–6092 (2002).

3. D.L. Marks, R.A. Stack, D.J. Brady, D.C. Munson, and R.B. Brady, "Visible cone-beam tomography with a lensless interferometric camera," Science **284**, 1561–1564 (1999).

4. G. Barbastathis and D.J. Brady, "Multidimensional tomographic imaging using volume holography," Proc. IEEE **87**, 2098–2120 (1999).

5. M.R. Descour, C.E. Volin, E.L. Dereniak, T.M. Gleeson, M.F. Hopkins, D.W. Wilson, and P.D. Maker, "Demonstration of a computed-tomography imaging spectrometer using a computer-generated hologram disperser," Appl. Opt. **36**, 3694–3698 (1997).

6. E.R. Dowski and W.T. Cathey, "Extended depth of field through wave-front coding," Appl. Opt. **34**, 1859–1866 (1995).

7. T.M. Cannon and E.E. Fenimore, "Tomographical imaging using uniformly redundant arrays," Appl. Opt. **18**, 1052–1057 (1979).

8. E.E. Fenimore, "Coded aperture imaging: predicted performance of uniformly redundant arrays," Appl. Opt. **17**, 3562–3570 (1978).

9. A.R. Gourlay and J.B. Stephen, "Geometric coded aperture masks," Appl. Opt. **22**, 4042–4047 (1983).

10. K.A. Nugent, "Coded aperture imaging: a Fourier space analysis," Appl. Opt. **26**, 563–569 (1987).

11. P. Potuluri, M.R. Fetterman, and D.J. Brady, "High depth of field microscopic imaging using an interferometric camera," Opt. Express **8**, 624–630 (2001), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-8-11-624.

12. P. Potuluri, U. Gopinathan, J.R. Adleman, and D.J. Brady, "Lensless sensor system using a reference structure," Opt. Express **11**, 965–974 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-8-965.

13. U. Gopinathan, D.J. Brady, and N.P. Pitsianis, "Coded apertures for efficient pyroelectric motion tracking," Opt. Express **11**, 2142–2152 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-18-2142.

14. P. Potuluri, M. Xu, and D.J. Brady, "Imaging with random 3D reference structures," Opt. Express **11**, 2134–2141 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-18-2134.

15. T. Cannon and E. Fenimore, "Coded aperture imaging: many holes make light work," Opt. Eng. **19**, 283–289 (1980).

16. G. Barbastathis, M. Balberg, and D.J. Brady, "Confocal microscopy with a volume holographic filter," Opt. Lett. **24**, 811–813 (1999).

17. A. Sinha and G. Barbastathis, "Volume holographic telescope," Opt. Lett. **27**, 1690–1692 (2002).

18. J.W. Goodman, *Statistical Optics* (John Wiley & Sons, 2000), Ch. 6, p. 237.

19. J. Rosen, "Three-dimensional optical Fourier transform and correlation," Opt. Lett. **22**, 964–966 (1997).

20. B. Javidi and E. Tajahuerce, "Three-dimensional object recognition by use of digital holography," Opt. Lett. **25**, 610–612 (2000).

21. J.J. Esteve-Taboada, D. Mas, and J. García, "Three-dimensional object recognition by Fourier transform profilometry," Appl. Opt. **38**, 4760–4765 (1999).

22. Y. Frauel and B. Javidi, "Digital three-dimensional image correlation using computer-reconstructed integral imaging," Appl. Opt. **41**, 5488–5496 (2002).

23. N. Saito et al., "Discriminant feature extraction using empirical probability density estimation and a local basis library," Pattern Recognition **35**, 2841–2852 (2002).

**OCIS Codes**

(100.5010) Image processing : Pattern recognition

(110.0110) Imaging systems : Imaging systems

(110.2990) Imaging systems : Image formation theory

**ToC Category:**

Research Papers

**History**

Original Manuscript: August 26, 2003

Revised Manuscript: September 29, 2003

Published: October 6, 2003

**Citation**

Arnab Sinha and David Brady, "Size and shape recognition using measurement statistics and random 3D reference structures," Opt. Express **11**, 2606-2618 (2003)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-11-20-2606


### Figures

Figs. 1–12.
