## Simultaneous multiple view high resolution surface geometry acquisition using structured light and mirrors

Optics Express, Vol. 21, Issue 6, pp. 7222-7239 (2013)

http://dx.doi.org/10.1364/OE.21.007222

### Abstract

Knowledge of the surface geometry of an imaging subject is important in many applications. This information can be obtained via a number of different techniques, including time of flight imaging, photogrammetry, and fringe projection profilometry. Existing systems may have restrictions on instrument geometry, require expensive optics, or require moving parts in order to image the full surface of the subject. An inexpensive generalised fringe projection profilometry system is proposed that can account for arbitrarily placed components and use mirrors to expand the field of view. It simultaneously acquires multiple views of an imaging subject, producing a cloud of points that lie on its surface, which can then be processed to form a three dimensional model. A prototype of this system was integrated into an existing Diffuse Optical Tomography and Bioluminescence Tomography small animal imaging system and used to image objects including a mouse-shaped plastic phantom, a mouse cadaver, and a coin. A surface mesh generated from surface capture data of the mouse-shaped plastic phantom was compared with ideal surface points provided by the phantom manufacturer: 50% of points were found to lie within 0.1 mm of the surface mesh, 82% within 0.2 mm, and 96% within 0.4 mm.

© 2013 OSA

## 1. Introduction


## 2. Surface capture techniques

### 2.1. Light detection and ranging


### 2.2. Photogrammetry


### 2.3. Fringe projection profilometry


When a sinusoidal fringe pattern is projected onto a background plane and viewed by a camera, the imaged intensity can be written

*I*(*x*) = *r*(*x*) cos(2π*fx* + *ψ_b*(*x*)),

where *x* is the spatial coordinate in the plane, *r*(*x*) is a function representing the spatially dependent reflectance of the plane, *f* is the spatial frequency of the fringe pattern, and *ψ_b*(*x*) is the change in apparent phase of the pattern as a result of the plane not being orthogonal to the projector orientation. It is possible to calculate *y*, the spatial coordinate perpendicular to the plane, through knowledge of *ψ_b*(*x*).

When an object is placed on the plane, the imaged intensity becomes

*I*(*x*) = *r*(*x*) cos(2π*fx* + *ψ_b*(*x*) + *ψ_o*(*x*)),

where *ψ_o*(*x*) is the apparent phase change as a result of the object, and *r*(*x*) now contains the spatially dependent reflectance of the surface and object. The value of *y* at various points on the object can be calculated using *ψ_o*(*x*), but first it is necessary to extract *ψ_o*(*x*) from *I*(*x*). This task is non-trivial as the *ψ_o*(*x*) function is one of three additive terms operated on by a cosine function, which is itself coupled to a spatial reflectance term. A number of methods can be used to extract the argument of the cosine function, including Fourier filtering [19, 20] and phase stepping [17]. These methods recover the wrapped phase (2π*fx* + *ψ_b*(*x*) + *ψ_o*(*x*)) mod 2π. This must then be “unwrapped” in order to extract 2π*fx* + *ψ_b*(*x*) + *ψ_o*(*x*), which can then be used in a height calculation.


Spatial phase unwrapping techniques first locate phase wrapping events, which appear as discontinuities in the wrapped phase map (a jump from π to −π, or vice versa). Once these locations have been determined, and a reference pixel has been selected for which it is assumed that no phase wrapping has occurred, then offsets can be added to regions isolated by phase wrapping events to correct for the lost multiples of 2π. The unwrapping process moves outward from the reference pixel in an iterative manner, so that the correction applied to a pixel is derived from an adjacent pixel which has already been phase unwrapped [24].

This approach assumes that the true phase difference between adjacent pixels is less than 2π. It is not possible to uniquely unwrap phase changes of more than 2π between adjacent pixels, which result from large distances between the spatial points imaged by the pixels, and so these events necessarily create errors. Finally, due to the iterative nature of the unwrapping process, errors made during the unwrapping process propagate to affect the unwrapping of subsequent pixels.


Structured light coding strategies [15] can remove the need for spatial unwrapping. By projecting additional binary patterns that encode which phase period (each a region of phase extent 2π) is illuminated, it is possible to uniquely specify the degree of phase wrapping at each pixel using ⌈log₂ *n*⌉ extra images, where there are *n* phase periods of size 2π [25].
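A minimal sketch of this period-labelling idea follows, using plain binary coding for brevity (ref. [25] uses a Gray code, which is more robust at period boundaries); the number of periods and pixel layout are invented.

```python
import math
import numpy as np

# Sketch of labelling n phase periods with ceil(log2(n)) extra binary images.
n_periods = 11
n_images = math.ceil(math.log2(n_periods))   # 4 binary images label 11 periods

pixels = np.arange(330)
period_index = pixels // 30                  # true period of each pixel (0..10)

# Binary image b holds bit b of the period index observed at each pixel.
images = [(period_index >> b) & 1 for b in range(n_images)]

# Decoding reassembles the period index, i.e. the wrap count, per pixel.
decoded = sum(images[b] << b for b in range(n_images))
```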

Alternatively, a pattern can be projected at a spatial frequency low enough that the phase across the whole field of view spans less than 2π. While this phase map may be too noisy to use for the intended application, it can be used in the unwrapping process of a higher frequency. The use of one frequency or several frequencies to aid the unwrapping process is called temporal phase unwrapping [23].

The final step, conversion of phase to height, takes *ψ_o*(*x*) as an input to the process. The coordinates of camera pixels imaging the background plane are another prerequisite, and must be obtained prior to imaging. If the phase extracted by the previous processes results in 2π*fx* + *ψ_b*(*x*) + *ψ_o*(*x*), then it is necessary to first separate *ψ_o*(*x*) from 2π*fx* + *ψ_b*(*x*). This can be accomplished by imaging the scene without the object of interest, and then subtracting the resulting phase map from the original one. Using the projector approximation, the height of the object from the background, *h*(*x*), can be expressed as a function of the phase, *ψ_o*(*x*), the height of the camera and projector pupils from the background plane, *l*, the distance between the camera and projector pupils, *d*, and the frequency of the pattern being projected, *f* [26]. Generalised formulations that relax the restrictions on component placement have also been developed [27, 28], as have refinements of Fourier transform profilometry [29].
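As a concrete sketch of the phase-to-height step, the relation below is one common form of the crossed-optical-axes formula from the fringe-projection literature (sign conventions differ between formulations), with invented values for *l*, *d* and *f*:

```python
import numpy as np

# Sketch of phase-to-height conversion in a crossed-optical-axes geometry:
# camera and projector pupils at height l above the background, separated
# by d, projecting fringes of frequency f. All numbers are illustrative.
l = 500.0   # pupil height above the background plane (mm)
d = 100.0   # camera-projector pupil separation (mm)
f = 0.1     # fringe frequency on the background plane (cycles per mm)

def height_from_phase(psi_o):
    """h(x) = l * psi_o(x) / (psi_o(x) - 2*pi*f*d)."""
    return l * psi_o / (psi_o - 2 * np.pi * f * d)

def phase_from_height(h):
    """Inverse relation, used here only to simulate phase measurements."""
    return 2 * np.pi * f * d * h / (h - l)

heights = np.array([0.0, 5.0, 10.0, 20.0])   # invented object heights (mm)
recovered = height_from_phase(phase_from_height(heights))
```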

### 2.4. Post-processing


### 2.5. Small animal imaging instruments


## 3. Methods

The use of mirrors to obtain multiple simultaneous views introduces several complications:

- The use of a mirror introduces a virtual camera position from which rays arriving at the mirror appear to be observed. The standard crossed-axis instrument geometry is less appropriate in this case, as its use imposes restrictions on the positions of any mirrors and requires the use of background planes placed in specific locations and orientations.
- The imaging of two or more distinct regions on the camera adds complexity to the use of Fourier domain phase extraction techniques.
- The addition of extra views increases the probability of imaging regions which appear to be discontinuous in height with respect to the camera’s perspective, and which are spatially separate from the other regions, which prevents the use of standard phase-unwrapping techniques.

### 3.1. Instrument geometry and conversion of phase information to spatial coordinates

The proposed system generalises the crossed-axis geometry of [26] to arbitrarily placed components. Consider a camera with a pupil at **c**, and a projector with a pupil at **p**. The projector projects a sinusoidal pattern with a known spatial frequency *f* in a plane centred around the point **o** (at which location the pattern phase is zero), with **v̂** the direction of increasing projector pattern phase, and **n̂** the unit normal of this plane. Then, the pattern projected (see Fig. 3) can be described:

*ψ*(**y**) = 2π*f* (**y** − **o**) · **v̂**,

where **y** is a point within the projected plane.

Consider a point **x** on the surface of the imaging subject. If **r̂** is the direction of the ray from the camera to **x** (which can be deduced via knowledge of the camera), a ray from the projector may also intersect **x**. Let the ray from the projector be **q̂**, and the intersection of that ray with the projector plane be **y**. Then:

**y** = **p** + *t* **q̂**,

where:

*t* = ((**o** − **p**) · **n̂**) / (**q̂** · **n̂**).

The phase, *ψ*, associated with **y** is:

*ψ* = 2π*f* (**y** − **o**) · **v̂**.

Using these relations, and the knowledge that **x** lies on the line defined by **c** and **r̂**, it is possible to derive an expression for **x**:

**x** = **c** + *s* **r̂**,

where:

*s* = ((**p** − **c**) · **w**) / (**r̂** · **w**),  **w** = ((**o** − **p**) · **n̂**) **v̂** − (*ψ*/(2π*f*) − (**p** − **o**) · **v̂**) **n̂**.

### 3.2. Extraction of wrapped phase

Wrapped phase is extracted using *N* phase-stepped patterns, following [26]. The projected patterns, *p_n*, at spatial frequency *f* and phase offset *ϕ_n* can be expressed:

*p_n*(**y**) = 1 + cos(2π*f* (**y** − **o**) · **v̂** + *ϕ_n*),  *ϕ_n* = 2π*n*/*N*,  *n* = 0, …, *N* − 1.

The imaged patterns, *g_n*, take the form:

*g_n*(**x**, *f*) = *A*(**x**) + *B*(**x**) cos(*ψ*(**x**, *f*) + *ϕ_n*),

where:

*ψ*(**x**, *f*) = 2π*f* (**y** − **o**) · **v̂**,

and **y** is a function of **x**; *A*(**x**) and *B*(**x**) absorb the ambient illumination and the surface reflectance. It is not possible to extract *ψ*(**x**, *f*) from *g_n*(**x**, *f*) directly, but it is possible to extract *ψ*(**x**, *f*) (mod 2π) using the following equation:

*ψ*(**x**, *f*) (mod 2π) = atan2(−Σ_*n* *g_n*(**x**, *f*) sin *ϕ_n*, Σ_*n* *g_n*(**x**, *f*) cos *ϕ_n*).

Examples of wrapped phase maps can be found in Fig. 4.
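A sketch of wrapped-phase extraction from *N* phase-stepped images follows, assuming equally spaced offsets *ϕ_n* = 2π*n*/*N* (the standard least-squares estimator); the offset, modulation depth, and test phases are invented values.

```python
import numpy as np

# Sketch of the N-step estimator for psi (mod 2*pi). Each imaged pattern is
# g_n = A + B*cos(psi + phi_n) with equally spaced offsets phi_n = 2*pi*n/N.
N = 4
phi = 2 * np.pi * np.arange(N) / N
psi_true = np.array([0.3, 1.1, 2.0, 4.5, 6.0])   # invented test phases
A, B = 0.5, 0.4                                  # invented offset/modulation
g = np.array([A + B * np.cos(psi_true + phi_n) for phi_n in phi])

# Since sum_n g_n*cos(phi_n) is proportional to cos(psi) and
# sum_n g_n*sin(phi_n) to -sin(psi), the wrapped phase follows from a
# quadrant-correct arctangent.
num = -np.sum(g * np.sin(phi)[:, None], axis=0)
den = np.sum(g * np.cos(phi)[:, None], axis=0)
psi_wrapped = np.arctan2(num, den) % (2 * np.pi)
```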

### 3.3. Phase unwrapping

Spatial phase unwrapping techniques assume that the true phase difference between adjacent pixels is less than 2π. Accordingly, errors in the determination of phase wrapping events propagate through spatial regions. It is also necessary to know a priori the phase of one pixel in the image so that it can be used as a reference; without this, only relative phase can be determined. In addition, it is not possible to unwrap measurements suffering from phase wrapping that results from phase discontinuities of 2π*n* where *n* > 1, as these discontinuities are degenerate in the wrapped phase representation. An advantage of these techniques, however, is that they require only a single wrapped phase map (which itself can be calculated from a single image).


The alternative used here is to project patterns at several spatial frequencies [46]. Each frequency can be written *f* = *n*/*d*, where *n* is the number of sinusoidal periods across the field of projection, and *d* is the size of the field of projection in a plane of interest. The maximum phase difference for the spatial frequency *f*_1 = 1/*d* is 2π, and so phase wrap events cannot occur when using this frequency. The wrapped phase measurement error is constant in average absolute magnitude independent of frequency, and so at low frequencies is relatively large in comparison to the range of phase values. Consequently, it is undesirable to reconstruct surface geometry at *f*_1. However, it is possible to use this measurement as prior information to aid in unwrapping phase maps for higher spatial frequencies, because the phase at a spatial point **x** is linearly dependent on the spatial frequency.

For example, given measurements at two spatial frequencies *f*_1 and *f*_2 = 2*f*_1, we can obtain *ψ*(**x**, *f*_1) and *ψ*(**x**, *f*_2) (mod 2π). Due to the linear relationship, we know that *ψ*(**x**, *f*_2) = 2*ψ*(**x**, *f*_1) in the absence of measurement noise. In the presence of measurement noise, we can estimate *ψ*(**x**, *f*_2) as:

*ψ̃*(**x**, *f*_2) = 2*ψ*(**x**, *f*_1),

where *ψ̃*(**x**, *f*_2) is the estimate of *ψ*(**x**, *f*_2). By then simulating the wrapping process on *ψ̃*(**x**, *f*_2), it is possible to compare *ψ̃*(**x**, *f*_2) (mod 2π) with *ψ*(**x**, *f*_2) (mod 2π), correct *ψ̃*(**x**, *f*_2) for the difference, and so calculate *ψ*(**x**, *f*_2). It is possible that error in *ψ̃*(**x**, *f*_2) or *ψ*(**x**, *f*_2) (mod 2π) may be sufficient to remove a phase wrapping event or add one extra. In these cases it is necessary that the spatial frequencies chosen are close enough that it is possible to identify and correct for this.
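This two-frequency correction step can be sketched as follows, with *f*_2 = 2*f*_1 and synthetic phase values; the noise term is invented to stand in for measurement error.

```python
import numpy as np

# Sketch of two-frequency temporal unwrapping with f2 = 2*f1. The phase at
# f1 never exceeds 2*pi, so it needs no unwrapping; doubling it estimates
# the f2 phase, and the measured wrapped f2 phase supplies the correction.
psi2_true = np.linspace(0.0, 12.0, 100)      # true f2 phase: nearly two wraps
psi1 = psi2_true / 2.0                       # f1 phase, everywhere below 2*pi
wrapped2 = psi2_true % (2 * np.pi)           # what is actually measured at f2

noise = 0.2 * np.sin(np.arange(100))         # invented stand-in for error
estimate = 2.0 * psi1 + noise                # noisy estimate of the f2 phase

# Correct the estimate by the wrapped difference, mapped into [-pi, pi).
diff = (wrapped2 - estimate + np.pi) % (2 * np.pi) - np.pi
corrected = estimate + diff
```

As long as the error in the estimate stays below π, the correction snaps the estimate back onto the measured wrapped phase with the right wrap count; larger errors would add or remove a wrapping event, which is why the frequencies must be chosen close together.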

### 3.4. Point cloud processing


### 3.5. Experimental instrument and test cases

- one C9100-14 ImageEM-1K camera (Hamamatsu Photonics K.K., Hamamatsu City, Japan), which is used in both the surface capture and bioluminescence imaging, but was chosen due to the high sensitivity requirement of bioluminescence imaging
- two Pocket Projector MPro120 (3M United Kingdom, Berkshire, United Kingdom) units to project the patterns used in the surface capture process
- one L490MZ Motorised Lab Jack (Thorlabs, Ely, United Kingdom), which is utilised in the surface capture system calibration procedure, and is also used in bioluminescence imaging
- one NT59-871 25mm Compact Fixed Focal Length Lens (Edmund Optics, York, United Kingdom), which is also used in bioluminescence imaging
- two N-BK7 75mm Enhanced Aluminum Coated Right Angle Mirrors (Edmund Optics, York, United Kingdom), which can be placed in any suitable position within the surface capture system, and are also used in bioluminescence imaging
- one FB580-10 Bandpass Filter (Thorlabs, Ely, United Kingdom), which is used to attenuate the projector signal to prevent saturation of the camera

## 4. Results and discussion


## 5. Conclusions

## Acknowledgments

## References and links

1. J. Guggenheim, H. Dehghani, H. Basevi, I. Styles, and J. Frampton, “Development of a multi-view, multi-spectral bioluminescence tomography small animal imaging system,” Proc. SPIE **8088**, 80881K (2011).
2. J. Guggenheim, H. Basevi, I. Styles, J. Frampton, and H. Dehghani, “Multi-view, multi-spectral bioluminescence tomography,” in *Biomedical Optics*, OSA Technical Digest (Optical Society of America, 2012), paper BW4A.7.
3. C. Kuo, O. Coquoz, T. Troy, H. Xu, and B. Rice, “Three-dimensional reconstruction of in vivo bioluminescent sources based on multispectral imaging,” J. Biomed. Opt. **12**, 024007 (2007).
4. A. Gibson, J. Hebden, and S. Arridge, “Recent advances in diffuse optical imaging,” Phys. Med. Biol. **50**, R1–R43 (2005).
5. A. Cong, W. Cong, Y. Lu, P. Santago, A. Chatziioannou, and G. Wang, “Differential evolution approach for regularized bioluminescence tomography,” IEEE Trans. Biomed. Eng. **57**, 2229–2238 (2010).
6. S. Arridge and M. Schweiger, “Image reconstruction in optical tomography,” Phil. Trans. R. Soc. B **352**, 717–726 (1997).
7. B. Brooksby, H. Dehghani, B. Pogue, and K. Paulsen, “Near-infrared (NIR) tomography breast image reconstruction with a priori structural information from MRI: algorithm development for reconstructing heterogeneities,” IEEE J. Sel. Topics Quantum Electron. **9**, 199–209 (2003).
8. M. Allard, D. Côté, L. Davidson, J. Dazai, and R. Henkelman, “Combined magnetic resonance and bioluminescence imaging of live mice,” J. Biomed. Opt. **12**, 034018 (2007).
9. T. Lasser, A. Soubret, J. Ripoll, and V. Ntziachristos, “Surface reconstruction for free-space 360° fluorescence molecular tomography and the effects of animal motion,” IEEE Trans. Med. Imag. **27**, 188–194 (2008).
10. C. Li, G. Mitchell, J. Dutta, S. Ahn, R. Leahy, and S. Cherry, “A three-dimensional multispectral fluorescence optical tomography imaging system for small animals based on a conical mirror design,” Opt. Express **17**, 7571–7585 (2009).
11. A. Kumar, S. Raymond, A. Dunn, B. Bacskai, and D. Boas, “A time domain fluorescence tomography system for small animal imaging,” IEEE Trans. Med. Imag. **27**, 1152–1163 (2008).
12. R. Lange and P. Seitz, “Solid-state time-of-flight range camera,” IEEE J. Quantum Electron. **37**, 390–397 (2001).
13. A. Dorrington, M. Cree, A. Payne, R. Conroy, and D. Carnegie, “Achieving sub-millimetre precision with a solid-state full-field heterodyning range imaging camera,” Meas. Sci. Technol. **18**, 2809–2816 (2007).
14. K. Kraus, *Photogrammetry: Geometry from Images and Laser Scans* (de Gruyter, 2007).
15. J. Salvi, J. Pages, and J. Batlle, “Pattern codification strategies in structured light systems,” Pattern Recogn. **37**, 827–849 (2004).
16. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photon. **3**, 128–160 (2011).
17. V. Srinivasan, H. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. **23**, 3105–3108 (1984).
18. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Laser. Eng. **48**, 133–140 (2010).
19. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Laser. Eng. **35**, 263–284 (2001).
20. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shape,” Appl. Opt. **22**, 3977–3982 (1983).
21. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. **72**, 156–160 (1982).
22. T. Judge and P. Bryanston-Cross, “A review of phase unwrapping techniques in fringe analysis,” Opt. Laser. Eng. **21**, 199–239 (1994).
23. H. Saldner and J. Huntley, “Temporal phase unwrapping: application to surface profiling of discontinuous objects,” Appl. Opt. **36**, 2770–2775 (1997).
24. E. Zappa and G. Busca, “Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry,” Opt. Laser. Eng. **46**, 106–116 (2008).
25. G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of Gray-code and phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. **38**, 6565–6573 (1999).
26. F. Berryman, P. Pynsent, and J. Cubillo, “A theoretical comparison of three fringe analysis methods for determining the three-dimensional shape of an object in the presence of noise,” Opt. Laser. Eng. **39**, 35–50 (2003).
27. Z. Wang, H. Du, and H. Bi, “Out-of-plane shape determination in generalized fringe projection profilometry,” Opt. Express **14**, 12122–12133 (2006).
28. Z. Wang, H. Du, S. Park, and H. Xie, “Three-dimensional shape measurement with a fast and accurate approach,” Appl. Opt. **48**, 1052–1061 (2009).
29. X. Mao, W. Chen, and X. Su, “Improved Fourier-transform profilometry,” Appl. Opt. **46**, 664–668 (2007).
30. H. Dehghani, M. Eames, P. Yalavarthy, S. Davis, S. Srinivasan, C. Carpenter, B. Pogue, and K. Paulsen, “Near infrared optical tomography using NIRFAST: algorithm for numerical model and image reconstruction,” Commun. Numer. Methods En. **25**, 711–732 (2009).
31. R. Schulz, J. Ripoll, and V. Ntziachristos, “Noncontact optical tomography of turbid media,” Opt. Lett. **28**, 1701–1703 (2003).
32. Z. Geng, “Method and apparatus for omnidirectional three dimensional imaging,” U.S. Patent 6,744,569 (2004).
33. R. Schulz, J. Ripoll, and V. Ntziachristos, “Experimental fluorescence tomography of tissues with noncontact measurements,” IEEE Trans. Med. Imag. **23**, 492–500 (2004).
34. Z. Geng, “Diffuse optical tomography system and method of use,” U.S. Patent 7,242,997 (2007).
35. D. Nilson, M. Cable, B. Rice, and K. Kearney, “Structured light imaging apparatus,” U.S. Patent 7,298,415 (2007).
36. H. Meyer, A. Garofalakis, G. Zacharakis, S. Psycharakis, C. Mamalaki, D. Kioussis, E. Economou, V. Ntziachristos, and J. Ripoll, “Noncontact optical imaging in mice with full angular coverage and automatic surface extraction,” Appl. Opt. **46**, 3617–3627 (2007).
37. X. Jiang, L. Cao, W. Semmler, and J. Peter, “A surface recognition approach for in vivo optical imaging applications using a micro-lens-array light detector.”
38. G. Zavattini, S. Vecchi, G. Mitchell, U. Weisser, R. Leahy, B. Pichler, D. Smith, and S. Cherry, “A hyperspectral fluorescence system for 3D in vivo optical imaging,” Phys. Med. Biol. **51**, 2029–2043 (2006).
39. B. Rice, M. Cable, and K. Kearney, “3D in-vivo imaging and topography using structured light,” U.S. Patent 7,797,034 (2010).
40. D. Stearns, B. Rice, and M. Cable, “Method and apparatus for 3-D imaging of internal light sources,” U.S. Patent 7,860,549 (2010).
41. PerkinElmer, “IVIS 200 series,” http://www.perkinelmer.com/Catalog/Product/ID/IVIS200.
42. Berthold Technologies, “NightOWL LB 983 in vivo imaging system,” https://www.berthold.com/en/bio/in_vivo_imager_NightOWL_LB983.
43. Biospace Lab, “PhotonImager,” http://www.biospacelab.com/m-31-optical-imaging.html.
44. A. Chaudhari, F. Darvas, J. Bading, R. Moats, P. Conti, D. Smith, S. Cherry, and R. Leahy, “Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging,” Phys. Med. Biol. **50**, 5421–5441 (2005).
45. Biospace Lab, “4-View module,” http://www.biospacelab.com/m-89-4-view-module.html.
46. E. Li, X. Peng, J. Xi, J. Chicharo, J. Yao, and D. Zhang, “Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3D profilometry,” Opt. Express **13**, 1561–1569 (2005).
47. P. Cignoni, “MeshLab home page,” http://meshlab.sourceforge.net/.

**OCIS Codes**

(120.2830) Instrumentation, measurement, and metrology : Height measurements

(120.6650) Instrumentation, measurement, and metrology : Surface measurements, figure

(170.0110) Medical optics and biotechnology : Imaging systems

**ToC Category:**

Instrumentation, Measurement, and Metrology

**History**

Original Manuscript: December 19, 2012

Revised Manuscript: March 5, 2013

Manuscript Accepted: March 7, 2013

Published: March 14, 2013

**Virtual Issues**

Vol. 8, Iss. 4 *Virtual Journal for Biomedical Optics*

**Citation**

Hector R.A. Basevi, James A. Guggenheim, Hamid Dehghani, and Iain B. Styles, "Simultaneous multiple view high resolution surface geometry acquisition using structured light and mirrors," Opt. Express **21**, 7222-7239 (2013)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-21-6-7222


- R. Schulz, J. Ripoll, and V. Ntziachristos, “Experimental fluorescence tomography of tissues with noncontact measurements,” IEEE Trans. Med. Imag.23, 492–500 (2004). [CrossRef]
- Z. Geng, “Diffuse optical tomography system and method of use,” U.S. Patent 7,242,997 (2007).
- D. Nilson, M. Cable, B. Rice, and K. Kearney, “Structured light imaging apparatus,” U.S. Patent 7,298,415 (2007).
- H. Meyer, A. Garofalakis, G. Zacharakis, S. Psycharakis, C. Mamalaki, D. Kioussis, E. Economou, V. Ntziachristos, and J. Ripoll, “Noncontact optical imaging in mice with full angular coverage and automatic surface extraction,” Appl. Opt.46, 3617–3627 (2007). [CrossRef] [PubMed]
- X. Jiang, L. Cao, W. Semmler, and J. Peter, “A surface recognition approach for in vivo optical imaging applications using a micro-lens-array light detector,” in Biomedical Optics, OSA Technical Digest (Optical Society of America, 2012), paper BTu3A.1.
- G. Zavattini, S. Vecchi, G. Mitchell, U. Weisser, R. Leahy, B. Pichler, D. Smith, and S. Cherry, “A hyperspectral fluorescence system for 3d in vivo optical imaging,” Phys. Med. Biol.51, 2029–2043 (2006). [CrossRef] [PubMed]
- B. Rice, M. Cable, and K. Kearney, “3d in-vivo imaging and topography using structured light,” U.S. Patent 7,797,034 (2010).
- D. Stearns, B. Rice, and M. Cable, “Method and apparatus for 3-d imaging of internal light sources,” U.S. Patent 7,860,549 (2010).
- PerkinElmer, “Ivis 200 series,” http://www.perkinelmer.com/Catalog/Product/ID/IVIS200 .
- Berthold Technologies, “Nightowl lb 983 in vivo imaging system,” https://www.berthold.com/en/bio/in_vivo_imager_NightOWL_LB983 .
- Biospace Lab, “Photonimager,” http://www.biospacelab.com/m-31-optical-imaging.html .
- A. Chaudhari, F. Darvas, J. Bading, R. Moats, P. Conti, D. Smith, S. Cherry, and R. Leahy, “Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging,” Phys. Med. Biol.50, 5421–5441 (2005). [CrossRef] [PubMed]
- Biospace Lab, “4-view module,” http://www.biospacelab.com/m-89-4-view-module.html .
- E. Li, X. Peng, J. Xi, J. Chicharo, J. Yao, and D. Zhang, “Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3d profilometry,” Opt. Express13, 1561–1569 (2005). [CrossRef] [PubMed]
- P. Cignoni, “Meshlab home page,” http://meshlab.sourceforge.net/ .