## Projected fringe profilometry with multiple measurements to form an entire shape

Optics Express, Vol. 16, Issue 6, pp. 4069-4077 (2008)

http://dx.doi.org/10.1364/OE.16.004069


### Abstract

A 3D sensing method that retrieves an entire shape from many segmented profiles is described. Image registration is not required. Advantages of this method include (1) very high integration accuracy, (2) improved robustness, (3) reduced computational time, (4) very low computational cost for the data fusion, and (5) the capability of compensating distortions of the optical system at every pixel location.

© 2008 Optical Society of America

## 1. Introduction


## 2. Principle

### 2.1 Photogrammetry of a projected fringe system

A sinusoidal fringe pattern defined on the $x_g$-$y_g$ plane, with fringes normal to the $x_g$-axis, is projected onto the inspected object. Its phase distribution can be expressed mathematically as

$$\varphi = \frac{2\pi x_g}{d_g}, \tag{1}$$

where $d_g$ is the period of the fringes.

A point on the object surface is described by the world coordinates ($x$, $y$, $z$). The relationship between the fringe coordinates and the world coordinates is given by Eq. (2) [equation not recovered from the source], where the $r_{ij}$ are coefficients associated with magnification and coordinate rotation, and the $t_i$ are shifting parameters. The superscript ($p$) denotes a projector parameter. Distortion from the projection lens is addressed with $\Delta x$, $\Delta y$, and $\Delta z$.

The image of the object is recorded on the detector plane ($x_d$, $y_d$). A similar relationship between the detector coordinates and the world coordinates is represented as Eq. (3) [equation not recovered from the source], where the superscript ($c$) denotes a camera parameter. For a fixed detection location, the terms on the left side of Eq. (3) are two constants. Equation (3) therefore simplifies to

$$x = a_1 + a_2 z, \qquad y = b_1 + b_2 z, \tag{4}$$

where $a_i$ and $b_i$ are constants that can be determined with a proper calibration scheme. Substituting Eq. (4) into Eq. (2), the variable $z$ can be represented as a function of phase. Even when distortions of the projection lens are taken into account, it can still be approximated by a polynomial in the phase,

$$z = \sum_{i=0}^{N} c_i\,\varphi^{i}. \tag{5}$$

Identification of the parameters $c_i$ can be carried out with a proper calibration scheme; the calibration procedure is described in Section 2.2.
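The depth recovery above reduces, at each pixel, to evaluating a polynomial $z = \sum_i c_i \varphi^i$ in the absolute phase. A minimal NumPy sketch of this conversion follows; the per-pixel coefficient layout is a hypothetical choice for illustration, not taken from the paper:

```python
import numpy as np

def phase_to_depth(phi, c):
    """Convert an absolute-phase map to depth via z = sum_i c_i * phi**i.

    phi : (H, W) absolute phase at each pixel
    c   : (H, W, N+1) per-pixel polynomial coefficients, c[..., i] = c_i
    """
    z = np.zeros_like(phi)
    # Horner evaluation, from the highest-order coefficient down.
    for i in range(c.shape[-1] - 1, -1, -1):
        z = z * phi + c[..., i]
    return z

# Toy check at a single "pixel" with z = 1 + 2*phi:
phi = np.array([[0.5]])
c = np.array([[[1.0, 2.0]]])
print(phase_to_depth(phi, c))  # [[2.]]
```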

### 2.2 Parameter identification

Figure 2(a) shows the setup used to identify the parameters $c_i$ of a 3D sensing system. A sinusoidal pattern is projected onto a flat surface perpendicular to the $z$ direction, and the fringes on the flat are captured by a CCD detector. The phase measurement is repeated as the flat is successively translated to different $z$ positions. With the Fourier transform method [1] or the phase-shifting method [2], absolute phases are retrieved from the wrapped $2\pi$ data. Thus, a set of absolute phases and the associated $z$ positions is obtained at each pixel location. A subsequent curve-fitting process determines the $\varphi$-to-$z$ relation at each pixel, and the parameters $c_i$ are thereby identified.

To identify the parameters $a_i$ and $b_i$, a 2D fringe pattern is used as the calibration tool. The reflectivity of this pattern is sinusoidal and is expressed by Eq. (6) [equation not recovered from the source], where $d_x$ and $d_y$ are the known periods in the $x$- and $y$-directions, respectively. The pattern is placed at known planes $z = z_i$ in front of the CCD camera, as shown in Fig. 2(b), and images of the pattern at the various $z$ positions are sequentially captured. Thus, a series of 2D absolute phases and the corresponding $z$ values is obtained at each pixel location. Since the periods of the 2D fringes are known, the absolute phases can be used to represent the associated transverse positions, and the relationships between $x$ and $z$ (and between $y$ and $z$) can be found. With simple algebraic manipulation, the parameters $a_i$ and $b_i$ can then be evaluated.
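The curve-fitting step described above can be sketched for a single pixel with NumPy's polynomial least-squares fit. The phase-versus-$z$ law below is synthetic, standing in for measured calibration data:

```python
import numpy as np

# Synthetic calibration data for one pixel: the flat is moved to 20
# known z positions and the absolute phase is measured at each one.
z_known = np.linspace(0.0, 50.0, 20)             # mm, calibration range
phi = 2.0 + 0.04 * z_known + 1e-4 * z_known**2   # synthetic phase-vs-z law

# Fit z as an N-th order polynomial in phi (the form of Eq. (5)); N = 4
# as in the paper, so at least 5 samples are required.
c = np.polynomial.polynomial.polyfit(phi, z_known, deg=4)  # c[i] = c_i

# The fitted polynomial now converts any measured phase back to depth.
z_est = np.polynomial.polynomial.polyval(phi, c)
print(np.max(np.abs(z_est - z_known)))  # maximum fit residual (mm)
```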

### 2.3 Calibration of multiple measurements

The superscript ($l$) denotes a parameter of the $l$-th sensing system. Setups for calibrating the $\varphi$-to-$z$ conversion and the $z$-to-$x$ (or $y$) conversion are shown in Fig. 4 and Fig. 5, respectively. The calibration tool (either the optical flat or the 2D pattern) is placed at a known plane $z = z_i$, and the distribution of fringes on the tool is recorded by all the CCD cameras at the same time. The phase measurement is repeated at different $z$ positions and stops when enough measurements have been obtained to execute the curve-fitting algorithm. Thus, for each sensing system, a series of absolute phases and the associated depths is obtained at each pixel location. The relationships between $\varphi^{(l)}$ and $z$, between $z$ and $x$, and between $z$ and $y$ in each sensing system are determined, and the parameters $a_i^{(l)}$, $b_i^{(l)}$, and $c_i^{(l)}$ are then identified by the curve-fitting algorithm.

With these parameters, the measurement of each sensing system is converted directly to the world coordinates ($x$, $y$, $z$). Thus, all the segmented profiles are retrieved in the same coordinate system, and integration becomes the simple task of displaying all the point clouds in that coordinate system. Once all the sensing systems are calibrated, all the optical elements should remain rigid to ensure the repeatability of the data fusion.
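Because every calibrated sensing system reports its points directly in world coordinates, the data fusion reduces to concatenating point clouds; no registration transform is estimated. A schematic sketch, where the arrays are hypothetical per-sensor conversion results:

```python
import numpy as np

def fuse(point_clouds):
    """Merge per-sensor point clouds already expressed in world coordinates.

    point_clouds : list of (M_l, 3) arrays of (x, y, z) points from the
                   l-th sensing system. No registration is needed because
                   each system was calibrated against the same planes.
    """
    return np.vstack(point_clouds)

view1 = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 10.5]])  # sensor 1 points
view2 = np.array([[5.0, 2.0, 12.0]])                    # sensor 2 points
entire_shape = fuse([view1, view2])
print(entire_shape.shape)  # (3, 3)
```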

## 4. Experiments

A bowl, … mm in diameter and 40 mm in depth, was chosen as the inspected sample. Two 3D sensing systems located at different viewpoints were used to perform the segmented measurements. In each system, the projected fringes were observed by a CCD camera with 1024×1024 pixels at 12-bit resolution. The recorded images from the two viewpoints are shown in Figs. 6(a) and 6(b), respectively. Data on the edge of the bowl were lost because of shading.

The retrieved wrapped phases were distributed between $-\pi$ and $\pi$. Phase unwrapping was then performed to eliminate the discontinuities; in our experiment, Goldstein's algorithm was used [10].

The absolute phase at each pixel was first converted to its $z$ position, and the $z$ value was then used to determine the $x$- and $y$-locations. The relationship between $\varphi^{(l)}$ and $z$ becomes more accurate as the order of the polynomial, $N$, is increased. Our experiments showed that the coefficients could be determined with micron-order accuracy when $N = 4$. At least 5 measurements at different $z$ positions were therefore required to identify the parameters; in our setup, 20 measurements were used for the curve-fitting algorithm. The calibration range along the $z$-axis was 50 mm, which ensured that all the data points were obtained within the calibrated volume. With the phase-shifting technique, the depth accuracy of each segmented system was approximately 5 µm. The transverse accuracy, however, was only 127 µm, limited mainly by the sampling density of the CCD camera.
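The wrapped-to-absolute phase step can be illustrated in one dimension with NumPy's `np.unwrap`. The paper itself applies Goldstein's 2D algorithm [10], so this is only a toy sketch of removing the $2\pi$ jumps:

```python
import numpy as np

# A linearly increasing true phase, as produced by straight fringes.
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)

# The detector only yields the wrapped phase in (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# Unwrapping adds integer multiples of 2*pi to remove the jumps.
unwrapped = np.unwrap(wrapped)
print(np.allclose(unwrapped, true_phase))  # True
```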

## 7. Conclusion

## References and links

1. M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. **22**, 3977–3982 (1983). [CrossRef] [PubMed]

2. V. Srinivasan, H. C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects," Appl. Opt. **23**, 3105–3108 (1984). [CrossRef] [PubMed]

3. D. R. Burton and M. J. Lalor, "Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping," Appl. Opt. **33**, 2939–2948 (1994). [CrossRef] [PubMed]

4. W. H. Su and H. Liu, "Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities," Opt. Express **14**, 9178–9187 (2006). [CrossRef] [PubMed]

5. W. H. Su, "Color-encoded fringe projection for 3D shape measurements," Opt. Express **15**, 13167–13181 (2007). [CrossRef] [PubMed]

6. A. W. Gruen, "Geometrically constrained multiphoto matching," Photogramm. Eng. Remote Sens. **54**, 633–641 (1988).

7. L. G. Brown, "A survey of image registration techniques," ACM Comput. Surv. **24**, 325–376 (1992). [CrossRef]

8. C. Reich, R. Ritter, and J. Thesing, "3-D shape measurement of complex objects by combining photogrammetry and fringe projection," Opt. Eng. **39**, 224–231 (2000). [CrossRef]

9. A. Dipanda, S. Woo, F. Marzani, and J. M. Bilbault, "3-D shape reconstruction in an active stereo vision system using genetic algorithms," Pattern Recog. **36**, 2143–2159 (2003). [CrossRef]

10. E. Zappa and G. Busca, "Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry," Opt. Lasers Eng. **46**, 106–116 (2008). [CrossRef]

**OCIS Codes**

(110.6880) Imaging systems : Three-dimensional image acquisition

(120.4630) Instrumentation, measurement, and metrology : Optical inspection

**ToC Category:**

Instrumentation, Measurement, and Metrology

**History**

Original Manuscript: January 29, 2008

Revised Manuscript: March 7, 2008

Manuscript Accepted: March 7, 2008

Published: March 11, 2008

**Citation**

Wei-Hung Su, Cho-Yo Kuo, Chun-Chieh Wang, and Chung-Fan Tu, "Projected fringe profilometry with multiple
measurements to form an entire shape," Opt. Express **16**, 4069-4077 (2008)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-16-6-4069

