## Dual-frequency pattern scheme for high-speed 3-D shape measurement

Optics Express, Vol. 18, Issue 5, pp. 5229-5244 (2010)

http://dx.doi.org/10.1364/OE.18.005229


### Abstract

A novel dual-frequency pattern is developed that combines a high-frequency sinusoid component with a unit-frequency sinusoid component, where the high-frequency component is used to generate robust phase information and the unit-frequency component is used to reduce phase unwrapping ambiguities. With our proposed pattern scheme, phase unwrapping overcomes the major shortcomings of conventional spatial phase unwrapping: phase jumps and discontinuities. Compared with conventional temporal phase unwrapping approaches, the proposed pattern scheme achieves higher-quality phase data using fewer patterns. To process data in real time, we also propose and develop look-up-table-based fast and accurate algorithms for phase generation and 3-D reconstruction. These fast algorithms can be applied to our pattern scheme as well as to traditional phase measuring profilometry. For a 640×480 video stream, we can generate phase data at 1063.8 frames per second and full 3-D coordinate point clouds at 228.3 frames per second. These rates are 25 and 10 times faster, respectively, than previously reported studies.

© 2010 Optical Society of America

## 1. Introduction

Structured light illumination (SLI) recovers 3-D shape by projecting coded patterns onto a target [2, 4], and phase measuring profilometry (PMP) [23] is among the most robust such strategies. High-frequency PMP patterns yield accurate but wrapped phase, which must be unwrapped either spatially [5, 10, 11, 12], where phase jumps and discontinuities cause errors, or temporally [9, 13, 14], including two-frequency schemes [7], at the cost of additional patterns.

Among multi-frequency approaches, Kim *et al*. [15] avoided phase unwrapping by using multi-frequency, four-step phase-shift sinusoidal fringe projection, which requires *N* ≥ 8 patterns. Li *et al*. [16] used a two-frequency grating in phase-measuring profilometry with *N* ≥ 6 patterns, where the high frequency had to be equal to *N*. Su and Liu [17] proposed a calibration-based two-frequency projected fringe profilometry that is robust, accurate, and single-shot for objects with large depth discontinuities, and a two-frequency fringe was employed by Guan *et al*. as one of two bases in their one-shot pattern strategy [2].

For real-time operation, single-pattern and fast-pattern techniques have also been explored, including coded structured light [18], Fourier transform profilometry [19], and adaptive structured light [20], as well as high-speed phase-shifting systems such as Zhang and Huang's real-time system [8], the modified two-plus-one phase-shifting algorithm [21], the fast three-step phase-shifting algorithm [22], and GPU-assisted reconstruction [6].

## 2. Phase measuring profilometry

In PMP, a sequence of phase-shifted sinusoid patterns is projected onto the target [25], described by

$$I_n^p(x^p, y^p) = A^p + B^p\cos\!\left(2\pi f y^p - \frac{2\pi n}{N}\right), \qquad (1)$$

where (*x^p*, *y^p*) is the coordinate of a pixel in the projector, *I^p_n* is the intensity of that pixel, *A^p* and *B^p* are constants, *f* is the frequency of the sine wave, *n* represents the phase-shift index, and *N* is the total number of phase shifts. Figure 1 shows a group of sine wave patterns projected with *N* = 3, *f* = 1, *A^p* = 127.5, and *B^p* = 127.5 for an 8-bit color depth projector.

Viewed by the camera, the reflected patterns are captured as

$$I_n^c(x^c, y^c) = A^c + B^c\cos\!\left(2\pi f y^p - \frac{2\pi n}{N}\right), \qquad (2)$$

where (*x^c*, *y^c*) is the coordinate of a pixel in the camera while *I^c_n* is the intensity of that pixel. To simplify the notation, the coordinate indices for both the camera and the projector will be dropped from our equations henceforth. The term *A^c* is the averaged pixel intensity across the pattern set, which can be derived according to:

$$A^c = \frac{1}{N}\sum_{n=0}^{N-1} I_n^c, \qquad (3)$$

such that *A^c* is equal to an intensity or texture photograph of the scene. Correspondingly, the term *B^c* is the intensity modulation of a given pixel and is derived from *I^c_n* as:

$$B^c = \frac{2}{N}\sqrt{\left[\sum_{n=0}^{N-1} I_n^c \sin\!\left(\frac{2\pi n}{N}\right)\right]^2 + \left[\sum_{n=0}^{N-1} I_n^c \cos\!\left(\frac{2\pi n}{N}\right)\right]^2}, \qquad (4)$$

where *B^c* can be thought of as the amplitude of the sinusoid reflecting off of a point on the target surface. If *I^c_n* is constant or only weakly affected by the projected sinusoid patterns, *B^c* will be close to zero. Thus *B^c* is employed as a shadow noise detector/filter [25]: excessively noisy regions, with small *B^c* values, are discarded from further processing. Figure 1 shows an example scene with a background that includes a fluorescent ceiling light, which oversaturates the CMOS camera's pixels and thereby erases any signal from the SLI projector. In Fig. 1, *A^c* looks like a standard video frame absent any indication of the projected pattern sequence *I^p_n*. In contrast, *B^c*, shown in Fig. 1, looks very similar to *A^c* except that it only shows texture in those areas of the scene that significantly reflect the projected sequence *I^p_n*. Given the significance of *B^c* as an indicator of the projected signal strength, the binarized image in Fig. 1 shows only those pixels greater in magnitude than a user-defined threshold. It is these pixels that will ultimately be used to reconstruct our 3-D surface, with the ignored pixels considered too noisy to relay any reliable depth information.

Of the reliable pixels with sufficiently large *B^c*, *ϕ* represents the phase value of the captured sinusoid pattern, derived as:

$$\phi = \tan^{-1}\!\left[\frac{\sum_{n=0}^{N-1} I_n^c \sin(2\pi n / N)}{\sum_{n=0}^{N-1} I_n^c \cos(2\pi n / N)}\right]. \qquad (5)$$

Having derived *ϕ*, we likewise have a unique correspondence value *y^p* for every camera pixel (*x^c*, *y^c*) through the linear equation *ϕ*(*x^c*, *y^c*) = 2*πy^p*. The 3-D world coordinates of the scanned object can, therefore, be derived through triangulation with the projector [7]: from the camera and projector perspective matrices, *M^c* and *M^p*, the world coordinates *X^w*, *Y^w*, and *Z^w* are obtained by a per-pixel matrix inversion and multiplication (Eq. (7)), as defined in [7].
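The per-pixel computation of *A^c*, *B^c*, and *ϕ* described above can be sketched in a few lines. This is a minimal NumPy sketch (the function name is ours), using the quadrant-aware `arctan2` so the wrapped phase covers the full (−π, π] range:

```python
import numpy as np

def pmp_decode(images):
    """Decode N phase-shifted PMP images into A (average intensity),
    B (modulation), and phi (wrapped phase), per Eqs. (3)-(5)."""
    I = np.asarray(images, dtype=np.float64)      # shape (N, H, W)
    N = I.shape[0]
    n = np.arange(N).reshape(-1, 1, 1)
    s = np.sum(I * np.sin(2 * np.pi * n / N), axis=0)
    c = np.sum(I * np.cos(2 * np.pi * n / N), axis=0)
    A = I.mean(axis=0)                            # texture image, Eq. (3)
    B = (2.0 / N) * np.sqrt(s**2 + c**2)          # modulation, Eq. (4)
    phi = np.arctan2(s, c)                        # wrapped phase, Eq. (5)
    return A, B, phi
```

Feeding in synthetic patterns of the form of Eq. (2) recovers the original *A^c*, *B^c*, and *ϕ* exactly (up to floating-point error).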

## 3. Dual-frequency pattern strategy

Our dual-frequency pattern set combines a high-frequency sinusoid with a unit-frequency sinusoid in a single pattern:

$$I_n^p = A^p + B_1^p\cos\!\left(2\pi f_h y^p - \frac{2\pi n}{N}\right) + B_2^p\cos\!\left(2\pi f_u y^p - \frac{4\pi n}{N}\right), \qquad (8)$$

where *I^p_n* is the intensity of a pixel in the projector; *A^p*, *B^p_1*, and *B^p_2* are constants chosen to keep the value of *I^p_n* between 0 and 255 for an 8-bit color depth projector; *f_h* is the high frequency of the sine wave; *f_u* is the unit frequency of the sine wave and equals 1; *n* represents the phase-shift index; and *N*, the total number of phase shifts, is greater than or equal to 5. Figure 2 illustrates one proposed pattern along with its profile for *N* = 5, *n* = 0, *f_h* = 16, *A^p* = 127.5, *B^p_1* = 102, and *B^p_2* = 25.5 at 8 bits per pixel grayscale intensity.

The captured images are correspondingly modeled as

$$I_n^c = A^c + B_1^c\cos\!\left(2\pi f_h y^p - \frac{2\pi n}{N}\right) + B_2^c\cos\!\left(2\pi f_u y^p - \frac{4\pi n}{N}\right), \qquad (9)$$

where *I^c_n* is the intensity of a pixel in the camera. The term *A^c* is still the averaged pixel intensity across the pattern set, derived by Eq. (3), such that *A^c* is equal to an intensity or texture photograph of the scene. Correspondingly, the term *B^c_1* is the intensity modulation of a given pixel corresponding to *ϕ_h* and is derived from *I^c_n* by

$$B_m^c = \frac{2}{N}\sqrt{\left[\sum_{n=0}^{N-1} I_n^c \sin\!\left(\frac{2\pi m n}{N}\right)\right]^2 + \left[\sum_{n=0}^{N-1} I_n^c \cos\!\left(\frac{2\pi m n}{N}\right)\right]^2} \qquad (10)$$

with *m* = 1, such that *B^c_1* can be thought of as the amplitude of the high-frequency sinusoid reflecting off of a point on the target surface. If *I^c_n* is constant or only weakly affected by the projected sinusoid patterns, then *B^c_1* will be close to zero, indicating that *B^c_1* is very sensitive to noise in the camera, the projector, and/or the ambient light. Thus *B^c_1* is employed as an indicator of the signal-to-noise ratio, such that excessively noisy regions, with small *B^c_1* values, are discarded from further processing [25]. Likewise, *B^c_2* is derived from *I^c_n* by Eq. (10) with *m* = 2 and is the intensity modulation corresponding to *ϕ_u*. For the reliable pixels with sufficiently large *B^c_1*, the phase pair (*ϕ_h*, *ϕ_u*) is derived as:

$$\phi_h = \tan^{-1}\!\left[\frac{\sum_n I_n^c \sin(2\pi n/N)}{\sum_n I_n^c \cos(2\pi n/N)}\right], \quad \phi_u = \tan^{-1}\!\left[\frac{\sum_n I_n^c \sin(4\pi n/N)}{\sum_n I_n^c \cos(4\pi n/N)}\right], \qquad (11)$$

where *ϕ_h* represents the wrapped phase value of the captured pattern and *ϕ_u* represents the base phase used to unwrap *ϕ_h*.
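A pattern generator for the dual-frequency scheme described above might look as follows. This is a hedged sketch: the function name and image dimensions are ours, and we assume, consistent with the *m* = 2 modulation used for *B^c_2*, that the unit-frequency component is phase-shifted at twice the rate of the high-frequency component:

```python
import numpy as np

def dual_frequency_patterns(width=640, height=480, N=5, f_h=16.0,
                            A=127.5, B1=102.0, B2=25.5):
    """Generate N dual-frequency patterns: a high-frequency sinusoid
    shifted by 2*pi*n/N plus a unit-frequency sinusoid shifted at twice
    that rate (4*pi*n/N), so the two components separate for N >= 5."""
    y = np.arange(height, dtype=np.float64).reshape(-1, 1) / height
    patterns = []
    for n in range(N):
        profile = (A
                   + B1 * np.cos(2 * np.pi * f_h * y - 2 * np.pi * n / N)
                   + B2 * np.cos(2 * np.pi * y - 4 * np.pi * n / N))
        frame = np.tile(profile, (1, width))       # constant along each row
        patterns.append(np.clip(np.round(frame), 0, 255).astype(np.uint8))
    return patterns
```

Because the two components are shifted at different rates, the *m* = 1 and *m* = 2 sums over the captured set isolate *B^c_1*/*ϕ_h* and *B^c_2*/*ϕ_u* respectively.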

## 4. LUT-based processing

In this section, we develop LUT-based algorithms for the fast generation of the modulation, *B^c*, and phase, *ϕ*, terms of traditional PMP, as well as for 3-D reconstruction, which are suitable for all triangulation-based 3-D measurement, including our proposed dual-frequency pattern set. The proposed LUT-based processing takes advantage of the need of real-time systems to use as few patterns as possible, using the 8 bits per pixel of the captured pattern set as the indices into the LUT. By having LUTs that account for every possible combination of captured pixel values over the pattern set while storing double-precision results, the proposed scheme is completely lossless compared to traditional processing.

### 4.1. Modulation

For traditional PMP, the modulation of Eq. (4) simplifies, for the commonly used pattern counts, to expressions in two integer terms *U* and *V*:

$$B^c = \frac{1}{3}\sqrt{3V^2 + U^2}, \quad V = I_1^c - I_2^c, \quad U = 2I_0^c - I_1^c - I_2^c \quad (N = 3),$$

$$B^c = \frac{1}{2}\sqrt{V^2 + U^2}, \quad V = I_1^c - I_3^c, \quad U = I_0^c - I_2^c \quad (N = 4),$$

$$B^c = \frac{1}{6}\sqrt{3V^2 + U^2}, \quad V = I_1^c + I_2^c - I_4^c - I_5^c, \quad U = 2I_0^c + I_1^c - I_2^c - 2I_3^c - I_4^c + I_5^c \quad (N = 6).$$

Noting that we need only solve these equations for 8-bit-per-pixel images *I^c_n*, the integer index pair (*U*, *V*) can take only a bounded set of values, so we implement a modulation look-up table, MLUT, indexed by (*U*, *V*), that stores the double-precision value of *B^c* for every possible index pair. For the dual-frequency pattern set, the modulations *B^c_1* and *B^c_2* of Eq. (10) are obtained in the same manner from their own (*U*, *V*) pairs, preserving the role of *B^c_1* as a shadow filter and as a representation of the object texture; the MLUT for *B^c_1* has the same form as in the traditional case.
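For *N* = 3, the MLUT idea can be sketched as below; the offset-indexing layout and the function name are implementation choices of ours, not the paper's exact tables:

```python
import numpy as np

# For N = 3 and 8-bit images, V = I1 - I2 spans [-255, 255] and
# U = 2*I0 - I1 - I2 spans [-510, 510]; precompute B^c for every pair.
V_OFF, U_OFF = 255, 510
V_vals = np.arange(-V_OFF, V_OFF + 1, dtype=np.float64)
U_vals = np.arange(-U_OFF, U_OFF + 1, dtype=np.float64)
MLUT = np.sqrt(3.0 * V_vals[None, :] ** 2 + U_vals[:, None] ** 2) / 3.0

def modulation_n3(I0, I1, I2):
    """Modulation B^c for three 8-bit phase-shifted images, by look-up."""
    V = I1.astype(np.int32) - I2        # integer index into the table
    U = 2 * I0.astype(np.int32) - I1 - I2
    return MLUT[U + U_OFF, V + V_OFF]   # double-precision, lossless
```

The table holds 1021×511 doubles (about 4 MB), so the square root is paid once at start-up rather than per pixel per frame.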

### 4.2. Phase

The remaining time-consuming step in generating phase is the arctangent in Eq. (5). Approximating the arctangent, for example with Guo *et al*.'s approximation algorithm [26] or Huang *et al*.'s cross-ratio algorithm [22], speeds up this step but introduces error into the phase. As with *B^c*, we instead simplify Eq. (5) according to the number of patterns *N*, such that

$$\phi = \tan^{-1}\!\left(\frac{\sqrt{3}\,V}{U}\right) \;\; (N = 3, 6), \qquad \phi = \tan^{-1}\!\left(\frac{V}{U}\right) \;\; (N = 4),$$

with *U* and *V* as defined for the modulation of the corresponding *N*. Again based on the fact that the intensity values of the grabbed images are range-limited integers, we can implement these calculations through a phase LUT (PLUT) indexed by (*U*, *V*). The double-precision results are stored in the PLUT. Thus, the time-consuming arctangent operation is pre-performed, and phase values are obtained by accessing the pre-computed PLUT, whose size is, again, determined by the number of bits per pixel of the sensor as well as the number of patterns projected, with no loss in accuracy. Compared to Eq. (5), the PLUT avoids computing the arctangent function at run time, so the computational cost of the phase is greatly reduced without introducing distortion.

For the dual-frequency pattern set, the wrapped high-frequency phase *ϕ_h* and the coarse unit-frequency phase *ϕ_u* of the phase pair in Eq. (11) are likewise rewritten in terms of integer index pairs (*U*, *V*) derived from the captured images and computed through PLUTs. Once *ϕ_h* and *ϕ_u* are obtained, phase unwrapping can also be achieved in real time.

### 4.3. 3-D point cloud

To accelerate the triangulation of Eq. (7), we factor it into a small set of per-pixel constants. Letting *T* = (*C*(*x^c*, *y^c*) *y^p* + 1)^{-1}, where *C* is a per-pixel constant and *M_x*, *M_y*, *M_z*, *N_x*, *N_y*, and *N_z* are defined in the Appendix, the depth *Z^w* is calculated from *y^p* and the per-pixel terms *M_z* and *N_z*, after which *X^w* and *Y^w* are computed from *Z^w* and the per-pixel terms *E_x*, *E_y*, *F_x*, and *F_y*. Obtaining *M_z*, *N_z*, *C*, *E_x*, *E_y*, *F_x*, and *F_y* by means of table look-up for the indices (*x^c*, *y^c*) (the camera column and row indices) reduces the total computational complexity of deriving the 3-D point cloud from the phase term to 7 look-ups, 4 additions, 3 multiplications, and 1 division, which is significantly less than performing the matrix inversion and matrix multiplication required by Eq. (7). It should be noted that the method presented in Eqs. (32) and (33) can be applied to all triangulation-based, 3-D coordinate reconstruction techniques, including stereo vision, 3-D from video, time-of-flight, and so on.
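The per-pixel reduction can be sketched as follows. This is an illustrative assumption on our part: we take *Z^w* to have a rational form in *y^p* with the shared denominator term *T*, and *X^w*, *Y^w* to be linear in *Z^w*; the exact coefficient definitions are those of the paper's appendix and Eq. (7), not the ones implied here:

```python
import numpy as np

def reconstruct_xyz(y_p, Mz, Nz, C, Ex, Ey, Fx, Fy):
    """Per-pixel 3-D reconstruction from the phase-derived projector
    coordinate y_p using seven precomputed per-pixel coefficients.
    NOTE: the rational form of Z and the linear forms of X and Y are
    assumed for illustration; see the paper's appendix for the real
    coefficient definitions."""
    T = 1.0 / (C * y_p + 1.0)   # shared denominator, one division per pixel
    Z = (Mz * y_p + Nz) * T     # depth from the projector correspondence
    X = Ex * Z + Fx             # lateral coordinates follow linearly from Z
    Y = Ey * Z + Fy
    return X, Y, Z
```

Passing full 640×480 coefficient arrays vectorizes the whole point cloud in one call, mirroring the 7-look-up, 1-division budget per pixel.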

## 5. Experiments

Our experiments consisted of: (1) scanning a static angel figurine with the proposed dual-frequency patterns; (2) measuring the processing speed of traditional PMP for *N* = 3, 4, and 6 involving LUT-based processing; (3) scanning a moving hand with dual-frequency patterns for *N* = 6 by means of LUT-based processing; and (4) scanning a moving white table tennis ball and reconstructing it with a known sphere model.

For the first experiment, the dual-frequency patterns were generated with *N* = 5, *f_h* = 6, *f_u* = 1, *A^p* = 155, *B^p_1* = 80, and *B^p_2* = 20. The minimum value in each pattern is 55 in order to prevent underflow of the captured images using our camera. Visualizations of the extracted parameters *I^c_0*, *A^c*, *B^c_1*, and *B^c_2* are shown in Fig. 4 (top center), (top right), (bottom left), and (bottom right). The image in Fig. 4 (bottom center) shows the binarized *B^c_1*, to be used as a shadow noise filter for phase generation, using a threshold of 10.

Figure 5 (top-left) and (top-center) show the wrapped high-frequency phase and the unit-frequency phase masked by the binarized *B^c_1*, while Fig. 5 (bottom-left) and (bottom-center) show the 250th column curve crossing the angel. In comparison, the unit phase term is noisy, while the wrapped phase has high quality but needs to be unwrapped. Because we obtain the wrapped phase and the base phase simultaneously, the unwrapping algorithm for multi-frequency PMP can be applied to generate the final, high-quality, non-ambiguous phase; no neighborhood technique need be employed. The final unwrapped phase image without shadow noise is shown in Fig. 5 (top-right), and the 250th column curve crossing the angel is shown in Fig. 5 (bottom-right). Once the unwrapped phase is obtained, the 3-D world coordinates of the scanned object are computed by triangulation between the camera and the projector. Figure 6 shows the reconstructed 3-D point clouds with texture (top) and depth rendering (bottom) in front (left), side (center), and top (right) views.
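The unwrapping step, picking for each pixel the 2π multiple that best reconciles the wrapped phase with the scaled base phase, can be sketched as follows (a minimal sketch; the function name is ours, and it assumes the base phase is accurate to within π/*f_h*, otherwise the rounding picks the wrong period):

```python
import numpy as np

def unwrap_with_base(phi_h, phi_u, f_h):
    """Unwrap the high-frequency phase phi_h using the unit-frequency
    base phase phi_u: scale the base phase up to the high frequency,
    then add the 2*pi multiple that brings phi_h closest to it.
    Purely per-pixel: no neighborhood propagation is needed."""
    k = np.round((f_h * phi_u - phi_h) / (2.0 * np.pi))
    return phi_h + 2.0 * np.pi * k
```

Because each pixel is unwrapped independently, depth discontinuities and shadows in the scene cannot propagate errors, unlike spatial unwrapping.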

*A^c* acts as the texture for the 3-D point clouds in this experiment.

In the second experiment, we used traditional PMP patterns with *N* = 3, 4, and 6 for the purpose of measuring the speed at which our LUT-based algorithms perform without phase unwrapping. Although the processor has four cores, the reported processing rates for each LUT algorithm are based on using just one core. And although *B^c* could be used as a shadow noise filter, thus reducing the number of pixels that would need to be processed while deriving *ϕ*, we computed a phase value for the full set of 640×480 pixels, such that the observed processing times represent an upper limit. Under these conditions, the average computation times over 1,000 runs for the modulation *B^c* and the phase *ϕ* are reported in Table 1, where *B^c* acts as the texture.

With LUT-based processing, phase generation runs at 164.20 fps for *N* = 4 and 139.28 fps for *N* = 6, versus 15.16 fps for *N* = 3, 15.38 fps for *N* = 4, and 14.55 fps for *N* = 6 with traditional PMP processing, meaning our LUT-based algorithms are roughly 10× faster than the traditional PMP algorithms. Using the general modulation equation, Eq. (4), and the general phase equation, Eq. (5), the processing time grows with larger *N*, because larger *N* means more sin(·), cos(·), and summation computations. The processing times and rates are listed in Table 1.

Using the simplified modulation equations, the processing time for *N* = 4 (2.34 ms) is shorter than for *N* = 3 (6.72 ms) because, before performing the square-root computation, there are only 2 subtractions, 1 addition, and 2 square computations for *N* = 4, whereas there are 3 subtractions, 1 addition, 2 multiplications, and 2 square computations for *N* = 3. Using the simplified phase equations, the processing time is almost the same for the different *N*: although the cost of the basic operations, such as addition and subtraction, still varies with *N*, it is negligible compared to the cost of the processing-hungry arctangent function.

For a given *N*, the processing times for the MLUT and the PLUT are nearly identical, because the computations of the integer indices, *U* and *V*, are the same for both MLUT and PLUT, as is the time for accessing the tables. The MLUT/PLUT processing time does increase with increasing *N*, because the time for accessing the image buffer, *I^c_n*, increases; accessing this buffer is the predominant cost in this case. In practice, when both the MLUT and the PLUT are accessed, the computations of *U* and *V* need only be performed once; accordingly, the row of Table 1 marked "MLUT and PLUT, combined" shows less time than the sum of the separate MLUT and PLUT rows, in which *U* and *V* were computed independently.

For comparison, Zhang *et al*. reported a reconstruction frame rate of 25.56 fps at a resolution of 532×500 when performing the matrix inversion by means of GPU processing on an nVidia Quadro FX 3450 [6], based on Zhang *et al*.'s 2 + 1 pattern strategy [21].

In the third experiment, we scanned a moving hand with the dual-frequency patterns for *N* = 6 by means of LUT-based processing. In the reconstructed point clouds, rendered with *B^c* as texture, points with low modulation strength are not displayed.

In the fourth experiment, we scanned a moving white table tennis ball using *N* = 6, *f_h* = 4, *f_u* = 1, *A^p* = 155, and *B^p_1* = *B^p_2* = 50. Our PLUT and 3-D LUTs were employed for phase generation and 3-D reconstruction. From the reconstructed point clouds of the moving ball, we found the best-fit sphere with variable position and radius, and from this we compared the estimated sphere radius with the true radius. The estimated sphere radius is plotted in Fig. 9 (top). The averaged distance of the table tennis ball point cloud, shown in Fig. 9 (bottom), was estimated from the centroid of all points in the cloud.
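The best-fit sphere with variable position and radius can be found with a standard linear least-squares fit; this is a common technique, not necessarily the paper's exact fitting procedure (the function name is ours):

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit. From |p - c|^2 = r^2 we get
    |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in the center c
    and the combined constant, so one 4-parameter solve suffices."""
    P = np.asarray(points, dtype=np.float64)          # shape (M, 3)
    A = np.hstack([2.0 * P, np.ones((P.shape[0], 1))])
    b = np.sum(P ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

Comparing the fitted radius against the known ball radius over the video sequence then gives a per-frame accuracy measure like the one plotted in Fig. 9.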

## 6. Conclusion and future works


## Appendix: Definitions of *M_x*, *M_y*, *M_z*, *N_x*, *N_y* and *N_z*

## Acknowledgements


**OCIS Codes**

(110.6880) Imaging systems : Three-dimensional image acquisition

(150.5670) Machine vision : Range finding

(150.6910) Machine vision : Three-dimensional sensing

(150.0155) Machine vision : Machine vision optics

(150.1135) Machine vision : Algorithms

**ToC Category:**

Machine Vision

**History**

Original Manuscript: December 14, 2009

Revised Manuscript: January 19, 2010

Manuscript Accepted: February 12, 2010

Published: February 26, 2010

**Citation**

Kai Liu, Yongchang Wang, Daniel L. Lau, Qi Hao, and Laurence G. Hassebrook, "Dual-frequency pattern scheme for high-speed 3-D shape measurement," Opt. Express **18**, 5229-5244 (2010)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-18-5-5229


## References and links

- S. Zhang, D. Royer, and S. T. Yau, "High-resolution, real-time-geometry video acquisition system," ACM SIGGRAPH (2006).
- C. Guan, L. G. Hassebrook, and D. L. Lau, "Composite structured light pattern for three-dimensional video," Opt. Express 11, 406-417 (2003). [CrossRef] [PubMed]
- Y. Wang, Q. Hao, A. Fatehpuria, D. L. Lau, and L. G. Hassebrook, "Data acquisition and quality analysis of 3-dimensional fingerprints," in IEEE conference on Biometrics, Identity and Security, Tampa, Florida, (2009).
- J. Salvi, J. Pages, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recognition 37, 827-849 (2004). [CrossRef]
- S. Zhang, X. Li, and S.-T. Yau, "Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction," Appl. Opt. 46, 50-57 (2007). [CrossRef]
- S. Zhang, D. Royer, and S.-T. Yau, "GPU-assisted high-resolution, real-time 3-d shape measurement," Opt. Express 14, 9120-9129 (2006). [CrossRef] [PubMed]
- J. Li, L. G. Hassebrook, and C. Guan, "Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity," J. Opt. Soc. Am. A 20, 106-115 (2003). [CrossRef]
- S. Zhang and P. S. Huang, "High-resolution, real-time three-dimensional shape measurement," Opt. Eng. 45, 123601 (2006). [CrossRef]
- J. M. Huntley and H. O. Saldner, "Error-reduction methods for shape measurement by temporal phase unwrapping," J. Opt. Soc. Am. A 14, 3188-3196 (1997). [CrossRef]
- S. Li, W. Chen, and X. Su, "Reliability-guided phase unwrapping in wavelet-transform profilometry," Appl. Opt. 47, 3369-3377 (2008). [CrossRef] [PubMed]
- Y. Shi, "Robust phase unwrapping by spinning iteration," Opt. Express 15, 8059-8064 (2007). [CrossRef]
- M. Costantini, "A novel phase unwrapping method based on network programming," IEEE Trans. Geoscience and Remote Sensing 36, 813-821 (1998). [CrossRef]
- D. S. Mehta, S. K. Dubey, M. M. Hossain, and C. Shakher, "Simple multifrequency and phase-shifting fringe-projection system based on two-wavelength lateral shearing interferometry for three-dimensional profilometry," Appl. Opt. 44, 7515-7521 (2005). [CrossRef] [PubMed]
- G. Pedrini, I. Alexeenko, W. Osten, and H. J. Tiziani, "Temporal phase unwrapping of digital hologram sequences," Appl. Opt. 42, 5846-5854 (2003). [CrossRef] [PubMed]
- E.-H. Kim, J. Hahn, H. Kim, and B. Lee, "Profilometry without phase unwrapping using multi-frequency and four-step phase-shift sinusoidal fringe projection," Opt. Express 17, 7818-7830 (2009). [CrossRef] [PubMed]
- J.-L. Li, H.-J. Su, and X.-Y. Su, "Two-frequency grating used in phase-measuring profilometry," Appl. Opt. 36, 277-280 (1997). [CrossRef] [PubMed]
- W.-H. Su and H. Liu, "Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities," Opt. Express 14, 9178-9187 (2006). [CrossRef] [PubMed]
- S. Y. Chen, Y. F. Li, and J. Zhang, "Vision processing for realtime 3-d data acquisition based on coded structured light," IEEE Trans. Image Processing 17, 167-176 (2008). [CrossRef]
- M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-d object shapes," Appl. Opt. 22, 3977-3982 (1983). [CrossRef] [PubMed]
- T. P. Koninckx and L. V. Gool, "Real-time range acquisition by adaptive structured light," IEEE Trans. Pattern Anal. Mach Intell. 28, 432-445 (2006). [CrossRef] [PubMed]
- S. Zhang and S.-T. Yau, "High-speed three-dimensional shape measurement system using a modified two-plusone phase-shifting algorithm," Opt. Eng. 46, 113603 (2007). [CrossRef]
- P. S. Huang and S. Zhang, "Fast three-step phase-shifting algorithm," Appl. Opt. 45, 5086-5091 (2006). [CrossRef] [PubMed]
- M. Halioua and H. Liu, "Optical three-dimensional sensing by phase measuring profilometry," Opt. Lasers in Eng. 11, 185-215 (1989). [CrossRef]
- S. F. Frisken, R. N. Perry, A. P. Rockwood, and T. R. Jones, "Adaptively sampled distance fields: A general representation of shape for computer graphics," in Proceedings of the 27th annual conference on Computer graphics and interactive techniques, 249-254 (2000).
- X. Su, G. von Bally, and D. Vukicevic, "Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation," Opt. Commun. 98, 141-150 (1993). [CrossRef]
- H. Guo and G. Liu, "Approximations for the arctangent function in efficient frige pattern analysis," Opt. Express 15, 3053-3066 (2007). [CrossRef] [PubMed]
- P. L. Wizinowich, "Phase shifting interferometry in the presence of vibration: A new algorithm and system," Appl. Opt. 29, 3271-3279 (1990) [CrossRef] [PubMed]
