## 3-D shape measurement by composite pattern projection and hybrid processing

Optics Express, Vol. 15, Issue 19, pp. 12318-12330 (2007)

http://dx.doi.org/10.1364/OE.15.012318


### Abstract

This article presents a projection system with a novel composite pattern for one-shot acquisition of 3-D surface shape. The pattern is composed of color-encoded stripes and cosinoidal intensity fringes arranged in parallel. The stripe edges offer absolute height phases with high accuracy, and the cosinoidal fringes provide abundant relative phases carried in the intensity distribution. A wavelet transform is utilized to obtain the relative phase distribution of the fringe pattern, and the absolute height phases measured by triangulation are used to calibrate the phase data during unwrapping, so as to eliminate the initial and noise errors and to reduce the accumulation and approximation errors. Numerical simulations are performed to validate the new unwrapping algorithm, and experiments are carried out to show the validity of the proposed technique for accurate 3-D shape measurement.

© 2007 Optical Society of America

## 1. Introduction

## 2. Pattern acquisition

### 2.1 Optical system for pattern projection and triangular measurement

The line connecting the projector lens center *O*_{p} and the camera lens center *O*_{c}, with a distance *D* in between, is parallel to the reference plane. The triangle ▵*ABC* is similar to ▵*O*_{c}*O*_{p}*C*; the side from the point *B* to *A* on the reference plane corresponds to the side *d* from *χ*_{r} to *χ*_{o} on the image plane when the object intersects the projected pattern at point *C*. The shift *d* can be measured by comparing the pattern image distorted by the object shape with that projected on the reference plane, so as to obtain the object height *h* by triangulation [13].
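As a sketch of this triangulation step, the similar triangles give *h* = *L*·*AB*/(*AB* + *D*). The distance *L* from the lens baseline to the reference plane and the magnification *m* mapping the image shift *d* to the reference-plane segment *AB* are assumed parameters, not given in this excerpt:

```python
def height_from_shift(d, L, D, m):
    """Object height by triangulation from the measured pattern shift.

    d: shift measured on the image plane (chi_r to chi_o)
    L: distance from the Op--Oc baseline to the reference plane (assumed)
    D: distance between projector and camera lens centers
    m: lateral magnification mapping the image shift d to AB on the
       reference plane (assumed); all names are illustrative.
    """
    AB = d / m                 # shift projected back onto the reference plane
    return L * AB / (AB + D)   # from triangle ABC ~ triangle Oc-Op-C
```

For example, with *L* = 1000 mm, *D* = 200 mm, *m* = 1 and *d* = 50 mm, the height is 1000·50/250 = 200 mm.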

### 2.2 Composite pattern of color stripes and intensity fringes

A De Bruijn sequence of order *m* over an alphabet of *n* symbols is a circular string of length *n*^{m} in which every substring of length *m* appears exactly once [3, 13]. Such a sequence can be generated by searching for a Hamiltonian circuit over a De Bruijn graph whose vertices are the words of length *m*+1. Since we need to find the borderlines of the color stripes, adjacent stripes should be encoded with different colors for recognition. To fulfill this condition, the algorithm is improved by deleting the graph vertices whose words contain identical adjacent symbols, and searching for a Hamiltonian circuit over the new graph to produce a sequence of length *n*(*n*-1)^{m-1}, which visualizes the borderlines without any identical symbols between neighboring stripes. Five colors (yellow, green, cyan, blue and magenta, represented by the symbol numbers 1, 2, 3, 4 and 5, respectively) are used to produce the stripe pattern, encoded by an order-2 De Bruijn sequence of length 20, as shown in Fig. 2(a). Without any repetition, each edge is encoded by the color symbols of its neighboring stripes; that is, if the color numbers are *cl* and *cr* for the left and right stripes, the edge between them is encoded as (*cl*, *cr*).
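For order *m* = 2, the modified condition — every ordered pair of distinct symbols appearing exactly once, with no repeated adjacent symbols — is equivalent to an Eulerian circuit on the complete digraph over the *n* symbols. A minimal sketch using Hierholzer's algorithm (the paper's own Hamiltonian-circuit search is not reproduced here):

```python
def modified_debruijn(n=5):
    # Order-2 stripe code: each ordered pair (a, b) with a != b occurs
    # exactly once cyclically, so adjacent stripes never share a color.
    # Equivalent to an Eulerian circuit on the complete digraph over the
    # n symbols, found with Hierholzer's algorithm.
    out_edges = {v: [u for u in range(1, n + 1) if u != v] for v in range(1, n + 1)}
    stack, circuit = [1], []
    while stack:
        v = stack[-1]
        if out_edges[v]:
            stack.append(out_edges[v].pop())   # follow an unused edge
        else:
            circuit.append(stack.pop())        # backtrack, emit vertex
    circuit.reverse()
    return circuit[:-1]  # cyclic sequence of length n*(n-1)
```

For *n* = 5 this yields a length-20 cyclic sequence, matching *n*(*n*-1)^{m-1} with *m* = 2.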

Here *V*(*x*)∈[0,1] and *f*_{v} is the frequency of the cosinoidal fringes. With the spatial frequency of the value channel identical to that of the hue channel, i.e., *f*_{v}=20, this intensity distribution ensures that the edge positions of the color stripes fall at the extremal positions of the cosinoidal fringes when the two groups of patterns are parallelized, making the stripe edges easily recognizable and the intensity distribution detectable in the processing.
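The exact fringe equation is not reproduced in this excerpt; a plausible profile consistent with the description — a cosinoidal value channel with *f*_{v} = 20 periods across the pattern width, normalized to [0,1] — could be sketched as:

```python
import numpy as np

def value_channel(width, f_v=20):
    # Assumed cosinoidal profile V(x) = 0.5 * (1 + cos(2*pi*f_v*x/width));
    # the extrema then fall on a regular grid that the stripe edges of
    # the hue channel can be aligned with.
    x = np.arange(width)
    return 0.5 * (1.0 + np.cos(2.0 * np.pi * f_v * x / width))
```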

## 3. Image processing

### 3.1 Extraction of color stripes and intensity fringes

Here *R*, *G*, *B*∈[0,1] and *H*, *S*, *V*∈[0,1]; max(*R*, *G*, *B*) and min(*R*, *G*, *B*) represent the maximum and minimum values of the R, G and B channels, respectively, and *H*_{t} is a temporary variable in the algorithm. A captured image of the reference plane is presented in Fig. 3(a). The extracted color stripes in the hue channel and the extracted intensity fringes in the value channel are presented in Figs. 3(b) and 3(c), respectively. The results essentially recover the designed patterns. As shown in Fig. 3(c), however, the extracted fringes exhibit some brightness variation caused by intensity imbalance among the different colors produced by the projector and camera, which makes the fringe intensity deviate to some degree from the exact cosinoidal distribution. Therefore, the wavelet transform, described in Section 3.3, is used to automatically reduce this influence through a bank of filters with different passbands [16].
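A minimal sketch of the channel separation using Python's standard `colorsys` conversion, which is equivalent to the max/min formulas above:

```python
import colorsys

def split_composite(r, g, b):
    # Separate one pixel of the composite pattern (R, G, B in [0, 1]):
    # the hue channel carries the color-stripe code, and the value
    # channel (max(R, G, B)) carries the cosinoidal fringe intensity.
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h, v
```

For a pure red pixel the hue is 0 and the value is 1; a pure green pixel gives hue 1/3 and value 1.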

### 3.2 Edge searching and phase solution in color stripes

Each edge on the reference plane is coded as *EC*^{reference}_{j}=(*cl*^{reference}_{j}, *cr*^{reference}_{j}) (1≤*j*≤19), forming the sequence *EC*^{reference}_{1}=(1,2), *EC*^{reference}_{2}=(2,1), *EC*^{reference}_{3}=(1,3), *EC*^{reference}_{4}=(3,1), *EC*^{reference}_{5}=(1,4), *EC*^{reference}_{6}=(4,1), *EC*^{reference}_{7}=(1,5), *EC*^{reference}_{8}=(5,2), *EC*^{reference}_{9}=(2,3), *EC*^{reference}_{10}=(3,2), *EC*^{reference}_{11}=(2,4), *EC*^{reference}_{12}=(4,2), *EC*^{reference}_{13}=(2,5), *EC*^{reference}_{14}=(5,3), *EC*^{reference}_{15}=(3,4), *EC*^{reference}_{16}=(4,3), *EC*^{reference}_{17}=(3,5), *EC*^{reference}_{18}=(5,4), *EC*^{reference}_{19}=(4,5). When an edge in the image of the object surface has a color edge code *EC*^{object}=(*cl*^{object}, *cr*^{object}) that matches the code *EC*^{reference}_{j} of the *j*-th edge in the image of the reference plane, i.e., *cl*^{object}=*cl*^{reference}_{j} and *cr*^{object}=*cr*^{reference}_{j}, the absolute phase of this edge can be calculated as *ϕ*=2*jπ*. For example, suppose an edge in the captured image of the object surface has a yellow stripe on its left side and a magenta stripe on its right side. Its edge code is *EC*^{object}=(1,5), equal to the 7th edge code in the sequence of the reference plane; therefore, the absolute phases of the pixels on that edge are *ϕ*=14*π*.
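The edge-code matching amounts to a table lookup; the reference sequence below is taken directly from the text:

```python
import math

# Reference edge codes EC_j, j = 1..19, for the order-2 De Bruijn stripes
REF_EDGES = [(1, 2), (2, 1), (1, 3), (3, 1), (1, 4), (4, 1), (1, 5),
             (5, 2), (2, 3), (3, 2), (2, 4), (4, 2), (2, 5), (5, 3),
             (3, 4), (4, 3), (3, 5), (5, 4), (4, 5)]

def absolute_edge_phase(cl, cr):
    # Match the observed edge code (cl, cr) against the reference
    # sequence; the j-th edge carries the absolute phase 2*j*pi.
    j = REF_EDGES.index((cl, cr)) + 1
    return 2 * j * math.pi
```

For the yellow|magenta example, (1, 5) is the 7th code, giving 14π.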

### 3.3 Wavelet transform processing for intensity fringes

Here *I*_{0}(*x*) is the background illumination, *I*_{1}(*x*) is the fringe contrast, and *ϕ*(*x*) is the phase function. The wavelet transform of *I*(*x*) is defined as an integral of the signal with the translated and dilated complex conjugate of a mother wavelet *ψ*(*x*). For the fringe frequencies *f*(*x*) distributed in the fringe pattern, the scale searching range is [*ω*_{0}/(10*πf*_{max}/3), *ω*_{0}/(10*πf*_{min}/3)], where *f*_{max} and *f*_{min} are the maximum and minimum frequencies in *f*(*x*), which can be found in the reference image. At the positions of maximum amplitude determined in this range, the phases of the wavelet coefficients are obtained from the WT phase map, wrapped in [-*π*, *π*] or [-*π*/2, *π*/2]. As a result, an unwrapping procedure is needed to make the interrupted phases continuous, as presented in the following section.
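A minimal sketch of ridge-based phase extraction with a complex Morlet wavelet — a common choice for fringe analysis; the paper's exact mother wavelet and normalization are assumptions here:

```python
import numpy as np

def wt_phase(signal, scales, omega0=2 * np.pi):
    # Complex Morlet CWT of a 1-D fringe signal. At each position the
    # wrapped phase is taken at the scale of maximum |coefficient|
    # (the ridge), as in ridge-based fringe analysis.
    x = np.arange(len(signal))
    coeffs = np.empty((len(scales), len(signal)), dtype=complex)
    for i, s in enumerate(scales):
        t = (x[None, :] - x[:, None]) / s              # (x - b) / s for every b
        psi = np.exp(1j * omega0 * t) * np.exp(-t**2 / 2)  # Morlet mother wavelet
        coeffs[i] = (signal[None, :] * np.conj(psi)).sum(axis=1) / np.sqrt(s)
    ridge = np.abs(coeffs).argmax(axis=0)              # best scale per position
    return np.angle(coeffs[ridge, np.arange(len(signal))])
```

For a pure cosine fringe cos(*ωx*+*ϕ*_{0}), the ridge phase at position *b* is the wrapped value of *ωb*+*ϕ*_{0}.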

### 3.4 Unwrapping based on absolute phase

Consider the unwrapped phase *ϕ*^{unwrap}_{i}, where the index *i*∈[1, *len*] runs from *i*=1 at the position of the left known phase to *i*=*len* at the position of the right known phase. The differences between the unwrapped phase and the absolute phase at those two points are used to calibrate the phase data along the unwrapping path, so that the phase distribution *ϕ*_{m}(*x*) of the surface height can be obtained to realize static and dynamic 3-D shape measurement.

## 4. Results

### 4.1 Simulation for the phase correction in unwrapping

By searching for the maximum values in the amplitude map of the WT coefficients (the black curve in the brightest region of Fig. 5(b)), the fringe phases are solved in the corresponding WT phase map [the black curve in Fig. 5(c)] and then unwrapped to connect the phase interruptions. For comparison, a traditional unwrapping procedure (MATLAB v7.0) is used to directly connect those interrupted phases, as presented in Fig. 5(d), which shows large differences between the designed values and the calculated phase curve, with a standard deviation of 0.32628. In the regions around the two phase peaks with high second-order derivatives, Fig. 5(d) demonstrates the influence of approximation errors on the WT phase; near the two ends of the plot and in the middle area, where the first derivative of the phase changes abruptly, the results present significant errors resulting from the non-smoothness of the phase variation. By using our new method to calibrate the points at phase 2*nπ* with the known phases specified in the intensity distribution, and to correct the phase data at the points between them with the above algorithm, the unwrapped results show very good agreement with the designed phases, as presented in Fig. 5(e), with a standard deviation of 0.07648, showing that the phase errors in the WT processing have been much reduced.

### 4.2 Two examples of measurement experiment

The maximum frequency is *f*_{max}=21.83 and the minimum frequency is *f*_{min}=17.72, so that the scale searching range in the wavelet analysis is [0.0274, 0.1693], which is divided into 200 intervals to search for the maximum amplitudes of the WT coefficients.

Fig. 6(b) presents the phase map with reasonable smoothness, owing to the smooth paper surface with regular boundaries; the phase errors at the starting points of unwrapping are small, and there is little noise along the unwrapping path in each row. The maximum and minimum phases obtained by the traditional process are 9.91 and 0.42, respectively. The results obtained by the proposed method, however, give a larger maximum phase of 10.36 and a smaller minimum phase of 0.08, as indicated in Fig. 6(c), in which the calibration has been performed based on the absolute phases of every stripe edge during phase unwrapping. The phase map is thus improved by eliminating the accumulation and approximation errors, showing the 3-D shape of the curved surface with high accuracy.

## 5. Conclusion

## Acknowledgment


**OCIS Codes**

(110.6880) Imaging systems : Three-dimensional image acquisition

(120.2650) Instrumentation, measurement, and metrology : Fringe analysis

(120.5050) Instrumentation, measurement, and metrology : Phase measurement

(120.6650) Instrumentation, measurement, and metrology : Surface measurements, figure

**ToC Category:**

Instrumentation, Measurement, and Metrology

**History**

Original Manuscript: August 9, 2007

Revised Manuscript: September 10, 2007

Manuscript Accepted: September 11, 2007

Published: September 13, 2007

**Citation**

H. J. Chen, J. Zhang, D. J. Lv, and J. Fang, "3-D shape measurement by composite pattern projection and hybrid processing," Opt. Express **15**, 12318-12330 (2007)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-15-19-12318


### References

- E. Trucco and A. Verri, Introductory Techniques for 3-D Computer Vision, (Prentice Hall, 1998).
- R. Furukawa and H. Kawasaki, "Interactive shape acquisition using marker attached laser projector," in Proceedings of the Fourth International Conference on 3-D Digital Imaging and Modeling (2003), pp. 491- 498.
- J. Salvi, J. Pages, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recogn. 37, 827-849 (2004). [CrossRef]
- D. Caspi, N. Kiryati, and J. Shamir, "Range imaging with adaptive color structured light," IEEE Trans Pattern Anal. Mach. Intell. 20, 470-480 (1998). [CrossRef]
- F. Tsalakanidou, F. Forster, S. Malassiotis and M. G. Strintzis, "Real-time acquisition of depth and color images using structured light and its application to 3D face recognition," Real-Time Imag. 11, 358-369 (2005). [CrossRef]
- Z. J. Geng, "Rainbow 3-dimensional camera: new concept of high-speed 3-dimensional vision systems," Opt. Eng. 35, 376-383 (1996). [CrossRef]
- M. S. Jeong and S. W. Kim, "Color grating projection moiré with time-integral fringe capturing for high-speed 3-D imaging," Opt. Eng. 41, 1912-1917 (2002). [CrossRef]
- O. A. Skydan, M. J. Lalor, and D. R. Burton, "Technique for phase measurement and surface reconstruction by use of colored structured light," Appl. Opt. 41, 6104-6117 (2002). [CrossRef] [PubMed]
- Z. H. Zhang, C. E. Towers, and D. P. Towers, "Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency selection," Opt. Express 14, 6444-6455 (2006). [CrossRef] [PubMed]
- S. Zhang and S. -T. Yau, "High-resolution, real-time 3D absolute coordinate measurement based on a phase-shifting method," Opt. Express 14, 2644-2649 (2006). [CrossRef] [PubMed]
- C. Karaalioglu and Y. Skarlatos, "Fourier transform method for measurement of thin film thickness by speckle interferometry," Opt. Eng. 42, 1694-1698 (2003). [CrossRef]
- H. J. Li, H. J. Chen, J. Zhang, C. Y. Xiong, and J. Fang, "Statistical searching of deformation phases on wavelet transform maps of fringe patterns," Opt. Laser Technol. 39, 275-281 (2006). [CrossRef]
- C. Guan, L. G. Hassebrook, and D. L. Lau, "Composite structured light pattern for three-dimensional video," Opt. Express 11, 406-417 (2003). [CrossRef] [PubMed]
- A. K. C. Wong, P. Niu, and X. He, "Fast acquisition of dense depth data by a new structured light scheme," Comput. Vis. Image Underst. 98, 398-422 (2005). [CrossRef]
- P. Fong and F. Buron, "Sensing deforming and moving objects with commercial off the shelf hardware," in Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (2005), Vol. 3, pp.20-26.
- J. Fang, C. Y. Xiong, and Z. L. Yang, "Digital transform processing of carrier fringe patterns from speckle-shearing interferometry," J. Mod. Opt. 48, 507-520 (2001). [CrossRef]
- H. J. Li and H. J. Chen, "Phase solution of modulated fringe carrier using wavelet transform," Acta Sci. Nat. Uni. Pek. 43, 317-320 (2007).
