## 3D shape measurement technique for multiple rapidly moving objects

Optics Express, Vol. 19, Issue 9, pp. 8539-8545 (2011)

http://dx.doi.org/10.1364/OE.19.008539


### Abstract

Recently proposed binary defocusing techniques have enabled ultrafast 3D shape measurement, but they are generally limited to measuring a single object at a time. Introducing additional gray-coded patterns for point-by-point phase unwrapping could permit simultaneous measurement of multiple objects. However, when the objects are moving rapidly, the displacement between the first captured pattern and the last can be significant, which poses challenges for the precisely designed gray codes. This paper presents a new phase unwrapping strategy that combines conventional spatial phase unwrapping with the gray code to resolve motion-related phase unwrapping problems. A system with a speed of 5,000 Hz was developed to verify the performance of the proposed technique.

© 2011 OSA

## 1. Introduction


In the proposed strategy, the gray code is first used to segment the image into continuous regions within which the phase difference between neighboring pixels does not exceed 2*π*. Then a conventional unwrapping algorithm is applied to unwrap the phase of each region to obtain relative phase. Finally, the gray code is again used to obtain absolute phase by properly shifting the relative phase of each region by adding multiples of 2*π*. By this means, the phase unwrapping problems caused by motion can be resolved. To verify the performance of the proposed method, a system with a speed of 5,000 Hz was developed to capture the collision process of two balls.

## 2. Principle of phase-shifting technique

### 2.1. Three-step phase-shifting algorithm

The three fringe images of a three-step phase-shifting algorithm with equal phase shifts of 2*π*/3 [8] can be written as

$$I_1(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) - 2\pi/3],\tag{1}$$

$$I_2(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y)],\tag{2}$$

$$I_3(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) + 2\pi/3],\tag{3}$$

where *I*′(*x, y*) is the average intensity, *I*″(*x, y*) the intensity modulation, and *ϕ*(*x, y*) the phase to be solved for. The phase can be solved from these equations as

$$\phi(x, y) = \tan^{-1}\!\left[\frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3}\right].\tag{4}$$

This equation provides the wrapped phase ranging from −*π* to +*π* with 2*π* discontinuities. The 2*π* phase jumps can be removed to obtain a continuous phase map by adopting a phase unwrapping algorithm [9]. Specifically, phase unwrapping determines the integer fringe order *k*(*x, y*) such that

$$\Phi(x, y) = \phi(x, y) + 2\pi \times k(x, y).\tag{5}$$

Here, Φ(*x, y*) is the unwrapped absolute phase. However, such a single three-step phase-shifting algorithm cannot be used to measure multiple objects at a time.
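The wrapped-phase computation above can be sketched with NumPy. The helper name `wrapped_phase` is an assumption for illustration; `np.arctan2` returns the wrapped phase directly in (−*π*, *π*].

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three fringe images with phase shifts of -2π/3, 0, +2π/3.

    φ = arctan[√3 (I1 − I3) / (2 I2 − I1 − I3)], returned in (−π, π].
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic check: generate fringe images from a known phase ramp, then recover it.
x = np.linspace(0.0, 4.0 * np.pi, 512)                      # true phase
imgs = [128 + 100 * np.cos(x + d) for d in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3)]
phi = wrapped_phase(*imgs)                                  # wrapped version of x
```

Using `arctan2` on the numerator and denominator separately (rather than dividing first) keeps the correct quadrant and avoids division by zero where the fringe contrast vanishes.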

### 2.2. Fringe order k(x,y) determination with gray code

The fringe order *k*(*x, y*) in Eq. (5) can be uniquely defined by a sequence of binary (1-bit) patterns. Figure 1 illustrates two coding examples. It can be seen that for each fringe period, a unique bit sequence (e.g., 0001, 0011) is used. However, due to projector defocusing and discrete camera sampling, the binary state changes in the coded patterns become blurred, which makes the code boundaries difficult to determine accurately. This problem is worse if the state changes at the same place in different coded patterns, as illustrated in Fig. 1(a): all four patterns change at the transition from code 0111 to 1000.
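Decoding a gray codeword into a fringe order can be sketched as below. The function name is hypothetical, and a real system must first threshold the captured patterns into 0/1 images; the blurring described above is what makes that thresholding step the hard part.

```python
import numpy as np

def decode_fringe_order(bit_patterns):
    """Recover the fringe order k(x,y) from binarized gray-code patterns.

    bit_patterns: list of 0/1 arrays, most significant bit first. The gray
    codeword is converted to a plain binary count via the standard prefix XOR.
    """
    code = np.zeros_like(bit_patterns[0], dtype=np.int32)
    acc = np.zeros_like(code)           # running XOR of the higher-order bits
    for b in bit_patterns:
        acc ^= b.astype(np.int32)       # gray -> binary: b_i = g_i XOR b_{i+1}
        code = (code << 1) | acc
    return code
```

A gray code is used (rather than plain binary) precisely so that adjacent codewords differ in only one bit, which limits how many patterns change state at any one boundary.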


### 2.3. Phase-to-depth conversion

Once the absolute phase map Φ(*x, y*) is obtained, the depth distribution can be retrieved by adopting a phase-to-height conversion algorithm. In this paper, a reference-plane-based method is adopted to convert absolute phase to depth *z* [10]. If Φ(*x, y*) is the object phase map and Φ^rp(*x, y*) is the reference phase map, obtained by measuring a uniform flat planar surface, the relationship between the surface height relative to the reference plane and their phase difference, ΔΦ(*x, y*) = Φ(*x, y*) − Φ^rp(*x, y*), is approximately proportional. Assuming the reference plane has a depth of *z*₀ = 0, the absolute depth value for each camera pixel can be represented as

$$z(x, y) = c_0 \times \Delta\Phi(x, y),$$

where *c*₀ is a constant that can be determined through calibration.
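The reference-plane conversion amounts to a per-pixel scaling. This minimal sketch (helper names are assumptions, not the authors' code) also shows one way *c*₀ could be estimated from a single target of known height:

```python
import numpy as np

def phase_to_depth(phi_abs, phi_ref, c0):
    """z(x, y) = c0 * ΔΦ(x, y), with ΔΦ = Φ − Φ_rp and the reference plane at z = 0."""
    return c0 * (phi_abs - phi_ref)

def calibrate_c0(phi_abs, phi_ref, z_known):
    """Estimate c0 from one calibration target of known height above the reference plane."""
    return z_known / np.mean(phi_abs - phi_ref)
```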

## 3. Novel phase unwrapping framework

*Step 1: Use the codeword to segment the whole image into separate regions*. For each point, a codeword jump can be detected by comparing it with its neighboring points. Within each region, the codeword changes by no more than 1 between neighboring pixels, which means that the phase difference between neighboring pixels should not exceed 2*π*. If the jump at a point is more than 1, meaning that the phase difference at that point might be more than 2*π*, the point is treated as a discontinuous edge. The entire image is then divided into continuous regions separated by these edges.
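Step 1 can be sketched with NumPy and SciPy's connected-component labeling. The function name and the 4-connected neighborhood are assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage

def segment_regions(codeword):
    """Step 1: split the image into continuous regions.

    A pixel whose codeword differs from a horizontal or vertical neighbor by
    more than 1 marks a discontinuous edge; connected non-edge pixels form
    one region (edge pixels receive label 0).
    """
    edge = np.zeros(codeword.shape, dtype=bool)
    edge[:, 1:] |= np.abs(np.diff(codeword, axis=1)) > 1
    edge[1:, :] |= np.abs(np.diff(codeword, axis=0)) > 1
    labels, n = ndimage.label(~edge)
    return labels, n
```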

*Step 2: Adopt a conventional phase unwrapping algorithm to unwrap the phase for each region to obtain the relative phase map*. When the phase map is continuous in one region, a conventional unwrapping algorithm can be applied to obtain a continuous phase map. Because each region is unwrapped individually, relative to one point in that region, the unwrapped phase obtained in this step is called *relative phase*, Φ^r(*x, y*).
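On a single continuous region, the conventional unwrapping of Step 2 is what `np.unwrap` performs along a 1-D slice. A full 2-D unwrapper (e.g., the algorithms of Ghiglia and Pritt [9]) is more involved; this is only a 1-D illustration:

```python
import numpy as np

# Step 2 on a 1-D slice of one continuous region: remove the 2π jumps.
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)
wrapped = np.angle(np.exp(1j * true_phase))   # wrapped to (−π, π]
relative = np.unwrap(wrapped)                 # continuous, relative to the first sample
```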

*Step 3: Use the codeword to adjust the relative phase maps to be absolute ones for all regions*. Step 2 only provides the relative phase of each region. In order to measure multiple objects simultaneously, *absolute phase* is needed. To do so, we first use the conventional approach to extract the absolute phase map Φ^0(*x, y*). For each region, the difference between the relative phase Φ^r(*x, y*) obtained in Step 2 and the absolute phase Φ^0(*x, y*) should be a constant, ΔΦ(*x, y*) = Φ^r(*x, y*) − Φ^0(*x, y*) = Δ*k*(*x, y*) × 2*π*. Here, Δ*k*(*x, y*) is an integer that is constant within each region. The difference values should be nearly the same across each region except for a few noisy points. To obtain Δ*k*(*x, y*), we can first average all the phase difference values in one region after applying a low-pass filter, and then divide the average value by 2*π* and round it to an integer for each region. Once Δ*k*(*x, y*) is obtained, the relative phase can be converted to absolute phase by subtracting Δ*k*(*x, y*) × 2*π*.
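Step 3's shift of one region onto the absolute scale can be sketched as follows. The helper name is hypothetical, and the low-pass filtering mentioned above is omitted, with a plain mean standing in for it:

```python
import numpy as np

def region_to_absolute(phi_rel, phi0, mask):
    """Shift one region's relative phase onto the absolute scale.

    Inside the region, Φ_r − Φ⁰ ≈ Δk · 2π with integer Δk; averaging the
    difference and rounding makes Δk robust to a few noisy pixels.
    """
    dk = np.round(np.mean(phi_rel[mask] - phi0[mask]) / (2 * np.pi))
    return phi_rel - dk * 2 * np.pi
```

Rounding to the nearest integer multiple of 2*π* is what makes this step tolerant of the motion artifacts in Φ^0: a few corrupted pixels barely move the region average.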

*Step 4: Combine the region phase maps into a single one and convert the final unwrapped phase map to 3D shape*. This step generates the final complete absolute phase map Φ(*x, y*) by combining all regions. Once the absolute phase map is obtained, the 3D shape can be recovered if the system is calibrated.

## 4. Experimental results

In the experiment, both balls are suspended by strings and are initially in contact with one another. One ball is then pulled to a certain height and released to swing like a pendulum until it collides with the stationary ball at the bottom of the pendular arc, where the moving ball's velocity is at its maximum. Figure 2(a) shows one fringe pattern and Fig. 2(b) shows one of the coded binary patterns. From the four coded binary patterns, the codeword *k*(*x, y*), which represents the fringe order of each point, can be extracted as shown in Fig. 2(c). From the three phase-shifted fringe patterns, the wrapped phase map can be obtained as shown in Fig. 2(d). The codeword can be used to unwrap the phase point by point using Eq. (5); Fig. 2(e) shows the result. It clearly shows problems in the unwrapped phase (e.g., stripes and spikes). To highlight these problems, Fig. 2(f) shows a view zoomed into a small area. The problematic areas are more than 2 pixels wide, which means that a standard filtering technique cannot eliminate them. Because the object is moving rapidly and the coded patterns and phase-shifted fringe patterns are captured sequentially, with a time lag of 1.4 ms between the first frame and the last, the displacement between the first captured fringe pattern and the last can be significant. Thus the precisely designed codeword and the wrapped phase do not match, and the phase cannot be unwrapped correctly by the coded patterns.

The relative phase map Φ^r(*x, y*) can be converted to an absolute one by referring to the absolute phase map Φ^0(*x, y*) obtained by the conventional approach, shown in Fig. 2(e). Figure 2(j) shows the difference phase map ΔΦ(*x, y*) = Φ^r(*x, y*) − Φ^0(*x, y*). This figure indicates that the difference values are indeed nearly the same across each region except for a few noisy points. Once the constant for each region is determined, the relative phase can be converted to absolute phase, as shown in Fig. 2(k). Because the relative phase map for each region is obtained through a conventional phase unwrapping algorithm, it should be smooth. Figure 2(l) shows the area of the absolute phase corresponding to Fig. 2(f). It can be seen that the noisy areas are gone and the phase map is smooth.

## 5. Conclusion

*π* and 2*π*. Although this is not likely to happen in most measurement scenarios, if it becomes a problem for some particular applications, various image processing means could be adopted to address it.

*μ*s exposure time, we assume the object is motionless during the capture of the three phase-shifted fringe images; i.e., the phase error caused by the 600 *μ*s time lag is not alleviated. Even with this limitation, the proposed system achieves 3D shape measurement with an equivalent shutter speed of 1,667 Hz, or a 600 *μ*s shutter time, which can be used to capture relatively rapidly moving objects.


**OCIS Codes**

(110.6880) Imaging systems : Three-dimensional image acquisition

(120.5050) Instrumentation, measurement, and metrology : Phase measurement

(320.7100) Ultrafast optics : Ultrafast measurements

**ToC Category:**

Instrumentation, Measurement, and Metrology

**History**

Original Manuscript: February 17, 2011

Revised Manuscript: March 30, 2011

Manuscript Accepted: March 30, 2011

Published: April 18, 2011

**Citation**

Yajun Wang, Song Zhang, and James H. Oliver, "3D shape measurement technique for multiple rapidly moving objects," Opt. Express **19**, 8539-8545 (2011)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-19-9-8539


## References and links

1. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48, 133–140 (2010). [CrossRef]
2. S. Zhang, “Recent progresses on real-time 3-D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]
3. S. Lei and S. Zhang, “Flexible 3-D shape measurement using projector defocusing,” Opt. Lett. 34(20), 3080–3082 (2009). [CrossRef] [PubMed]
4. S. Zhang, D. van der Weide, and J. Oliver, “Superfast phase-shifting method for 3-D shape measurement,” Opt. Express 18(9), 9684–9689 (2010). [CrossRef] [PubMed]
5. G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. 38(31), 6565–6573 (1999). [CrossRef]
6. S. Zhang, “Flexible 3-D shape measurement using projector defocusing: extended measurement range,” Opt. Lett. 35(7), 931–933 (2010).
7. J. Pan, P. S. Huang, and F. Chiang, “Color-coded binary fringe projection technique for 3D shape measurement,” Opt. Eng. 44(2), 023606 (2005). [CrossRef]
8. D. Malacara, ed., *Optical Shop Testing*, 3rd ed. (John Wiley and Sons, 2007). [CrossRef]
9. D. C. Ghiglia and M. D. Pritt, *Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software* (John Wiley and Sons, 1998).
10. C. Zhang, P. S. Huang, and F.-P. Chiang, “Microscopic phase-shifting profilometry based on digital micromirror device technology,” Appl. Opt. 41, 5896–5904 (2002). [CrossRef] [PubMed]


### Supplementary Material

» Media 1: MOV (489 KB)

» Media 2: MOV (340 KB)

» Media 3: MOV (341 KB)

» Media 4: MOV (318 KB)

