Optics Express
Editor: C. Martijn de Sterke
Vol. 15, Iss. 20 — Oct. 1, 2007, pp. 13167–13181
Color-encoded fringe projection for 3D shape measurements

Wei-Hung Su


Optics Express, Vol. 15, Issue 20, pp. 13167-13181 (2007)
http://dx.doi.org/10.1364/OE.15.013167



Abstract

A novel technique using color-encoded stripes embedded into a sinusoidal fringe pattern to find the absolute shape of an object is proposed. Phases of the projected fringes on the surface are evaluated by the Fourier transform method, and unwrapping is then performed with reference to the color-encoded stripes. When the surfaces of interest contain large depth discontinuities, the color-encoded stripes easily identify the fringe order. Compared with other phase unwrapping schemes, this method offers several major advantages: (1) very low computational cost for the 3D reconstruction, (2) reliable phase unwrapping for complex objects, especially surfaces with large depth discontinuities, (3) only a single-shot measurement is required, and (4) robust performance when analyzing dynamic objects.

© 2007 Optical Society of America

1. Introduction

In optical 3D sensing, techniques based on pattern projection offer nondestructive detection, high depth accuracy, fast measurement speed, and full-field inspection. Such a system consists of a pattern projection system and an image acquisition system. The inspected surface is illuminated by a pattern from the projection system, and the pattern distorted by the surface is recorded by the image acquisition system from a different viewpoint. With triangulation methods or suitable calibrations, the distribution of the projected pattern can be analyzed to retrieve the 3D shape.

One of the major goals of 3D profile measurement is application to complicated objects, such as the inspection of a dynamic object with large depth discontinuities. Improving the existing projection techniques is one desirable solution. The projection technique can be either the so-called structured-light encoding method [1–5] or the fringe projection method [6–8]. However, both existing methods have advantages and drawbacks.

In the fringe projection method, the distorted fringes on the tested object are evaluated to retrieve the 3D shape. The phase of the fringes can be extracted by the phase-shifting technique [8–11] or the Fourier transform method [7]. In general, the phase-shifting technique takes three or more frames to evaluate the phases and is therefore more accurate than the Fourier transform method; depth accuracy better than one part in ten thousand of the field of view can be achieved [12–14]. Unfortunately, the longer measurement procedure makes it inefficient for detecting dynamic objects. The Fourier transform method is advantageous for dynamic inspection, since only a single-shot measurement is required, but its accuracy is relatively worse than that of the phase-shifting technique. In addition, both the phase-shifting technique and the Fourier transform method involve an arctangent operation, so the extracted phases have principal values ranging from –π to π, with 2π phase jumps at the discontinuities. A so-called phase unwrapping process to recover the absolute phase is therefore inevitable. When objects of interest have large discontinuous height steps, those steps hinder the unique assignment of fringe order, making the phase unwrapping ambiguous.

Intensive studies have been devoted to developing reliable phase unwrapping algorithms, but each has advantages and disadvantages. Temporal methods [15–19] project a series of fringe patterns with slightly different frequencies onto the discontinuous surface in a time sequence; unwrapping is carried out by comparing the resulting phase maps. Spatial methods [20–22] use only one projected pattern in which two or more sets of fringes with different frequencies are embedded together; the tolerance for depth discontinuities is enlarged by comparing the different fringe sets. In general, temporal algorithms provide more accurate information about discontinuities, since several frames are measured, but they are time consuming and not suitable for detecting dynamic objects. Spatial algorithms appear more practical for real-time performance; unfortunately, the sampling density and contrast of the fringes on the recorded image are relatively low.

Compared with the fringe projection method, phase unwrapping is not a critical problem for the structured-light encoding method: each stripe on the image is identified by a unique code, so no ambiguity occurs. The encoding algorithms can be further divided into two categories, temporal encoding [1–3] and spatial encoding [4–5]. Temporal methods project a sequence of patterns with different numbers of stripes onto the inspected surface. The intensity of a stripe on the image is coded in binary or gray levels, each intensity level is assigned a character, and a code word (a stream of coded characters) is created by the sequential projections. Stripes are identified by their code words, provided the code words are unique. Systematic accuracy can be as high as one pixel, depending on the number of projections; again, however, the measuring procedure is time consuming and impractical for on-line inspection. In spatial encoding algorithms, stripes are generally encoded by colors, and a color CCD camera is used to record the color-encoded stripes. The color illumination provides an additional degree of freedom for identifying the stripes: the stripe colors can be arranged in a specific order [5], and each stripe is then distinguishable by this order.

In this paper, a method to reconstruct the 3D shape of a complicated object using color-encoded fringe projection is presented. The projected pattern consists of a set of color-encoded stripes and a set of sinusoidal fringes, so the profile can be analyzed either from the color-encoded stripes or from the sinusoidal fringes. If the color-encoded stripes are used to retrieve the 3D shape, the sinusoidal fringes can be treated as the intensity variation within each color stripe (a gray-intensity algorithm [3]) to refine the measured shape; the accuracy is therefore better than that of a conventional color-encoded projection (a binary-intensity algorithm [5]). If the sinusoidal fringes are used to reconstruct the 3D shape, the color stripes can be used to identify the order of the sinusoidal fringes, so that the phases can be unwrapped without ambiguity. For demonstration, this paper uses the sinusoidal fringes to reconstruct the 3D shape and the color-encoded stripes to unwrap the phase.

Among the projection schemes discussed above, the proposed method offers several major advantages: (1) very low computational cost for the 3D reconstruction, (2) reliable phase unwrapping for complex objects, especially surfaces with large depth discontinuities, (3) only a single-shot measurement is required, and (4) robust performance when analyzing dynamic objects.

2. Measurement principle

Figure 1 shows the coordinate systems of the projected fringe profilometry. For simplicity, an optical system with telecentric projection is assumed. The inspected object is illuminated with a color-encoded fringe pattern and observed by a color CCD camera. The y–z plane lies in the plane of the figure, and the x-axis is normal to it. A fringe from point P is projected to point A on the reference plane and to point Z on the surface of the measured object. This fringe is recorded by the image sensor array, so on the image plane it is shifted from point A′ to point C′.

It is obvious that the object depth value Z(x, y) can be determined as

$$Z(x,y)=\frac{\overline{AC}}{\tan\theta+\tan\theta_n}.\tag{1}$$

If θn is close to zero, Eq. (1) can be further simplified as

$$Z(x,y)=\overline{AC}\,\cot\theta=\frac{\varphi_A-\varphi_C}{2\pi}\,d_o\cot\theta,\tag{2}$$

where φA and φC are the phase values at points A and C, respectively. Note that the phases of points A and Z are the same. Thus, using the image obtained at point C′, the value Z(x, y) can be determined from the phases on the object surface and on the reference plane, and Eq. (2) can be expressed as

$$Z(x,y)=\overline{AC}\,\cot\theta=\frac{\varphi_Z-\varphi_C}{2\pi}\,d_o\cot\theta.\tag{3}$$
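Once the phases are unwrapped, Eq. (3) reduces depth recovery to per-pixel arithmetic. A minimal sketch, where the 2.0 mm fringe period matches the later experiment but the 45° projection angle is an illustrative assumption:

```python
import math

def depth_from_phase(phi_obj, phi_ref, d_o, theta):
    """Depth from Eq. (3): Z(x, y) = (phi_Z - phi_C) / (2*pi) * d_o * cot(theta).

    phi_obj: unwrapped phase at the object point (phi_Z)
    phi_ref: unwrapped phase at the corresponding reference-plane point (phi_C)
    d_o:     fringe period on the reference plane
    theta:   projection angle in radians (assumed value below)
    """
    return (phi_obj - phi_ref) / (2.0 * math.pi) * d_o / math.tan(theta)

# A phase difference of exactly 2*pi corresponds to a fringe shift of one
# full period d_o, i.e. a depth of d_o * cot(theta).
z = depth_from_phase(2.0 * math.pi, 0.0, 2.0, math.radians(45.0))
```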
Fig. 1. Optical geometry of the projected fringe profilometry.
Fig. 2. Appearance of a color-encoded fringe pattern.

3. Color-encoded fringe pattern

The color-encoded fringe pattern consists of a set of color-encoded stripes and a set of sinusoidal fringes. The sinusoidal fringes on the surface are used to retrieve the 3D shape, while the color-encoded stripes are used to identify the fringe order. The period of the sinusoidal fringes and the width of the color stripes need not be the same; the choice depends on the complexity of the tested object. Shown in Fig. 2 is an example of the encoded fringe pattern, in which the period of the fringes is twice the stripe width. The transmittance of the sinusoidal fringes is represented as

$$t(x)=0.6+0.4\cos\!\left(\frac{2\pi x}{d}\right),\tag{4}$$

where d is the period of the fringes. The minimum transmittance is set to 0.2 so that all of the color stripes are observable and discriminable by the color CCD camera. The color-encoding scheme used to identify the fringe order is described in Section 3.1. Phases of the projected pattern are evaluated by the Fourier transform method, as illustrated in Section 3.2.
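Equation (4) is straightforward to tabulate. The sketch below samples the transmittance over a few periods and confirms the stated minimum of 0.2; the 64-pixel period is an illustrative choice, not a value from the paper:

```python
import numpy as np

d = 64                        # fringe period in pixels (assumed)
x = np.arange(4 * d)          # four full fringe periods
t = 0.6 + 0.4 * np.cos(2.0 * np.pi * x / d)   # Eq. (4)

# The darkest point of every fringe still transmits 20% of the light,
# so the color stripes remain observable across the whole pattern.
t_min, t_max = t.min(), t.max()
```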

3.1 Color-encoded algorithm

The color-encoded stripes play an important role in phase unwrapping. In our setup, seven colors are used to code the stripes: red, green, blue, yellow, magenta, cyan, and white. They are arranged so that three adjacent stripes form a group, and adjacent groups overlap by two stripes, as shown in Fig. 3. The sequence of colors in one group does not appear again in any other group, so any stripe can be identified without ambiguity. Figure 4(a) shows stripes arranged in this way. For example, the color sequence [cyan, white, red] in one group represents the 6th, 7th, and 8th stripes, respectively, while the sequence [white, red, green] represents the 7th, 8th, and 9th stripes. The color order in each group is unique, and every stripe is therefore distinguishable.
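One way to generate such a stripe ordering is a depth-first search over the seven colors, rejecting any extension that repeats a three-stripe window or places two identical stripes side by side. This is a hypothetical generator sketched for illustration; the paper's actual sequence may differ:

```python
COLORS = ["red", "green", "blue", "yellow", "magenta", "cyan", "white"]

def build_stripe_sequence(n_stripes):
    """Depth-first search for a stripe ordering in which adjacent stripes
    differ in color and every window of three adjacent stripes is unique."""
    def extend(seq, seen):
        if len(seq) == n_stripes:
            return seq
        for c in COLORS:
            if c == seq[-1]:
                continue                      # adjacent stripes must differ
            w = (seq[-2], seq[-1], c)
            if w in seen:
                continue                      # three-color windows must be unique
            out = extend(seq + [c], seen | {w})
            if out is not None:
                return out
        return None                           # dead end: backtrack
    return extend([COLORS[0], COLORS[1]], set())

seq = build_stripe_sequence(40)
windows = [tuple(seq[i:i + 3]) for i in range(len(seq) - 2)]
```

With seven colors there are 7 × 6 × 6 = 252 admissible windows, so sequences far longer than the example pattern can still be encoded without repeating a group.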

Fig. 3. Color arrangement of the encoded scheme.

For robust operation, each color is further represented by a digital number, and a code table containing a stream of digital numbers is employed to represent the arranged colors. In our setup, the digital number "2" represents green and "5" represents magenta; the assigned digital numbers are listed in Table 1. A code table using a stream of digital numbers to denote the color stripes of Fig. 4(a) is thus created; the corresponding code table is shown in Fig. 4(b). This code table provides the information needed to identify the stripe order. When surfaces with large depth discontinuities are observed by the color CCD camera, the code table can easily determine how far the fringes have shifted across the discontinuous area. Since the color arrangement in each group is unique, the overall stream of digital numbers is also unique, and searching for the stripe orders reduces to locating the observed digits in the code table. For example, a group of distorted fringes with the digit stream [6 7 1] maps to the 6th, 7th, and 8th positions in the code table, so the fringe orders are 6, 7, and 8.
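The lookup itself amounts to finding a three-digit group as a contiguous run in the code table. In the sketch below, only green = 2 and magenta = 5 come from the text; the remaining digit assignments and the table fragment are invented for illustration, chosen to be consistent with the example that [6 7 1] maps to stripes 6, 7, and 8:

```python
# Digit assignments in the spirit of Table 1; only green = 2 and
# magenta = 5 are stated in the paper, the rest are assumptions.
DIGIT = {"red": 1, "green": 2, "blue": 3, "yellow": 4,
         "magenta": 5, "cyan": 6, "white": 7}

# A made-up fragment of a code table for stripes 1..9, arranged so the
# group [6, 7, 1] sits at positions 6-8.
CODE_TABLE = [3, 4, 5, 2, 3, 6, 7, 1, 2]

def fringe_orders(group):
    """Locate an observed digit group in the code table and return the
    1-based stripe orders it covers."""
    n = len(group)
    for i in range(len(CODE_TABLE) - n + 1):
        if CODE_TABLE[i:i + n] == list(group):
            return list(range(i + 1, i + 1 + n))
    raise ValueError("group not found in code table")

group = [DIGIT[c] for c in ("cyan", "white", "red")]
orders = fringe_orders(group)
```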

Table 1. Assigned digital numbers with the corresponding colors


Now the color stripes are embedded into the sinusoidal fringes. Shown in Fig. 4(c) is the designed pattern, in which the width of the stripes and the period of the fringes are the same. When the profile measurement is performed, the color-encoded fringes projected onto the inspected object are recorded by the color CCD camera; the appearance of the projected pattern on a plate, as recorded by the camera, is shown in Fig. 4(d). In a 24-bit red-green-blue image model, this recorded image is formed by combining three color channels: red, green, and blue. For each channel, two intensity levels, "bright" and "dark", provide sufficient discrimination. Each channel image can therefore be transformed to a binary intensity distribution by a suitable threshold algorithm, according to whether the gray level is below the threshold or not. With a suitable threshold to filter the noise, three masks corresponding to the red, green, and blue channels are created. These binary masks provide the information needed to distinguish the fringe colors and to map the distribution of the color-encoded fringes.

Fig. 4. Color-encoded algorithm. (a) Appearance of the color-encoded stripes. (b) A code table created from (a). (c) A color-encoded fringe pattern. (d) Appearance of the projected pattern on a plate recorded by the color CCD camera. (e) Binary mask generated from the red channel. (f) Binary mask created from the green channel. (g) Binary mask created from the blue channel.

Table 2. Assigned binary values in red-green-blue model with the corresponding digital numbers


However, it is a challenge to assign a single suitable threshold value to the whole image. If the threshold is set too low, noise cannot be filtered out; if it is set too high, the intensity of the dark fringes, where the transmittance of the pattern is close to its minimum, might be set to zero, introducing errors in distinguishing the fringe colors. To retrieve the lost data, one can locate the boundary of each fringe by searching for the intervals between two local minima. The mask pattern is then created with reference to the fringe boundaries. Shown in Figs. 4(e), 4(f), and 4(g) are the three retrieved binary masks.

With reference to these binary masks, a stream of digital numbers can be generated for each row of the image. The digital numbers with their corresponding binary values are listed in Table 2. This stream is then matched against the code table, and the fringe order at each row is thereby determined.
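Decoding a row from the three binary masks can be sketched as follows. The (R, G, B) bright/dark triples follow ordinary additive color mixing and are an assumed reading of Table 2, and the single global threshold is the simple case discussed above:

```python
import numpy as np

# Bright/dark triple per channel -> digit, by additive RGB mixing.
# Only green = 2 and magenta = 5 are stated in the text; the other
# digit values are assumptions.
TRIPLE_TO_DIGIT = {
    (1, 0, 0): 1,  # red
    (0, 1, 0): 2,  # green
    (0, 0, 1): 3,  # blue
    (1, 1, 0): 4,  # yellow = red + green
    (1, 0, 1): 5,  # magenta = red + blue
    (0, 1, 1): 6,  # cyan = green + blue
    (1, 1, 1): 7,  # white = all three channels
}

def digits_from_rgb(row_rgb, threshold=128):
    """Binarize each channel of one image row with a single global
    threshold and decode one digit per pixel."""
    masks = (np.asarray(row_rgb) >= threshold).astype(int)
    return [TRIPLE_TO_DIGIT[tuple(m)] for m in masks]

row = [[200, 30, 20],     # bright only in R -> red
       [210, 190, 40],    # bright in R and G -> yellow
       [220, 210, 230]]   # bright in all three -> white
digits = digits_from_rgb(row)
```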

3.2 Phase extraction

To evaluate the phase distribution, the color-encoded fringes should be converted to gray intensity levels. Figure 5 illustrates the basic stages of this transformation, and Fig. 6 shows the transformation of Fig. 4(d). The binary masks described in Section 3.1 are employed to weight the intensities of the colors.

Fig. 5. Flow diagram of stages for color-encoded fringes transformed in gray intensity levels.
Fig. 6. Projected color-encoded fringes displayed in gray intensity levels.

The image of the fringes projected onto the tested surface, as obtained by the sensor array, is described as

$$I(x,y)=a(x,y)+b(x,y)\cos\!\left[\frac{2\pi x}{d}+\Delta\varphi(x,y)\right]
=a(x,y)+\tfrac{1}{2}b(x,y)\,e^{\,j\left[\frac{2\pi x}{d}+\Delta\varphi(x,y)\right]}+\tfrac{1}{2}b(x,y)\,e^{-j\left[\frac{2\pi x}{d}+\Delta\varphi(x,y)\right]},\tag{5}$$

where a(x, y) is the background intensity, b(x, y) is the modulation amplitude, d is the period of the projected fringes as observed by the CCD camera, and Δφ(x, y) is the distorted phase of the fringes. Equation (5) can be represented as

$$I(x,y)=a(x,y)+\tfrac{1}{2}\tilde{b}(x,y)\,e^{\,j2\pi x/d}+\tfrac{1}{2}\tilde{b}^{*}(x,y)\,e^{-j2\pi x/d},\tag{6}$$

where $\tilde{b}(x,y)=b(x,y)\,e^{\,j\Delta\varphi(x,y)}$ and "*" denotes complex conjugation.

The one-dimensional Fourier transform of Eq. (6) with respect to x is

$$\mathfrak{F}\{I(x,y)\}=A(f_x,y)+\tfrac{1}{2}\tilde{B}\!\left(f_x-\tfrac{1}{d},\,y\right)+\tfrac{1}{2}\tilde{B}^{*}\!\left(f_x+\tfrac{1}{d},\,y\right),\tag{7}$$

where $\mathfrak{F}\{\tilde{b}(x,y)\}=\tilde{B}(f_x,y)$. The fundamental component, $\tfrac{1}{2}\tilde{B}(f_x-\tfrac{1}{d},\,y)$, can be extracted with a suitable band-pass filter.

The inverse Fourier transform of the fundamental component is expressed as,

$$s(x,y)=\mathfrak{F}^{-1}\!\left\{\tfrac{1}{2}\tilde{B}\!\left(f_x-\tfrac{1}{d},\,y\right)\right\}=\tfrac{1}{2}\tilde{b}(x,y)\,e^{\,j2\pi x/d}=\tfrac{1}{2}b(x,y)\,e^{\,j\left[\frac{2\pi x}{d}+\Delta\varphi(x,y)\right]}.\tag{8}$$

Phase distribution from Eq. (8) is then given by

$$\Phi(x,y)=\tan^{-1}\!\left[\frac{\operatorname{Im}\{s(x,y)\}}{\operatorname{Re}\{s(x,y)\}}\right],\tag{9}$$

where Im{·} and Re{·} denote the imaginary and real parts of the complex signal, respectively.

The phase evaluation gives principal values Φ ranging from –π to π, with discontinuous 2π phase jumps. These wrapped phases must be unwrapped to obtain the absolute phase 2πx/d + Δφ, as described in Section 3.3.
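For one image row, the chain of Eqs. (5)–(9) can be sketched as below. The boxcar band-pass filter around the fundamental and the synthetic test signal are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def extract_phase_row(intensity, period):
    """Wrapped-phase extraction for one row via Eqs. (5)-(9): FFT the row,
    band-pass the fundamental near spatial frequency 1/period (Eq. 7),
    inverse-transform (Eq. 8), and take the arctangent (Eq. 9)."""
    n = len(intensity)
    spectrum = np.fft.fft(intensity)
    freqs = np.fft.fftfreq(n)               # cycles per pixel
    f0 = 1.0 / period
    band = np.abs(freqs - f0) < 0.5 * f0    # keep only the +f0 lobe (assumed width)
    s = np.fft.ifft(spectrum * band)        # complex signal s(x, y)
    return np.angle(s)                      # principal values in (-pi, pi]

# Synthetic row matching Eq. (5): a = 0.6, b = 0.4, a constant phase
# distortion of 0.5 rad, and a period of d = 16 pixels (illustrative).
d = 16
x = np.arange(256)
row = 0.6 + 0.4 * np.cos(2.0 * np.pi * x / d + 0.5)
phi = extract_phase_row(row, d)
```

Because the synthetic row contains an exact whole number of periods, the recovered wrapped phase at x = 0 matches the injected 0.5 rad distortion.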

3.3 Phase unwrapping

An example is given to illustrate the proposed method. A box attached to a plate was selected as the test object, and a color-encoded pattern was projected onto it. The recorded image is shown in Fig. 7(a). This image was separated into its red, green, and blue color channels, as depicted in Figs. 7(b), 7(c), and 7(d), and each channel was then transformed to a binary intensity distribution. The corresponding mask patterns are shown in Figs. 7(e), 7(f), and 7(g). As demonstrated in Table 2, these mask patterns generate a stream of digital numbers for each row of the image to represent the color stripes. The fringe order can then be determined with reference to the code table [Fig. 4(b)]. The distribution of the fringe order identified by the code table is illustrated in Fig. 7(h), in which a color bar represents the order number.

To evaluate the phase distribution, the color-encoded fringes should be displayed in gray levels, as depicted in Fig. 8(a); a color bar indicates the gray level. The phase distribution computed by the Fourier transform method is shown in Fig. 8(b). Since the fringe order has been identified [as shown in Fig. 7(h)], phase unwrapping can be carried out directly using the expression

$$\frac{2\pi x}{d}+\Delta\varphi(x,y)=\Phi(x,y)+2\pi(n-1),\tag{10}$$

where n is the fringe order number. Figure 8(c) shows the unwrapped phase map obtained from Eq. (10).

Phase noise, surface anomalies, and insufficient sampling frequency can cause spurious jumps. When a spurious jump occurs, conventional unwrapping algorithms may be fooled into adding an extra multiple of 2π to the phase value, and this error can propagate over a large area and spoil many pixels. In the proposed algorithm, unwrapping is performed pixel by pixel with reference to the code table, so unwrapping errors are confined to a small area and do not spoil the other pixels.
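Because the order n is known independently at every pixel, Eq. (10) applies element-wise with no neighbor dependence, so an erroneous order corrupts only its own pixel. A minimal sketch with illustrative values:

```python
import numpy as np

def unwrap_with_orders(wrapped, orders):
    """Eq. (10): absolute phase = wrapped phase + 2*pi*(n - 1), applied
    element-wise using each pixel's own fringe order n from the code table."""
    return np.asarray(wrapped) + 2.0 * np.pi * (np.asarray(orders) - 1)

# Illustrative wrapped phases and code-table orders for three pixels.
wrapped = [0.3, -3.0, 1.2]
orders = [1, 2, 2]
absolute = unwrap_with_orders(wrapped, orders)
```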

Fig. 7. Fringe order identification: (a) shows an observed image in which the tested object is projected with a color-encoded fringe pattern; (b), (c), and (d) show the same image in red, green, and blue channel, respectively; (e), (f), and (g) show the binary mask created by the red channel, green channel, and blue channel, respectively; (h) shows the evaluated distribution of the fringe order, with a color bar to represent the order.
Fig. 8. Phase-extraction and unwrapping: (a) Appearance of the recorded image displayed in gray intensity levels. (b) Computed phase distribution using Fourier transform method. (c) Unwrapped phase map using Eq. (10).

4. Experiments

4.1 Shape retrieval from measured phases

Two boxes, one partially obstructed by the other, were selected as the measured sample. A color-encoded fringe pattern was plotted by computer and then reproduced on a color film. The minimum transmittance of this pattern was 0.2, so all the color stripes were observable. A white lamp was used as the light source; the color film was illuminated by this source and the pattern was projected onto the measured sample. A color CCD camera with 1024×768 pixels at 24-bit pixel depth was used to record the projected fringes. The period of the fringes on the reference plane was 2.0 mm. Figure 9 shows the recorded image. A description of the optical system and its calibration can be found in Ref. [22].

Fig. 9. Appearance of a projected color-encoded fringe pattern. A 24-bit color CCD camera with 1024×768 pixels was used to record the fringes.
Fig. 10. Appearance of the three channel images: (a), (b), and (c) show the observed image in the red, green, and blue channels, respectively; (d), (e), and (f) show the binary masks created from the red, green, and blue channels, respectively.

To identify the fringe order at each pixel, this image was separated into three color channels. Shown in Figs. 10(a), 10(b), and 10(c) are the channel images. Each was mapped to a binary intensity distribution with a suitable threshold, and three binary masks corresponding to the red, green, and blue images were thereby created, as shown in Figs. 10(d), 10(e), and 10(f), respectively.

Fig. 11. Distribution of the fringe order. A color bar is utilized to address the order number.

For each row of the inspected image (Fig. 9), a stream of digital numbers was generated with reference to the binary masks. This stream was evaluated against the code table to find the fringe orders. Figure 11 illustrates the result.

Before phase extraction, the color image should be converted to gray intensity levels; the transform procedure is depicted in Fig. 5, and the resulting image is shown in Fig. 12(a). Phases of the fringes were evaluated by the Fourier transform method. Figure 12(b) shows the computed phases, which lie within the interval from –π to π. Since the fringe order at every pixel had been identified, phase unwrapping could be performed directly using Eq. (10). The distribution of absolute phases is shown in Fig. 12(c).

To reconstruct the 3D shape, the absolute phases of the reference plane must also be determined. They were computed in the same way, as shown in Fig. 13. The depth profile was then determined by Eq. (3), and the reconstructed 3D shape is shown in Fig. 14. Since the color-encoded fringes were reproduced on a color film, the geometric shape and contrast of the fringes could not be accurately controlled; the systematic errors were therefore mainly due to the pattern's quality. In our setup, the depth accuracy was approximately 0.5 mm.

4.2 Unwrapping Accuracy

If the reflectivity of the object or the illumination of the source is not uniform, a single threshold value may not be applicable to the whole image: brighter regions should be assigned a higher threshold and darker regions a lower one. The image can therefore be segmented into several regions according to their mean reflectivity or illuminance, with a different threshold value assigned to each segmented region.
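A sketch of such region-wise thresholding, assuming a simple mean-based rule per block; the paper does not specify the segmentation or the threshold rule, so both are illustrative:

```python
import numpy as np

def regional_threshold(channel, n_blocks=4, scale=1.0):
    """Binarize one color channel with per-region thresholds.

    The row is split into n_blocks segments, and each segment is
    thresholded at `scale` times its own mean intensity, so brighter
    regions automatically receive higher thresholds. Both parameters
    are assumed tuning knobs, not values from the paper."""
    channel = np.asarray(channel, dtype=float)
    mask = np.zeros(len(channel), dtype=int)
    for block in np.array_split(np.arange(len(channel)), n_blocks):
        mask[block] = channel[block] >= scale * channel[block].mean()
    return mask

# A row whose right half is twice as bright as its left half: one global
# threshold would misclassify one half, per-region thresholds do not.
row = [10, 90, 10, 90, 20, 180, 20, 180]
mask = regional_threshold(row, n_blocks=2)
```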

Poor selectivity of the color channels also makes it difficult to assign a suitable threshold. In our setup, the color-encoded fringe pattern was generated by computer and reproduced on a color film. Unfortunately, the spectral passband of the color film was not narrow enough, so the transmitted light could contain unwanted components; for example, the red fringes observed by the CCD camera contained green and blue components. In general, the unwanted components could be removed by assigning a suitable threshold value, but signals in the darker regions might then be filtered out as well. This would not be a major problem if a better color projector were used, such as a digital light processing (DLP) video projector.

Fig. 12. Phase extraction and unwrapping. (a) Appearance of an image in gray intensity levels. (b) Computed phase distribution using the Fourier transform method. (c) Unwrapped phase map.
Fig. 13. (a). Appearance of a projected color-encoded fringe pattern on the reference plane. (b) Unwrapped phase map.
Fig. 14. Reconstructed 3D profile.

In addition, when the entire object is predominantly red, the signals observed in the green and blue channels are weak, and colors such as yellow, magenta, and white become indiscernible. This is a common problem for most existing schemes using color pattern projection. The unwrapping algorithm described in Section 3.3 might then not be applicable directly. An alternative is to compute the absolute phases with a conventional unwrapping algorithm and then use the information available in one color channel to identify the unwrapping errors. In that situation the projection approach is similar to a gray-encoded algorithm. Since the fringes are spatially encoded, it remains feasible to identify the displacements of shifted fringes for surfaces with large depth discontinuities.

5. Conclusion

We have presented a projected fringe profilometry that uses a color-encoded fringe pattern to find the absolute shape of an object. When the surfaces of interest contain large depth discontinuities, the color-encoded stripes easily identify the fringe order. The fringe order is evaluated by a code table, so the computational cost is relatively low. Unlike conventional algorithms, unwrapping errors are confined to a small area and do not spoil the other fringes. The presented method performs robustly on dynamic objects, requiring only one measurement frame to retrieve the 3D shape. It is therefore well suited to inspecting dynamic objects with large depth discontinuities.

Acknowledgments

This work was performed under the support of the Aim for the Top University Plan. The author is also grateful for support from the Advanced Crystalline Opto-Electronics Science and Technology Research Center.

References and links

1. M. D. Altschuler, B. R. Altschuler, and J. Taboada, "Laser electro-optic system for rapid three-dimensional topographic mapping of surfaces," Opt. Eng. 20, 953–961 (1981).
2. M. Minou, T. Kanade, and T. Sakai, "A method of time-coded parallel planes of light for depth measurement," Trans. IECE Japan 64, 521–528 (1981).
3. G. Sansoni, S. Corini, S. Lazzari, and F. Docchio, "Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications," Appl. Opt. 36, 4463–4472 (1997). [CrossRef] [PubMed]
4. K. Sato and S. Inokuchi, "Three-dimensional surface measurement by space encoding range image," J. Rob. Syst. 2, 27–39 (1985).
5. W. Liu, Z. Wang, G. Mu, and Z. Fang, "Color-coded projection grating method for shape measurement with a single exposure," Appl. Opt. 39, 3504–3508 (2000). [CrossRef]
6. G. Indebetouw, "Profile measurement using projection of running fringes," Appl. Opt. 17, 2930–2933 (1978). [CrossRef] [PubMed]
7. M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. 22, 3977–3982 (1983). [CrossRef] [PubMed]
8. V. Srinivasan, H. C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects," Appl. Opt. 23, 3105–3108 (1984). [CrossRef] [PubMed]
9. K. G. Larkin and B. F. Oreb, "Design and assessment of symmetrical phase-shifting algorithms," J. Opt. Soc. Am. A 9, 1740–1748 (1992). [CrossRef]
10. V. Y. Su, G. Bally, and D. Vukicevic, "Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation," Opt. Commun. 98, 141–150 (1993). [CrossRef]
11. Y. Surrel, "Design of algorithms for phase measurements by the use of phase stepping," Appl. Opt. 35, 51–60 (1996). [CrossRef] [PubMed]
12. L. Salas, E. Luna, J. Salinas, V. Garcia, and M. Servin, "Profilometry by fringe projection," Opt. Eng. 42, 3307–3314 (2003). [CrossRef]
13. H. Liu, W. H. Su, K. Reichard, and S. Yin, "Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3D surface profile measurement," Opt. Commun. 216, 65–80 (2003). [CrossRef]
14. W. H. Su, H. Liu, K. Reichard, S. Yin, and F. T. S. Yu, "Fabrication of digital sinusoidal gratings and precisely controlled diffusive flats and their application to highly accurate projected fringe profilometry," Opt. Eng. 42, 1730–1740 (2003). [CrossRef]
15. J. M. Huntley and H. O. Saldner, "Temporal phase-unwrapping algorithm for automated interferogram analysis," Appl. Opt. 32, 3047–3052 (1993). [CrossRef] [PubMed]
16. H. O. Saldner and J. M. Huntley, "Profilometry using temporal phase unwrapping and a spatial light modulator-based fringe projector," Opt. Eng. 36, 610–615 (1997). [CrossRef]
17. D. R. Burton and M. J. Lalor, "Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping," Appl. Opt. 33, 2939–2948 (1994). [CrossRef] [PubMed]
18. Y. Hao, Y. Zhao, and D. Li, "Multifrequency grating projection profilometry based on the nonlinear excess fraction method," Appl. Opt. 38, 4106–4110 (1999). [CrossRef]
19. E. B. Li, X. Peng, J. Xi, J. F. Chicharo, J. Q. Yao, and D. W. Zhang, "Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3D profilometry," Opt. Express 13, 1561–1569 (2005). [CrossRef] [PubMed]
20. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, "Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations," Appl. Opt. 36, 5347–5354 (1997). [CrossRef] [PubMed]
21. J. L. Li, H. J. Su, and X. Y. Su, "Two-frequency grating used in phase-measuring profilometry," Appl. Opt. 36, 277–280 (1997). [CrossRef] [PubMed]
22. W. H. Su and H. Liu, "Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities," Opt. Express 14, 9178–9187 (2006). [CrossRef] [PubMed]

OCIS Codes
(110.6880) Imaging systems : Three-dimensional image acquisition
(120.4630) Instrumentation, measurement, and metrology : Optical inspection

ToC Category:
Instrumentation, Measurement, and Metrology

History
Original Manuscript: August 13, 2007
Revised Manuscript: September 23, 2007
Manuscript Accepted: September 24, 2007
Published: September 26, 2007

Citation
Wei-Hung Su, "Color-encoded fringe projection for 3D shape measurements," Opt. Express 15, 13167-13181 (2007)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-15-20-13167


