OSA's Digital Library

Virtual Journal for Biomedical Optics

Exploring the Interface of Light and Biomedicine

  • Editors: Andrew Dunn and Anthony Durkin
  • Vol. 6, Iss. 4 — May 4, 2011

Optical filter highlighting spectral features Part II: quantitative measurements of cosmetic foundation and assessment of their spatial distributions under realistic facial conditions

Ken Nishino, Mutsuko Nakamura, Masayuki Matsumoto, Osamu Tanno, and Shigeki Nakauchi  »View Author Affiliations


Optics Express, Vol. 19, Issue 7, pp. 6031-6041 (2011)
http://dx.doi.org/10.1364/OE.19.006031


Abstract

We previously proposed a filter that detects cosmetic foundations with high discrimination accuracy [Opt. Express 19, 6020 (2011)]. This study extends the filter's functionality to quantifying the amount of foundation and applies the filter to assess spatial distributions of foundation under realistic facial conditions. Human faces to which quantitatively controlled amounts of cosmetic foundation had been applied were measured using the filter, and a calibration curve relating image pixel values to the amount of foundation was created. The optical filter was then applied to visualize spatial foundation distributions under realistic facial conditions, clearly indicating areas of the face where foundation remained even after cleansing. The results confirm that the proposed filter can visualize and nondestructively inspect foundation distributions.

© 2011 OSA

1. Introduction

Spectral imaging has recently become an attractive research topic because it can visualize the spatial distribution of an object's properties that appear in its spectral features. Human skin is an attractive target for spectral imaging because it carries rich information about health and mental condition, yet a strictly noninvasive measurement method is required. Skin color is mainly due to pigments such as melanin, carotene, and hemoglobin [2]. These pigments have characteristic spectral absorption properties in the visible wavelength region. Several methods for measuring and visualizing human skin pigmentation have been reported and applied to evaluating skin diseases such as inflammation and melanoma [3–9]. In the field of cosmetics, in addition to such skin pigmentation measurements, a noninvasive method to measure applied cosmetic products is strongly desired. Because cosmetic products are used to hide skin flaws and present beautiful skin, measuring applied cosmetic products under realistic conditions is important for product development. Doi et al. reported a method for estimating the spectral reflectance of made-up skin from the spectral reflectances of bare skin and cosmetic foundations, with quite good estimation accuracy [10]. This technique can also be used to estimate the thickness of the foundation layer.

This study focuses on the fact that the discriminant score obtained by linear discriminant analysis (LDA) varies with the difference between the color signals of skin with and without cosmetic foundation, implying that it might reflect the amount of applied foundation. We therefore extended the filter's application to quantifying the amount of foundation and, finally, applied it to assess spatial foundation distributions under realistic facial conditions.

2. Materials and methods

2.1. The optical filter for discrimination of cosmetic foundations

The details of the optical filter used for the discrimination of cosmetic foundations were described in the accompanying paper [1]. The spectral transmittance of the filter was theoretically designed to minimize the misclassification of an LDA performed on the filtered spectral data sets of human skin with and without cosmetic foundation. The LDA was performed in a color signal space, namely the r-g chromaticity signal transformed from the RGB trichromatic signal. The RGB signal was defined using the spectral sensitivity of a commercial RGB digital camera (Nikon D70). The filter was realized as a multilayer thin-film filter composed of 31 layers of SiO2 and TiO2 by vacuum deposition (Fig. 1).

Fig. 1 (a) Spectral transmittance of the theoretically designed and the optically realized filter. The theoretical transmittance was designed by optimization; the optical filter was realized by vacuum deposition. (b) The developed optical filter, a multilayer thin film composed of 31 layers of SiO2 and TiO2.

The discrimination of human skin with and without cosmetic foundation was obtained from the RGB image taken with the digital camera equipped with the filter. Here too, the RGB pixel values were transformed to chromaticity signals and the discriminant analysis was performed on the chromaticity coordinates. The computations are as follows:

\( \mathbf{C} = (r, g) = \left( \frac{C_R}{C_R + C_G + C_B},\ \frac{C_G}{C_R + C_G + C_B} \right), \)  (1)

\( f_d(\mathbf{C}) = \left( \Sigma^{-1} (\mu_1 - \mu_2) \right)^{t} \left( \mathbf{C} - \frac{\mu_1 + \mu_2}{2} \right) - \log(p_2 / p_1), \)  (2)
where C is a color signal transformed from the trichromatic signal (the output of the RGB camera) C_k (k ∈ {R, G, B}). The discriminant score f_d(C) is computed from several predefined parameters and the observed color signal C. In Eq. (2), Σ⁻¹ is the inverse of the pooled variance-covariance matrix of the predefined color signal sets C1 and C2, and μ1 and μ2 are the means of C1 and C2, respectively. The measurement of the predefined data sets C1 and C2 is described in Part I [1]. When f_d(C) > 0, an observed color signal is classified into category 1; otherwise, into category 2. The prior probabilities p1 and p2 for categories 1 and 2 were assumed to be p1 = p2 = 0.5.
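As a concrete illustration, the chromaticity transform of Eq. (1) and the discriminant score of Eq. (2) can be sketched in a few lines of NumPy. This is a minimal sketch, not the authors' implementation; the training sets standing in for C1 and C2 are synthetic, hypothetical data.

```python
import numpy as np

def rg_chromaticity(rgb):
    """Eq. (1): map RGB triplets (..., 3) to r-g chromaticity (..., 2)."""
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb[..., :2] / s

def lda_params(C1, C2):
    """Class means and inverse pooled covariance of two training sets."""
    mu1, mu2 = C1.mean(axis=0), C2.mean(axis=0)
    n1, n2 = len(C1), len(C2)
    pooled = ((n1 - 1) * np.cov(C1, rowvar=False) +
              (n2 - 1) * np.cov(C2, rowvar=False)) / (n1 + n2 - 2)
    return np.linalg.inv(pooled), mu1, mu2

def discriminant_score(C, Sinv, mu1, mu2, p1=0.5, p2=0.5):
    """Eq. (2): f_d(C); positive scores fall in category 1."""
    w = Sinv @ (mu1 - mu2)
    return (np.asarray(C) - (mu1 + mu2) / 2) @ w - np.log(p2 / p1)
```

With p1 = p2 = 0.5 as in the paper, the log-prior term vanishes and the score reduces to the signed distance from the LDA decision boundary.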

As described in the accompanying paper [1], computing the discriminant score for each pixel of the image yielded satisfactory detection results. The current study goes further and creates a calibration curve for estimating the amount of cosmetic foundation from the discriminant score.

2.2. Quantitative measurement

To obtain the calibration curve, quantitatively controlled amounts of liquid cosmetic foundation were applied to the face and facial images were measured. Each application area was 3 × 2 cm, and there were 14 such areas over the face. The amounts of foundation applied were 0.5, 1, 1.5, 2, 3, 4, 5, 6, 8, and 10 μL, strictly controlled by repeated coating with a micropipette. The observation angles were −45, 0, and 45°. All measurement devices were fixed throughout the measurement, and the subjects turned sideways for the profile images (Fig. 2).

Fig. 2 Areas of application of cosmetic foundation for quantitative measurement. Cosmetic foundation was applied to 14 areas. Observation angles were −45, 0, and 45°. The application areas indicated in each facial image were used for the following analysis.

We manually extracted the color signals from the numbered areas in the figure to compute the discriminant scores. As shown in Fig. 2, several application areas were observed at multiple angles; the estimation error arising from the observation angle is discussed below using these overlapping observations. The measurement device was a commercially available camera (Nikon D70), and a fluorescent light (Diva-Lite, 6300 K) was selected as the illumination source because it has a broad emission spectrum over the entire visible region and generates little heat. Polarizing films were installed on both the camera and the light source to eliminate specular reflection. The measurement was carried out on four Japanese females aged 18–40 years. The subjects chose their personally preferred color of cosmetic foundation.

2.3. Evaluation of facial conditions after cleansing

Next, we applied the filter to visualize the distribution of foundation across the human face under various conditions. This experiment aimed to enable a statistical assessment of foundation distribution under realistic facial conditions as measured with the proposed filter. The visualization target was the difference in foundation distributions between cleansing methods. It is empirically known that foundation tends to remain around the eyes, nose, and hairline even after cleansing. We tested whether the developed filter could visualize and nondestructively inspect such foundation distributions under realistic conditions.

In this experiment, the following facial conditions were measured:

  • 1. Foundation applied
  • 2. Cleansed by the subject (CS)
  • 3. Foundation applied again
  • 4. Cleansed following professional instruction (CP)

Here, we focus on comparing condition 2 (CS) and condition 4 (CP) to evaluate the differences in finish between the two cleansing methods.

3. Results

3.1. Results of the quantitative measurement

Figure 3(a) shows the discriminant scores for various amounts of applied cosmetic foundation, manually extracted from the measured images. Error bars show the standard deviations. There were strong positive, logarithmic correlations between the amounts and the discriminant scores. Figure 3(b) shows the estimation error arising from the observation angle: solid lines show the discriminant scores of the forehead and jaw observed at 0°, and the scores at the same positions observed at −45° and +45° are shown as broken lines. The differences among observation angles were small; thus, the estimation of the amount of foundation was largely insensitive to the observation angle of the skin surface.

Fig. 3 (a) Discriminant scores manually extracted from the measured images. The error bars show the standard deviations. (b) Estimation errors due to the observation angle. Solid lines show the discriminant scores of the forehead and jaw (Nos. 1–4, 13, and 14 in Fig. 2) observed at a 0° angle. Scores at the same positions observed at −45° and +45° are shown as broken lines.

3.2. Calibration curve fitting

A logarithmic relationship between the discriminant score and the applied amount of foundation was established by the quantitative measurements. The amount of foundation can thus be estimated with high accuracy from the discriminant score. In this study, we propose two types of estimation formulae:

\( y = a \ln(x + c) + b, \)  (3)

\( y = a \left\{ \ln(x + b_{y_0} + c) - \ln(b_{y_0} + c) \right\} + y_0. \)  (4)

We used the steepest descent and least squares methods to determine the unknown parameters a, b, and c. Table 1 (Parameters and Evaluated Values of the Calibration Curve) lists the estimated parameters, coefficients of determination, and standard errors of prediction (SEP). Finally, calibration equations that estimate the amount of foundation from the discriminant score are obtained:

\( x = \exp\left\{ \frac{y - b}{a} \right\} - c, \)  (5)

\( x = (b_{y_0} + c) \exp\left\{ \frac{y - y_0}{a} \right\} - (b_{y_0} + c). \)  (6)

Figure 4 shows the relationship between the applied and estimated amounts of foundation; error bars denote the standard deviations. As described in Table 1, both estimation formulae achieve high accuracy for the applied cosmetic foundation. The coefficient of determination and SEP were slightly better for the simpler formula. However, the estimate using the baseline correction was marginally better when only a small amount had been applied (Fig. 4(b)); its SEP was lower for small amounts (<0.1 μL/cm2, Table 1).

Fig. 4 Relationship between the applied and estimated amounts of foundation. Error bars denote the standard deviations. (a) Without baseline correction: coefficient of determination 0.9152, SEP 0.1557. (b) With baseline correction: coefficient of determination 0.8978, SEP 0.1645. Estimation accuracy is higher for (a) than for (b); however, (b) has no error when the applied foundation is zero.

3.3. Visualization results

Both proposed calibration curves showed sufficient estimation accuracy, as described in Table 1 and Figs. 4(a) and 4(b). To compare the two formulae, we computed foundation maps from the images measured in the quantitative measurement (Section 2.2). Figure 5 shows the maps of the computed amount of foundation for one subject. The computed maps were projected onto a make-up doll image by the "local weighted mean" method of image transformation [12].

Fig. 5 Comparison of foundation maps computed using different calibration curves. (a) Without baseline correction. (b) With baseline correction. These are results for one subject. Computed images were obtained from a make-up doll image by using the "local weighted mean" method of image transformation [12].

Both maps of the amount of foundation were visualized with high estimation accuracy. When no cosmetic foundation was applied, slight errors appeared in the map computed by the calibration curve without baseline correction. Thus, the formula with baseline correction was more suitable when the target amount of foundation was <0.167 μL/cm2. On the other hand, the calibration curve without baseline correction showed better estimation accuracy over most amounts of cosmetic foundation. The better estimation method should therefore be selected depending on the visualization target.

Figure 6 shows the estimated foundation maps of realistic made-up skin. The cosmetic foundation was distributed unevenly even though the skin color looked uniform to the human eye.

Fig. 6 Foundation maps of test data showing the foundation distribution of realistic made-up skin. Cosmetic foundation was applied uniformly over the face so that the facial skin color looks uniform.

3.4. Evaluation of the facial condition after cleansing

The results of the above experiments showed that the proposed filter has high sensitivity for detecting applied cosmetic foundation and could be a revolutionary visualization tool for the cosmetic research field. Finally, the abovementioned techniques were therefore applied to visualize the distribution of foundation across the human face after cleansing.

Figure 7 shows the cosmetic foundation distributions for CS and CP. Foundation remained in some areas even after cleansing, especially at the hairline and around the eyes (Fig. 7(a)). The amounts of foundation computed from the CS and CP images showed a significant difference (p < 0.05), reflecting that the finish of the cleanse performed by the subject (CS) was poorer than that performed according to professional instruction (CP). In addition, a comparison of normal and high-water-resistance foundations showed significant differences (Fig. 8). The obtained results thus confirm the practical performance of our filter.

Fig. 7 Average cosmetic foundation maps. (a) Foundation distribution of CS and (b) foundation distribution of CP. Average maps were computed using image transformation. Standard deviations among subjects were also computed for each condition and are used to indicate reliability by varying the transparency according to the standard deviation. The calibration curve with baseline correction (Eq. (6)) was used to compute the foundation map. All pixel values of (b) are zero because the average map of CP was used as the baseline image.

Fig. 8 Comparison of the normal and high-water-resistant foundation. The figures show the average foundation distribution maps of CS: (a) the average map of subjects who used normal foundation and (b) the average map for high-water-resistant foundation.

4. Discussion

In [10], the reflectance spectra of made-up skin were estimated using the Kubelka–Munk theory [13], and good estimation accuracy was obtained. Meanwhile, the output of our filter has an obvious logarithmic relationship with the amount of cosmetic foundation. In this section, we compare these two estimation methods.

In [10], the estimates were obtained using the following equations:

\( R(\lambda) = (1 - R_S) \left( R_m(\lambda) + \frac{T_m(\lambda)^2 R_{skin}(\lambda)}{1 - R_m(\lambda) R_{skin}(\lambda)} \right) + R_S, \)

\( R_m(\lambda) = \frac{1}{a_m(\lambda) + b_m(\lambda) \coth\left( D_m b_m(\lambda) S_m(\lambda) \right)}, \)

\( T_m(\lambda) = \frac{b_m(\lambda)}{a_m(\lambda) \sinh\left( D_m b_m(\lambda) S_m(\lambda) \right) + b_m(\lambda) \cosh\left( D_m b_m(\lambda) S_m(\lambda) \right)}, \)  (7)
where R(λ) is the reflectance spectrum of made-up skin. R(λ) is determined by four quantities: the specular reflectance between air and the skin surface R_S; the reflectance spectrum of the human skin surface R_skin; and the reflectance and transmittance spectra of the cosmetic foundation, R_m and T_m, respectively. In turn, T_m and R_m are determined by the layer thickness D_m and the optical characteristics of the cosmetic foundation a_m, b_m, and S_m. According to [10], these optical characteristics can also be estimated using the Kubelka–Munk theory as follows:

\( S_m(\lambda) = \frac{1}{b_m(\lambda) D_0} \left( \coth^{-1} \frac{a_m(\lambda) - R_0(\lambda)}{b_m(\lambda)} - \coth^{-1} \frac{a_m(\lambda) - R_g(\lambda)}{b_m(\lambda)} \right), \quad a_m = \frac{1}{2} \left( \frac{1}{R_\infty} + R_\infty \right), \quad b_m = \left( a_m^2 - 1 \right)^{1/2}, \)  (8)

where D_0 and R_0 are the thickness of the cosmetic foundation layer and the spectral reflectance of the surface on which a thin layer of cosmetic foundation is formed, respectively; R_g is the reflectance spectrum of the background material; and R_∞ is the spectral reflectance of a surface with an optically thick cosmetic foundation layer. The optical characteristics of a cosmetic foundation can therefore be estimated from the experimentally obtained parameters R_0, R_g, R_∞, and D_0. The reflectance spectrum of made-up skin R(λ) is then determined by the thickness D_m and the bare skin reflectance R_skin.
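Equations (7) and (8) can be rendered numerically in a few lines. The sketch below is not the authors' code; it assumes the parameter values quoted for Fig. 9(a) (R_∞ = 0.6, R_g = 0, R_0 = R_∞/100, D_0 = 1), with coth⁻¹ written out via logarithms.

```python
import numpy as np

def km_ab(R_inf):
    """Eq. (8): a_m and b_m from the thick-layer reflectance R_inf."""
    a = 0.5 * (1.0 / R_inf + R_inf)
    return a, np.sqrt(a * a - 1.0)

def acoth(z):
    """Inverse hyperbolic cotangent, valid for |z| > 1."""
    return 0.5 * np.log((z + 1.0) / (z - 1.0))

def km_S(R0, Rg, R_inf, D0):
    """Eq. (8): scattering coefficient S_m from measured layer data."""
    a, b = km_ab(R_inf)
    return (acoth((a - R0) / b) - acoth((a - Rg) / b)) / (b * D0)

def made_up_reflectance(R_skin, R_inf, S, Dm, Rs=0.0):
    """Eq. (7): reflectance of skin under a foundation layer of thickness Dm."""
    a, b = km_ab(R_inf)
    u = Dm * b * S
    Rm = 1.0 / (a + b / np.tanh(u))                    # layer reflectance
    Tm = b / (a * np.sinh(u) + b * np.cosh(u))         # layer transmittance
    return (1.0 - Rs) * (Rm + Tm * Tm * R_skin / (1.0 - Rm * R_skin)) + Rs
```

As D_m grows, T_m → 0 and R_m → 1/(a_m + b_m) = R_∞, so the made-up reflectance converges to R_∞ regardless of R_skin, consistent with the saturating behavior shown in Fig. 9(a).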

Figure 9 compares this theory with our calibration curve. Figure 9(a) shows the relationship between the relative thickness of the cosmetic foundation layer and the spectral reflectance estimated from the Kubelka–Munk theory. This computation assumed, based on actual measured values, that R_∞ = 0.6, R_g = 0, R_0 = R_∞/100, and D_0 = 1; the spectral reflectance of bare skin R_skin was varied from 0.2 to 0.35 in steps of 0.01. Figure 9(b) shows the relationship between the estimated amount of liquid foundation and the discriminant score, computed from the calibration curve with baseline correction; the discriminant score of bare skin was varied from −7 to 2 in steps of 0.5. The estimated reflectance and the discriminant score show quite similar behavior as functions of the amount of cosmetic foundation, indicating that the developed optical filter was optimized so that the change in spectral reflectance caused by applying cosmetic foundation is reflected in the filtered RGB outputs (Fig. 9). According to Eqs. (7) and (8) and Fig. 9(a), the spectral reflectance of made-up skin is affected by the spectral reflectance of bare skin and by the optical characteristics of the cosmetic materials. Therefore, the output of our filter should also depend on the type of cosmetic foundation, even though we have already confirmed detection for 30 products and estimation for three products. In future work, the calibration curve will be improved to account for the optical characteristics of cosmetic materials.

Fig. 9 Comparison of the Kubelka–Munk theory and the calibration curve. (a) Relationship between the relative thickness of the cosmetic foundation layer and the estimated spectral reflectance based on the Kubelka–Munk theory, computed under the assumption (based on actual measured values) that R_∞ = 0.6, R_g = 0, R_0 = R_∞/100, and D_0 = 1; the spectral reflectance of bare skin R_skin was varied from 0.2 to 0.35 in steps of 0.01. (b) Relationship between the estimated amount of liquid foundation and the discriminant score computed from the calibration curve with baseline correction; the discriminant score of bare skin was varied from −7 to 2 in steps of 0.5.

5. Conclusion

This study established a calibration curve to estimate the amount of applied cosmetic foundation and applied it to visualize the foundation distribution under realistic facial conditions.

We designed the spectral transmittance of the filter to enhance the spectral difference of two predefined spectral data sets. The designed theoretical spectral transmittance was optically realized as a multilayer thin film filter. The color distributions of the obtained RGB images taken with a digital camera equipped with the filter show a distinct enhancement of the spectral differences between the two sets of spectra, which were invisible to the human eye.

In addition, there were strong positive correlations between the amount of applied foundation and the discriminant score. We therefore constructed two calibration curves, given by Eqs. (5) and (6); Eq. (6) includes a baseline correction obtained from a bare-skin image. Both equations showed high estimation accuracy, as described in Table 1, and the visualization results in Fig. 4 confirmed this for both methods. The calibration curve with baseline correction (Eq. (6)) achieved better estimation for smaller amounts, while the coefficient of determination of the curve without baseline correction (Eq. (5)) was higher.

The visualized maps of the amounts of cosmetic foundation on realistically made-up skin displayed uneven foundation distributions (Fig. 6), even though the foundation had been applied to make the skin color uniform over the face. This result suggests that the optical filter and the calibration curves are useful for evaluating various made-up conditions, such as the deterioration of foundation over time.

Finally, we applied the optical filter to visualize the differences in foundation distributions between the two cleansing methods. The visualized distributions clearly indicated the areas of the face where foundation remained even after cleansing.

Our method is not restricted to the field of cosmetics; it could be applied to other targets that require nondestructive inspection, such as food inspection and medical imaging. The proposed system could be implemented as part of an online measuring system in a compact and inexpensive way.

Acknowledgments

The authors appreciate the assistance of Itoh Optical Industrial Co. Ltd. for providing the optical filter and thereby helping in conducting the experiments. This work was also supported in part by the Global COE Program “Frontiers of Intelligent Sensing” from the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan.

References and links

  1. K. Nishino, M. Nakamura, M. Matsumoto, O. Tanno, and S. Nakauchi, “Optical filter for highlighting spectral features Part I: design and development of the filter for discrimination of human skin with and without an application of cosmetic foundation,” Opt. Express 19(7), 6020–6030 (2011). [CrossRef] [PubMed]
  2. E. Angelopoulou, “The reflectance spectrum of human skin,” Technical Report MS-CIS-99-29, GRASP Laboratory, Department of Computer and Information Science, University of Pennsylvania, USA (1999).
  3. N. Tsumura, M. Kawabuchi, H. Haneishi, and Y. Miyake, “Mapping pigmentation in human skin by multi-visible-spectral imaging by inverse optical scattering technique,” J. Imag. Sci. Tech. 45(5), 444–450 (2001).
  4. I. V. Meglinski and S. J. Matcher, “Quantitative assessment of skin layers absorption and skin reflectance spectra simulation in the visible and near-infrared spectral regions,” Physiol. Meas. 23(4), 741–753 (2002). [CrossRef] [PubMed]
  5. G. N. Stamatas, B. Z. Zmudzka, N. Kollias, and J. Z. Beer, “Non-invasive measurements of skin pigmentation in situ,” Pigment Cell Res. 17(6), 619–626 (2004). [CrossRef]
  6. M. Moncrieff, S. Cotton, E. Claridge, and P. Hall, “Spectrophotometric intracutaneous analysis: a new technique for imaging pigmented skin lesions,” Br. J. Dermatol. 146(3), 448–457 (2002). [CrossRef] [PubMed]
  7. J. K. Wagner, C. Jovel, H. L. Norton, E. J. Parra, and M. D. Shriver, “Comparing quantitative measures of erythema, pigmentation and skin response using reflectometry,” Pigment Cell Res. 15(5), 379–384 (2002). [CrossRef] [PubMed]
  8. G. N. Stamatas, M. Southall, and N. Kollias, “In vivo monitoring of cutaneous edema using spectral imaging in the visible and near infrared,” J. Invest. Dermatol. 126(8), 1753–1760 (2006). [CrossRef] [PubMed]
  9. G. N. Stamatas and N. Kollias, “In vivo documentation of cutaneous inflammation using spectral imaging,” J. Biomed. Opt. 12(5), 051603 (2007). [CrossRef] [PubMed]
  10. M. Doi, R. Ohtsuki, and S. Tominaga, “Spectral estimation of made-up skin color under various conditions,” Proc. SPIE (San Jose, California, USA), 606204 (2006).
  11. S. J. Preece and E. Claridge, “Spectral filter optimization for the recovery of parameters which describe human skin,” IEEE Trans. Pattern Anal. Mach. Intell. 26(7), 913–922 (2004). [CrossRef]
  12. G. Ardeshir, “Image registration by approximation method,” Image Vis. Comput. 6(4), 255–261 (1988). [CrossRef]
  13. P. Kubelka, “New contributions to the optics of intensely light-scattering materials. Part I,” J. Opt. Soc. Am. 38(5), 448–457 (1948). [CrossRef] [PubMed]

OCIS Codes
(330.1690) Vision, color, and visual optics : Color
(330.6180) Vision, color, and visual optics : Spectral discrimination
(310.6845) Thin films : Thin film devices and applications

ToC Category:
Vision, Color, and Visual Optics

History
Original Manuscript: November 1, 2010
Manuscript Accepted: March 7, 2011
Published: March 17, 2011

Virtual Issues
Vol. 6, Iss. 4 Virtual Journal for Biomedical Optics

Citation
Ken Nishino, Mutsuko Nakamura, Masayuki Matsumoto, Osamu Tanno, and Shigeki Nakauchi, "Optical filter highlighting spectral features Part II: quantitative measurements of cosmetic foundation and assessment of their spatial distributions under realistic facial conditions," Opt. Express 19, 6031-6041 (2011)
http://www.opticsinfobase.org/vjbo/abstract.cfm?URI=oe-19-7-6031



