Virtual Journal for Biomedical Optics
Editor: Gregory W. Faris
Vol. 2, Iss. 1 — Jan. 19, 2007

A new image calibration system in digital colposcopy

Wenjing Li, Marcelo Soto-Thompson, and Ulf Gustafsson


Optics Express, Vol. 14, Issue 26, pp. 12887-12901 (2006)
http://dx.doi.org/10.1364/OE.14.012887


Abstract

Colposcopy is a primary diagnostic method used to detect cancer and precancerous lesions of the uterine cervix. During the examination, metaplastic and abnormal tissues exhibit different degrees of whiteness (the acetowhitening effect) after application of a 3%–5% acetic acid solution. Colposcopists evaluate the color and density of the acetowhite tissue to assess the severity of lesions for the purposes of diagnosis, telemedicine, and annotation. However, the color and illumination of colposcopic images vary with the light source, the instrument and camera settings, and the clinical environment. This makes assessment of the color information very challenging, even for an expert. In developing a Computer-Aided Diagnosis (CAD) system for colposcopy, these variations affect the performance of the feature extraction algorithm for the acetowhite color. Non-uniform illumination from the light source is also an obstacle to detecting acetowhite regions, lesion margins, and anatomic features. Therefore, in digital colposcopy, it is critical to map the color appearance of images taken with different colposcopes into one standard color space with normalized illumination. This paper presents a novel image calibration technique for colposcopic images. First, a specially designed calibration unit is mounted on the colposcope to acquire daily calibration data prior to performing subject examinations. The calibration routine is fast, automated, accurate, and reliable. We then use our illumination correction algorithm and a color calibration algorithm to calibrate the exam data. In this paper we describe these techniques and demonstrate their applications in clinical studies.

© 2006 Optical Society of America

1. Introduction

Uterine cervical cancer is the second most common cancer in women worldwide, with nearly 500,000 new cases and over 270,000 deaths annually [1]. Invasive disease is preceded by pre-malignant cervical intraepithelial neoplasia (CIN), and if detected early and treated adequately, cervical cancer can be universally prevented [2].

Colposcopy is one of the primary diagnostic methods used to detect CIN and cervical cancer, following an abnormal cytological screen (Papanicolaou smear). The purpose of a colposcopic examination is to identify and rank the severity of lesions, so that biopsies representing the highest-grade abnormality can be taken, if necessary. A colposcopic examination involves a systematic visual evaluation of the lower genital tract (cervix, vulva, and vagina), with special emphasis on the subjective appearance of the metaplastic epithelium comprising the transformation zone on the cervix. During the exam, a 3–5% acetic acid solution is applied to the cervix, causing abnormal and metaplastic epithelia to turn white. Cervical cancer precursor lesions and invasive cancer exhibit certain distinctly abnormal morphologic features that can be identified by colposcopic examination. Lesion characteristics such as color or opacity, margin shape, blood vessel caliber, intercapillary spacing and distribution, and contour are considered by colposcopists to derive a clinical diagnosis [3]. Lugol’s iodine is another contrast solution often used during colposcopy; the color difference of the iodine staining likewise assists in identifying and differentiating the severity of lesions.

However, the color and appearance of digital colposcopic images vary with the light sources, the instruments and camera settings, as well as the clinical environment. As illustrated in Fig. 1, the color of the epithelium may look very different (including normal and abnormal findings) in cervical images acquired with different instruments or at different times. This makes the assessment of the color information very challenging, even for an expert. Using an objective image calibration technique accompanied by corresponding monitor calibration technique may help the physician/colposcopists to better assess the information in cervical images in terms of diagnosis and severity.

Fig. 1. Colposcopic images with various color and illumination.

The use of digital imaging is revolutionizing medical imaging and enables sophisticated computer programs to assist physicians with Computer-Aided Diagnosis (CAD). Clinicians and academic groups have suggested, and shown proof of concept for, the use of automated image analysis of cervical imagery for cervical cancer screening and diagnosis [4–6]. Various image processing algorithms have been developed to detect different colposcopic features, such as acetowhite color [7–9], lesion margins [10, 11], and blood vessels [12–14]. However, the lack of color calibration of the digital images makes it very difficult to accurately extract the color properties of acetowhite lesions. Non-uniform light distribution has also been a major obstacle to extracting lesion margins and blood vessel structures.

CAD for colposcopy could have a direct impact on improving women’s health care and reducing the associated costs. Accurate color calibration is a crucial factor in developing a CAD system for colposcopy. Several image enhancement techniques, such as histogram stretching and/or equalization, have been used in attempts to compensate for uneven light distribution or color differences [13, 15]. However, processing illumination differences with histogram techniques is risky when color is used for diagnosis: such techniques maximize differentiation but do not map the color information correctly. To our knowledge, the design of a system that both corrects the non-uniform light distribution and calibrates the color of colposcopic images has not yet been reported.

STI® Medical Systems is developing digital imaging technology for cervical cancer screening and diagnosis. To this end, we have developed a digital colposcope that acquires high-resolution digital imagery for colposcopy. To accompany the digital colposcope, we have designed a calibration unit to acquire calibration data at the clinical sites. The calibration is performed daily before the subject examinations. The process is fast, automatic, and easily carried out by a nurse or clinical operator. We use an illumination correction algorithm and a color calibration algorithm to calibrate the exam data. The illumination correction algorithm normalizes the light distribution by dividing the exam image by its corresponding neutral gray target image. The color calibration algorithm estimates the linear mapping between two color spaces and uses an optimization process to compensate for any non-linearities. Morphological operations are also applied for noise removal and color patch detection. Our calibration system has been successfully applied at multiple clinical sites. A calibrated digital colposcopic image database, currently comprising 149 human subjects, is under construction.

2. Method description

Generally speaking, the colors in an image depend on the light source, the image acquisition device, and the properties of the subject being imaged. The red, green, and blue color filters of a digital color camera are designed to mimic the color sensitivity of the human eye and are thus said to create a “true” color image. In reality, the color filter responses are fairly dissimilar to the sensitivity of the human eye, which means that color cameras and the eye represent colors quite differently. The different color representations are especially noticeable under different lighting conditions. Consequently, depending on lighting conditions and camera characteristics, digital color images often appear different from what is perceived by eye. The goal of image calibration is that the colors appear identical, independent of the camera, camera settings, and light source used. This can be achieved by mapping the color appearance of the images taken with different instruments into a standard color space, as illustrated in Fig. 2.

Fig. 2. The concept of color calibration: mapping the raw color space of different instruments into a standard color space.

The entire calibration procedure proposed for the colposcopic image calibration is shown in Fig. 3. Both exam data and calibration data are acquired at the clinical sites using the same instrument. Calibration data includes images of a gray target for gray balance and a color target for color calibration (see Fig. 4). The image of the color target is processed by a gray balance algorithm to normalize the light distribution. This image is then used to compute color correction matrices for the color calibration algorithm. The exam data are processed using the gray balance algorithm followed by the color correction algorithm. The calibrated exam data are then ready for algorithm development of the CAD system for colposcopy, colposcopic image annotation, and telemedicine.

Fig. 3. Calibration Procedure

2.1 Gray balance algorithm

The gray balance calibration is used to normalize the spatial variations of the light source and the camera response. In our current implementation, we use the gray color corresponding to the Neutral 5 color patch in the ColorChecker from GretagMacbeth (www.gretagmacbeth.com) as the calibration target. An image of the gray calibration target is shown in Fig. 4(a). The reflectance value of the gray target is 20% over the entire visible spectrum. The corresponding 8-bit RGB values are (122, 122, 121) in the sRGB color space for illuminant D65.

Fig. 4. Calibration Targets used (a) Gray target, (b) Color target

The gray balance algorithm is based on the following equation:

C_gb(x,y) = \frac{C_raw(x,y) - C_background(x,y)}{C_grayflat(x,y) - C_graybackground(x,y)} \times S_c
(1)

where C_gb(x,y) is the gray balanced image, C_raw(x,y) is the raw monochrome channel image, C_background(x,y) is the background image corresponding spatially to the raw image, acquired with the ambient light on but the instrument light off, C_grayflat(x,y) is the image of the gray target with both the ambient light and the instrument light on, C_graybackground(x,y) is the image of the gray target with the ambient light on but the instrument light off, and S_c is the gray target ground truth value for a monochrome color channel in the corresponding color space. Any background image acquired with the ambient (room) light on should be taken shortly before or after the acquisition of the raw image, so that no movement of the scene can occur. To minimize noise in the gray balanced images, we use the average of multiple gray background and gray flat images in Eq. (1), and we also apply low-pass filtering to the images.
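Eq. (1) amounts to dark-frame subtraction followed by flat-field division. The following NumPy sketch illustrates the idea; it is not the authors' implementation, and the division-by-zero guard is our addition:

```python
import numpy as np

def gray_balance(raw, background, gray_flat, gray_background, s_c, eps=1e-6):
    """Gray balance one monochrome channel following Eq. (1): subtract the
    background (instrument light off) from both the raw image and the gray
    target image, divide, and scale by the target ground truth value s_c."""
    numer = raw.astype(np.float64) - background.astype(np.float64)
    denom = gray_flat.astype(np.float64) - gray_background.astype(np.float64)
    # Guard against division by zero (our addition, not part of Eq. (1)).
    denom = np.where(np.abs(denom) < eps, eps, denom)
    return numer / denom * s_c
```

A useful sanity check: feeding the gray target image itself through the correction returns a uniform image at the target's ground-truth value, which is exactly how the algorithm removes the light-source drop-off.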

The gray balance algorithm can, in theory, be applied to the image in any color space. The use of RGB space is very common in digital image processing, since RGB signals are produced by most color image-capturing devices and can be directly displayed on a monitor. However, the use of RGB space in computer vision applications has drawbacks. First, there is high correlation among the RGB channels for natural images [16, 17]. Second, the RGB representation is not very close to the way humans perceive colors, as humans normally describe color by parameters such as brightness, hue, and colorfulness [18]. Third, RGB space is not perceptually uniform [19]. CIE-Lab is a perceptually uniform color space that has been shown to perform better than RGB for color texture analysis [20]. It has also been applied to cervical image segmentation [7]. A computational benefit of CIE-Lab (or any other approximately perceptually uniform space, like HSV or HLS) compared to RGB is that the gray balance correction only needs to be applied to the luminosity channel of the image [see Fig. 5(a)], whereas gray balancing must be applied to each color channel in RGB [see Fig. 5(b)].

Fig. 5. Gray flat image: (a) cross-sectional signal in CIE-Lab color space, illustrating non-uniformity in the luminosity (L) channel only; (b) cross-sectional signal in RGB space, showing non-uniformity in all three channels. The CIE-Lab signal is scaled to the same data type as the input signal, which in our case is a 16-bit signal.

2.2 Color calibration algorithm

The color calibration algorithm is based on the work by Wolf [21], who presented an automated color correction matrix computation approach for correcting inaccurate color output by digital still and video imaging systems. Such matrix-based color calibration methods are common in imaging devices due to their generally well-behaved performance. A look-up table can also be generated from the color correction matrix to speed up the calibration process. The method uses a known reference image together with a robust least-squares algorithm to estimate the optimal color channel matrix that must be applied to the output images in order to correct for color inaccuracies. The color transformation can be represented by the following equation:

\begin{pmatrix}
A_1 & B_1 & C_1 \\
A_2 & B_2 & C_2 \\
\vdots & \vdots & \vdots \\
A_n & B_n & C_n
\end{pmatrix}
=
\begin{pmatrix}
1 & nativeA_1 & nativeB_1 & nativeC_1 \\
1 & nativeA_2 & nativeB_2 & nativeC_2 \\
\vdots & \vdots & \vdots & \vdots \\
1 & nativeA_n & nativeB_n & nativeC_n
\end{pmatrix}
\begin{pmatrix}
a_{11} & a_{12} & a_{13} \\
a_{21} & a_{22} & a_{23} \\
a_{31} & a_{32} & a_{33} \\
a_{41} & a_{42} & a_{43}
\end{pmatrix}
(2)

where n is the number of color patches, (A_i, B_i, C_i) are the calibrated colors, (nativeA_i, nativeB_i, nativeC_i) are the native colors extracted from the image of the color target, and [a_jk] is the 4×3 color transformation matrix.
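The matrix of Eq. (2) can be estimated by least squares. Wolf's method uses a robust least-squares algorithm; the sketch below substitutes ordinary least squares (the function names are ours, for illustration only):

```python
import numpy as np

def fit_color_matrix(native, reference):
    """Estimate the 4x3 matrix [a_jk] of Eq. (2) by ordinary least squares.
    native:    (n, 3) colors measured from the imaged color target
    reference: (n, 3) known ground-truth patch colors"""
    # Build the (n, 4) design matrix with rows [1, nativeA, nativeB, nativeC].
    design = np.hstack([np.ones((len(native), 1)), native])
    matrix, *_ = np.linalg.lstsq(design, reference, rcond=None)
    return matrix  # shape (4, 3)

def apply_color_matrix(colors, matrix):
    """Apply the fitted affine color transformation to (n, 3) colors."""
    return np.hstack([np.ones((len(colors), 1)), colors]) @ matrix
```

With at least four independent patches the fit is well posed; the 24-patch ColorChecker gives a comfortably overdetermined system.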

A third-order polynomial fitting to the individual color components is also applied in order to perform a monotonic non-linear correction:

y_j = b_{1j} x_j^3 + b_{2j} x_j^2 + b_{3j} x_j + b_{4j}
(3)

where j = 1, 2, 3 corresponds to the three channels of the image, x_j denotes the color values of the individual color component before the non-linear correction, y_j denotes the color values after the non-linear correction, and [b_ij] forms another 4×3 color transformation matrix. In our implementation of the color calibration algorithm, we compute the linear color correction matrix first, followed by the monotonic non-linear correction. This order was selected empirically and provides the best fit to the true color values for our data set.
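Eq. (3) is an ordinary per-channel cubic fit. A minimal NumPy sketch follows; note that, unlike the correction described in the paper, an unconstrained polynomial fit does not by itself enforce monotonicity:

```python
import numpy as np

def fit_channel_polynomial(x, y):
    """Fit the cubic of Eq. (3), y = b1*x^3 + b2*x^2 + b3*x + b4,
    for one color channel; returns [b1, b2, b3, b4]."""
    return np.polyfit(x, y, 3)

def apply_channel_polynomial(x, coeffs):
    """Evaluate the fitted cubic at the given channel values."""
    return np.polyval(coeffs, x)
```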

We adapted Wolf’s method into our cervical image calibration system, using the following steps:

  1. Perform gray balancing of a color calibration target such as the GretagMacbeth ColorChecker [see Fig. 4(b)].
  2. Automatically extract the position and color values of the color patches from the gray balanced color calibration target using morphological operations.
  3. Compute the color correction matrices between the extracted values and the standard values in the preferred color space using Eq. (2) and Eq. (3).
  4. Apply the calculated color transformation matrices to any raw cervical image.
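The patch extraction in step 2 is done with morphological operations in the paper. As a simplified stand-in for illustration only, assuming a tightly cropped and roughly axis-aligned 4×6 target, patch colors can be sampled on a regular grid:

```python
import numpy as np

def sample_patch_colors(img, rows=4, cols=6, margin=0.3):
    """Average color at the center of each patch of a tightly cropped,
    axis-aligned rows x cols target (a simplification of the paper's
    morphology-based patch finder). img: (H, W, 3) array."""
    h, w, _ = img.shape
    ph, pw = h / rows, w / cols
    colors = []
    for r in range(rows):
        for c in range(cols):
            # Keep only the central region of each patch to avoid edges.
            y0, y1 = int((r + margin) * ph), int((r + 1 - margin) * ph)
            x0, x1 = int((c + margin) * pw), int((c + 1 - margin) * pw)
            colors.append(img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0))
    return np.array(colors)  # shape (rows * cols, 3)
```

The margin parameter discards patch borders, which in real images are contaminated by the black separators between patches.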

In general, the gray balance and color calibration algorithms can be applied in any color space. In the current clinical application, the gray balance algorithm and the color correction matrices are computed in the CIE-Lab color space. The RGB signals are transformed to CIE-Lab space using functions provided by Matlab, under the assumption of an sRGB output for the camera RGB signal. The output CIE-Lab signal is automatically normalized to the same data type as the input signal, which in our case is a 16-bit unsigned integer. The ground truth of the color values is provided by the manufacturer of the color target, but is also verified and adjusted over time with reflectance measurements performed using a NIST-traceable reflectance system (Optronic Laboratories, Inc., OL740-70).
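The sRGB-to-CIE-Lab transformation (performed with Matlab functions in our system) follows the standard sRGB → XYZ → Lab pipeline. A self-contained sketch with a D65 white point, shown here only to make the conversion explicit:

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1] to CIE-Lab (D65 white point)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    # 1. Undo the sRGB gamma (piecewise transfer function).
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # 2. Linear RGB -> XYZ using the sRGB/D65 primaries.
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = lin @ m.T
    # 3. XYZ -> Lab relative to the D65 reference white.
    white = np.array([0.95047, 1.0, 1.08883])
    t = xyz / white
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)
```

As a quick check, sRGB white maps to approximately (L, a, b) = (100, 0, 0) and black to (0, 0, 0), the endpoints of the luminosity axis used by the gray balance step.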

3. Data acquisition and hardware

3.1 Digital Colposcope

Many high-resolution cervical images have been acquired using a customized 35 mm film camera, the Cerviscope®, from National Testing Laboratories. These cervical images are of excellent quality and very high resolution, but must be scanned prior to their use in digital image processing. Although digital video colposcopes are commercially available, and most colposcope manufacturers provide means to attach still or video cameras to their colposcopes, digital colposcopy is still in its infancy. Research systems using a color video camera attached to a standard colposcope for reflectance and fluorescence studies have been developed and tested in clinical settings [22, 23], but high-resolution digital image acquisition systems are usually not available.

As a potential source of high-resolution digital imagery for colposcopy, STI’s digital colposcope was developed to acquire images with a resolution sufficient for vessel detection. The digital colposcope, as seen in Fig. 6, utilizes a standard colposcope (Seiler, Series 935), two high-resolution digital cameras (Kodak, DCS Pro 14n or SLR/n), and a fiber guided light source assembly (Seiler, Series 935 standard halogen lamp or Perkin Elmer, DiX1765 Xenon lamp). In addition to high-resolution imaging capabilities, the digital colposcope includes stereoscopic imaging capabilities (used for three-dimensional image reconstruction) and cross-polarized image acquisition (used to remove specular reflections). The instruments have been used to acquire cervical data at the Tripler Army Medical Center in Honolulu, Hawaii, the Eisenhower Army Medical Center in Augusta, Georgia, as well as in Instituto Especializado de Enfermedades Neoplásicas in Lima, Peru, and Hospital Regional in Cuzco, Peru.

Fig. 6. High-resolution digital colposcope with stereoscopic and cross-polarized imaging capabilities.

3.2 Calibration Unit Design

The calibration unit is designed to fully automate the acquisition of calibration and instrument characterization targets at the clinical sites. The unit, as shown in Fig. 7, consists of three main parts: 1) a motorized filter wheel (customized Thorlabs filter wheel FW102), 2) calibration targets, and 3) a light shielding tube. Up to six targets, including a gray target and a color target for gray and color calibration, as well as other targets for, e.g., stereoscopic image calibration, resolution, focus, and depth-of-focus verification, can be mounted on the filter wheel.

The filter wheel is motorized and switches the calibration targets automatically. The light shielding tube is used to mimic the lighting condition when exam data is acquired. The calibration images are acquired daily at the clinical site by the operator, who controls the system through a calibration acquisition program. The process is highly automated and requires only three steps to be performed: 1) connect the calibration unit to the colposcope [see Fig. 7(b)], 2) start the calibration program, and 3) remove the calibration unit. The entire image acquisition process takes about 10 minutes and requires no supervision by the operator. After the acquisition of calibration data, the digital colposcope is ready for acquiring exam data for the entire day. The calibration data is used to calibrate the exam data acquired the same day.

Fig. 7. Calibration unit (a) The major components of the calibration unit: calibration targets, filter wheel, and the tube, (calibration target cover removed) (b) Calibration unit mounted on the digital colposcope to take calibration data (including the calibration target cover).

4. Experimental results

Three digital colposcopes with their calibration units have been utilized at the four clinical sites mentioned previously. To date, clinical exam data for 149 human subjects, including cervical images before and after application of acetic acid as well as Lugol’s iodine images, have been collected and calibrated.

4.1 Gray balancing

The result of the illumination correction on the gray target is visualized in Fig. 8. The light distribution of the luminosity channel in the gray flat image is shown in Fig. 8(a), and Fig. 8(b) shows the corresponding light distribution after gray balancing. These images clearly show the effect and importance of applying a gray balancing algorithm to the clinical data.

Fig. 8. (a) Non-uniform distribution of light, (b) Corrected distribution

4.2 Color calibration

The color calibration process is illustrated in Fig. 9 for the color calibration target. The original image is shown in Fig. 9(a), Fig. 9(b) shows the result of the automatic patch finder, and Fig. 9(c) shows the final calibrated color target image. The mean error between the ground truth color values and the calibrated color values is less than 6 pixel values for 8-bit images (i.e., 2.4%).

Fig. 9. Images of the color targets, (a) original image, (b) result of patch finder, (c) calibrated image

4.3 Cervical image calibration

Examples of calibration results on cervical imagery can be seen in Fig. 10, Fig. 11, and Fig. 12. In these figures, the top row displays the cervical images before calibration, and the bottom row displays the corresponding calibrated images. In Fig. 10 and Fig. 11, examples of pre-acetowhite images, acetowhite images, and Lugol’s iodine images acquired in Augusta, USA and Lima, Peru, respectively, are shown. Examples of acetic acid images taken in Honolulu, USA are shown in Fig. 12. The colposcope used in Honolulu, USA, had different cameras and a different light source (Kodak DCS Pro 14n and Seiler, Series 935 Halogen lamp) compared to the camera and light source (Kodak DCS Pro SLR/n and Perkin Elmer, DiX1765 Xenon lamp) used in Augusta and Peru.

Visually, it can be seen that the periphery of the cervix looks darker in the un-calibrated images due to the inherent light source intensity drop-off; accordingly, the periphery is brightened in the calibrated images. In Fig. 10 and Fig. 11, the raw images look bluer than the calibrated images because the Xenon light source used has a strong blue emission component. As a comparison, the raw images shown in Fig. 12 look redder because the halogen light source has a “redder” emission spectrum. From the calibrated images we can see that the color effect caused by using different light sources and camera systems has been adequately corrected by applying both gray balancing and color calibration.

Fig. 10. Calibration results I (Xenon lamp, Augusta data), (a), (b), and (c) are colposcopic pre-acetowhite, acetowhite, and Lugol’s iodine images, respectively, before calibration, and (d), (e), and (f) are corresponding calibrated images.
Fig. 11. Calibration results II (Xenon lamp, Lima data), (a), (b), and (c) are colposcopic pre-acetowhite, acetowhite, and Lugol’s iodine images, respectively, before calibration, and (d), (e), and (f) are corresponding calibrated images.
Fig. 12. Calibration results III (Halogen lamp, Honolulu data), (a) and (b) are colposcopic acetowhite images before calibration and (c) and (d) are corresponding calibrated images.

4.4 Performance evaluation

In colposcopy, epithelium that appears grossly normal but turns white after application of 3% to 5% acetic acid is called acetowhite epithelium. One feature that colposcopists use to assess the severity of a lesion is the color and density of the acetowhite reaction. Typically, abnormal acetowhite epithelium varies from a faint or bright white (low-grade changes) to a dense gray-white (high-grade lesions). Acetowhite epithelium is one of the major diagnostic features in detecting cancer and pre-cancerous regions, and it is the only diagnostic feature used in Visual Inspection with Acetic Acid (VIA), a cost-effective screening method commonly used in developing countries. A detailed description of the diagnostic features of acetowhite epithelium can be found in reference [2]. In order to develop a basic CAD system for colposcopy, we have implemented a fully unsupervised acetowhite feature extraction algorithm to analyze the color properties of acetowhite epithelium [24]. The algorithm segments the acetowhite regions based on K-means clustering in a joint high-dimensional color-texture space and models the different color distributions as Gaussian mixtures. The mean value of each Gaussian represents the color property of the corresponding acetowhite region. In order to assess the effectiveness of our calibration approach, we have run this algorithm on both un-calibrated and calibrated images and compared the detection results with colposcopic annotations.
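To illustrate only the clustering step (the full algorithm works in a joint color-texture space with Gaussian mixture modeling, which is not reproduced here), a plain K-means on per-pixel feature vectors can be sketched as:

```python
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Plain K-means on (N, d) feature vectors: a simplified stand-in
    for the paper's joint color-texture clustering."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each feature vector to its nearest center.
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its members (keep empty clusters put).
        new = np.array([features[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

In the segmentation context, the cluster labels play the role of candidate acetowhite regions, and the cluster centers correspond to the representative region colors.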

A cervical image acquired after the application of acetic acid is shown in Fig. 13. The data was acquired in Lima, Peru, and was calibrated according to the process discussed in this paper. The subject was confirmed, by a standard histopathological exam, to have a high-grade squamous intraepithelial lesion (HSIL). The acetowhite feature extraction algorithm was applied to the same region of interest of the un-calibrated and calibrated images, with the same parameter settings. Since only unsupervised algorithms are involved, no training or prior knowledge was used. Fig. 13(a) shows the algorithm result on the un-calibrated image, Fig. 13(b) shows the result on the calibrated image, and Fig. 13(c) shows the expert colposcopist’s annotation. The blue lines indicate detected acetowhite regions with similar color properties.

Fig. 13. (a) Acetowhite region detection results for un-calibrated image, (b), Acetowhite region detection results on calibrated image, (c) colposcopist’s annotation. (Blue curves indicate detected regions.)

Although the current results are preliminary, we can see that the acetowhite detection result on the calibrated image provides a much better match to the annotation. The result on the un-calibrated image is significantly affected by both the non-uniform distribution of the light intensity and the perceived color information. The extracted color values of the corresponding acetowhite regions from both images are represented by the pseudo images shown in Fig. 14. These images are composed of the average acetowhite colors extracted from the non-calibrated and calibrated images respectively, such that the visual difference of colors can be easily inspected.

Fig. 14. The average color values of the extracted acetowhite regions from (a) the un-calibrated image and (b) the calibrated image in Fig. 13.

The parameters and settings of the acetowhite detection algorithm could be tuned to accommodate the color information in the un-calibrated image so that the detection result better matches the annotation. However, without gray balancing and color correction, these parameter settings would have to be modified whenever the detection algorithm is applied to images acquired with other colposcopy systems. With both gray balancing and color correction applied, the same parameter settings can be used independently of the colposcopy system.

5. Conclusions

The color and illumination of cervical images offer important information for cervical cancer screening and diagnosis. The inability to correct the color and illumination in colposcopic images has precluded the development of an effective CAD system for colposcopy. In this paper, a new colposcopic image calibration technique that standardizes the visual appearance of colposcopic images has been presented. Our image calibration technique includes the design of a fully automatic calibration unit for data acquisition at the clinical site, a gray balance algorithm for effective illumination correction, and a color calibration process. The system has been applied at multiple clinical sites with different instruments. The results of the acetowhite feature extraction algorithm have shown the effectiveness of the proposed technique and the importance of applying both gray balancing and color correction. The technique is a vital element for annotation and further algorithm development purposes in CAD systems. It is also useful for telemedicine and for building standard databases for research and development in industry and academia. The designed calibration system can easily be adapted to other, similar medical applications.

Acknowledgments

Part of this work was supported by the US Army Medical Research and Materiel Command under Contract No. W81XWH-05-C-0005. The views, opinions and/or findings contained in this paper are those of the authors and should not be construed as an official Department of the Army position, policy or decision unless so designated by other documentation. In the conduct of research where humans are the subjects, the investigators adhered to the policies regarding the protection of human subjects as prescribed by Code of Federal Regulations Title 45, Volume 1, Part 46; Title 32, Chapter 1, Part 219; and Title 21, Chapter 1, Part 50 (Protection of Human Subjects).

References and links

1. International Agency for Research on Cancer, "Globocan 2002 database," http://www-dep.iarc.fr/.
2. D. G. Ferris, J. T. Cox, D. M. O'Connor, V. C. Wright, and J. Foerster, Modern Colposcopy: Textbook and Atlas (American Society for Colposcopy and Cervical Pathology, 2004).
3. R. Reid and P. Scalzi, "Genital warts and cervical cancer. VII. An improved colposcopic index for differentiating benign papillomaviral infections from high-grade cervical intraepithelial neoplasia," Am. J. Obstet. Gynecol. 153, 611–618 (1985).
4. B. L. Craine and E. R. Craine, "Digital imaging colposcopy: basic concepts and applications," Obstet. Gynecol. 82, 869–873 (1993).
5. W. Li, V. Van Raad, J. Gu, U. Hansson, J. Hakansson, H. Lange, and D. Ferris, "Computer-aided Diagnosis (CAD) for cervical cancer screening and diagnosis: a new system design in medical image processing," Lecture Notes in Computer Science: CVBIA 2005, 240–250 (2005).
6. M. S. Mikhail, I. R. Merkatz, and S. L. Romney, "Clinical usefulness of computerized colposcopy: image analysis and conservative management of mild dysplasia," Obstet. Gynecol. 80, 5–8 (1992).
7. S. Gordon, G. Zimmerman, and H. Greenspan, "Image segmentation of uterine cervix images for indexing in PACS," in Proceedings of IEEE 17th Symposium on Computer-Based Medical Systems (2004).
8. H. Lange, "Automatic detection of multi-level acetowhite regions in RGB color images of the uterine cervix," in Image Processing, J. M. Fitzpatrick and J. M. Reinhardt, eds., Proc. SPIE 5747, 1004–1017 (2005).
9. S. Gordon, G. Zimmerman, R. Long, S. Antani, J. Jeronimo, and H. Greenspan, "Content analysis of uterine cervix images: initial steps towards content based indexing and retrieval of cervigrams," in Image Processing, J. M. Reinhardt and J. P. Pluim, eds., Proc. SPIE 6144, 1549–1556 (2006).
10. I. Claude, R. Winzenrieth, P. Pouletaut, and J.-C. Boulanger, "Contour features for colposcopic images classification by artificial neural networks," in Proceedings of International Conference on Pattern Recognition, 771–774 (2002).
11. V. Van Raad, Z. Xue, and H. Lange, "Lesion margin analysis for automated classification of cervical cancer lesions," in Image Processing, J. M. Reinhardt and J. P. Pluim, eds., Proc. SPIE 6144 (2006).
12. Q. Ji, J. Engel, and E. Craine, "Texture analysis for classification of cervix lesions," IEEE Trans. Med. Imaging 19, 1144–1149 (2000).
13. Y. Srinivasan, D. Hernes, B. Tulpule, S. Yang, J. Guo, S. Mitra, S. Yagneswaran, B. Nutter, B. Phillips, R. Long, and D. Ferris, "A probabilistic approach to segmentation and classification of neoplasia in uterine cervix images using color and geometric features," in Image Processing, J. M. Fitzpatrick and J. M. Reinhardt, eds., Proc. SPIE 5747, 995–1003 (2005).
14. W. Li and A. Poisson, "Detection and characterization of abnormal vascular patterns in automated cervical image analysis," Lecture Notes in Computer Science: Advances in Visual Computing 4292, 627–636 (2006).
15. S. Yang, J. Guo, P. King, Y. Sriraja, S. Mitra, B. Nutter, D. Ferris, M. Schiffman, J. Jeronimo, and R. Long, "A multi-spectral digital cervigram™ analyzer in the wavelet domain for early detection of cervical cancer," in Image Processing, J. M. Fitzpatrick and M. Sonka, eds., Proc. SPIE 5370, 1833–1844 (2004).
16. H. C. Li, "Regularized color clustering in medical image database," IEEE Trans. Med. Imaging 19, 1150–1155 (2000).
17. H. Palus, Colour Spaces (Chapman and Hall, 1998).
18. G. Wyszecki and W. S. Stiles, Color Science: Concepts and Methods, Quantitative Data and Formulae (Wiley, New York, 1982).
19. S. A. Karkanis, D. K. Iakovidis, D. E. Maroulis, D. A. Karras, and M. Tzivras, "Computer-aided tumor detection in endoscopic video using color wavelet features," IEEE Trans. Inf. Technol. Biomed. 7, 141–152 (2003).
20. G. Paschos, "Perceptually uniform color spaces for color texture analysis: an empirical evaluation," IEEE Trans. Image Process. 10, 932–936 (2001).
21. S. Wolf is preparing a manuscript to be called "Color Correction Matrix for Digital Still and Video Imaging Systems."
22. J. M. Benavides, S. Chang, S. Y. Park, R. Richards-Kortum, N. Mackinnon, C. MacAulay, A. Milbourne, A. Malpica, and M. Follen, "Multispectral digital colposcopy for in vivo detection of cervical cancer," Opt. Express 11, 1223–1236 (2003).
23. A. Milbourne, S. Y. Park, J. L. Benedet, D. Miller, T. Ehlen, H. Rhodes, A. Malpica, J. Matisic, D. Van Niekirk, E. N. Atkinson, N. Hadad, N. Mackinnon, C. MacAulay, R. Richards-Kortum, and M. Follen, "Results of a pilot study of multispectral digital colposcopy for the in vivo detection of cervical intraepithelial neoplasia," Gynecol. Oncol. (2005).
24. W. Li, STI® Medical Systems, 733 Bishop Street, Honolulu, Hawaii 96813, is preparing a manuscript to be called "Acetowhite color feature extraction algorithm for cervical images."


History
Original Manuscript: September 1, 2006
Revised Manuscript: November 27, 2006
Manuscript Accepted: November 29, 2006
Published: December 22, 2006
