Compact real-time birefringent imaging spectrometer

Michael W. Kudenov and Eustace L. Dereniak


Optics Express, Vol. 20, Issue 16, pp. 17973-17986 (2012)
http://dx.doi.org/10.1364/OE.20.017973



Abstract

The design and experimental demonstration of a snapshot hyperspectral imaging Fourier transform (SHIFT) spectrometer is presented. The sensor, which is based on a multiple-image FTS (MFTS), offers significant advantages over previous implementations that used Michelson interferometers. Specifically, its use of birefringent interferometry yields a vibration-insensitive and ultra-compact (15 × 15 × 10 mm³) common-path interferometer, while rapid reconstruction rates are obtained on a graphics processing unit. The SHIFT spectrometer’s theory and experimental prototype are described in detail. Included are reconstruction and spectral calibration procedures, followed by the spectrometer’s validation using measurements of gas-discharge lamps. Lastly, outdoor measurements demonstrate the sensor’s ability to resolve spectral signatures under typical outdoor lighting and environmental conditions.

© 2012 OSA

1. Introduction

Imaging spectroscopy is a vital tool in both biomedical imaging and remote sensing applications [1, 2]. The objective of an imaging spectrometer is to acquire the three-dimensional (3D) datacube (x, y, λ). In an analogous comparison to imaging polarimetry [3], a spectral imaging system can acquire the datacube using the following measurement methods: division of time (DoT), division of image (DoI), division of optical path (DoOP), channeled imaging (CI), and division of aperture (DoA). Each category can be further subdivided by how the instrument spectrally resolves incident light; namely, whether spectra are resolved in the frequency domain (e.g., diffraction gratings) or in the time domain (e.g., two-beam interference followed by a Fourier transformation). In DoT spectrometers, spatial or spectral scanning is used to time-sequentially resolve the datacube. Examples include whiskbroom, pushbroom, and filtered cameras [4]. Conversely, DoI, DoOP, CI, and DoA spectrometers attempt to acquire the datacube within a single snapshot.

Snapshot DoI spectrometers, in which the image of a scene is divided in some fashion before spectral characterization, are the most common. For instance, DoI spectrometers are found in all consumer color digital cameras in the form of a Bayer filter pattern [5]. This concept has been successfully extended to more complex Bayer patterns with higher spectral resolution [6]; however, such sensors suffer from spatial aliasing due to their low spatial sampling. Other examples of DoI spectroscopy include R. Kester and T. Tkaczyk’s image mapping technique, which redirects and disperses alternating columns of pixels into separate imaging pupils [7]. Additionally, A. Bodkin’s Hyperpixel Array Imager (HPA) disperses the image of a scene after it is transmitted through an aperture mask [8]. Similarly, D. Brady’s Coded Aperture Snapshot Spectral Imager (CASSI) uses a mask, with random pixel-sized obscurations, at an intermediate image [9]. After dispersing the scene and aperture mask through a prism, the raw data are inverted through an iterative reconstruction process. A similar technique to CASSI is realized in Computed Tomographic Imaging Spectroscopy (CTIS), in which the intermediate image of a scene is dispersed into highly multiplexed projections [10, 11]. Demultiplexing, and the subsequent reconstruction of the datacube, is achieved with iterative reconstruction algorithms. Last is the four-dimensional imaging spectrometer (4D-IS) of N. Gat and D. Fletcher-Holmes, which uses a coherent two-dimensional (2D) to one-dimensional (1D) fiber array [12, 13]. An image, focused on the 2D end of the fiber array, is transmitted to the 1D side, where it is collimated, dispersed, and reimaged. This enables each fiber’s spectrum to be extracted and re-arranged into a datacube.

Meanwhile, DoOP spectrometers divide the incoming optical path into multiple paths using beamsplitters. One example includes the use of a dichroic beamsplitter and two cameras [14]. However, such systems are complex due to the need to align and calibrate separate cameras and objective lenses. To avoid these complications, A. Harvey’s image replication imaging spectrometer (IRIS) can be classified as a DoOP system using a single camera [15]. It achieves this by using a series of polarization beamsplitters (e.g., Wollaston prisms) and performs spectral selection based on the principle of a generalized Lyot filter. Meanwhile, CI instruments use optical interference to generate spatial carrier frequencies [16]. By using high-order gratings in a Sagnac interferometer, different spectral passbands can be modulated onto unique carrier frequencies. Lastly, DoA instruments divide an optical system’s exit pupil into different optical paths. Such instruments use one camera and a lens array to re-image unique spectra into corresponding sub-images [17].

The Snapshot Hyperspectral Imaging Fourier Transform (SHIFT) spectrometer described here lies within the DoA class of instrumentation. By incorporating birefringent polarization optics, the SHIFT spectrometer achieves both compactness and vibration insensitivity. In this paper, a theoretical and experimental description of the SHIFT spectrometer is presented. In Section 2, we describe the multiple-image Fourier transform spectrometer (MFTS), on which the SHIFT spectrometer is based. Section 3 describes the SHIFT spectrometer’s theory and Section 4 overviews our theoretical simulations in Zemax. Section 5 overviews the experimental prototype and Section 6 details our laboratory calibration. Lastly, Section 7 presents data acquired with the laboratory prototype for the sensor’s spectral validation.

2. Multiple-image Fourier transform spectrometer

Several imaging interferometers have been developed. However, many require either spatial scanning (e.g., a Sagnac) or temporal scanning (e.g., an imaging Michelson). One unique snapshot imaging interferometer, first demonstrated by Akiko Hirai [18], is depicted in Fig. 1.

Fig. 1. The multiple-image Fourier transform spectrometer (MFTS), as originally described by Hirai. Acronyms: Beamsplitter (BS), Focal Plane Array (FPA), Mirror 1 (M1), and Mirror 2 (M2).

It is referred to as a multiple-image Fourier transform spectrometer (MFTS), as its underlying principle of operation relies on creating an array of sub-images using a lens array. These sub-images are then transmitted into a Michelson interferometer with a tilted mirror. Since the fringes in this interferometer are localized at the mirrors, re-imaging the sub-images onto the mirrors makes the sub-images coincident with the interference. Reimaging the sub-images with an objective lens re-localizes both the fringes and the sub-images onto a focal plane array (FPA), thereby enabling the measurement of a 3D interferogram cube (xi, yi, OPD) within a single snapshot. Fourier transformation of the interferogram cube, along the OPD axis, enables the spectrum at each pixel to be extracted.

While the MFTS is a snapshot instrument, it suffers from several disadvantages that make it difficult to implement in a fieldable sensor. These include:

  • 1. Vibration sensitivity due to the two non-common optical paths.
  • 2. Sensitivity to the misalignment of either mirror.
  • 3. A requirement for large focal-ratio imaging lenses.

These issues can be resolved by replacing the Michelson interferometer in the original MFTS with a common-path birefringent interferometer. Specifically, the Michelson interferometer and lens array can be substituted with an image-plane birefringent interferometer and lenslet array, respectively.

3. Snapshot hyperspectral imaging Fourier transform (SHIFT) spectrometer

The SHIFT spectrometer incorporates the birefringent polarization interferometer (BPI) depicted in Fig. 2(a). The interferometer contains two Nomarski prisms (NP), NP1 and NP2, each consisting of two birefringent crystal prisms with wedge angle α. Note that one of the fast axes in each Nomarski prism is tilted with respect to the y-axis by an angle β. This enables a real fringe localization (FL) plane to be formed outside of the prism [19]. Also included is a half-wave plate (HWP) between NP1 and NP2. Orienting the HWP at 45° enables a 90° rotation of NP2’s polarization eigenmodes. Since NP2 is rotated 180° with respect to NP1, the localization plane is compensated to lie within the xy plane and is subsequently coincident with a focal plane array (FPA). The separation between the two orthogonally polarized rays versus x, at y = 0 and a z coordinate that lies behind NP2, is depicted in Fig. 3. Without the HWP, fringes are only localized at x = −2.6 mm. However, inclusion of the HWP enables the fringes to be localized within the xy plane.

Fig. 2. (a) A BPI based on Nomarski prisms (NP). The fringe localization (FL) plane is compensated and coincident with the focal plane array (FPA). (b) The rotated generating polarizer (G), NPs, and analyzer (A) are placed on the FPA behind an N × M lenslet array.

Fig. 3. Ray tracing results of a Nomarski prism with and without inclusion of the HWP.

Light is analyzed into a coherent polarization state by a linear (generating) polarizer (G), and interference fringes are created along the localization plane by the linear analyzer (A), both of which are oriented at 45° with respect to the prisms' polarization eigenmodes. This compact assembly generates a linear OPD versus x between the two orthogonal polarization states. Using the prism’s apex angle α, the OPD of a single Nomarski prism can be expressed as

$$\mathrm{OPD} = 2Bx\tan(\alpha), \tag{1}$$
where B is the birefringence of the crystal, defined as the difference between the extraordinary (ne) and ordinary (no) indices of refraction, B = (ne − no), and α is the wedge angle. It should be mentioned that this expression is approximate for small α [20]. Placing the interferometer onto an FPA creates a spatially dependent OPD similar to that observed in imaging polarimetry [21, 22]. A rotation of this OPD, which was generated by tilting the mirror in both x and y in Hirai’s original MFTS, is replicated by rotating the prisms, waveplate, and polarizers by a small angle δ about the z axis, per Fig. 2(b). Finally, image replication is incorporated using a lenslet array in front of the interferometer and FPA.
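As a concrete illustration of Eq. (1) and the small rotation δ, the following Python sketch evaluates an idealized OPD map across the FPA for a single rotated Nomarski prism. The quartz birefringence value, the sign convention of the rotated coordinate, and the use of the design values quoted later in the paper (α = 3.15°, δ = 6.3°, 7.4 μm pixels) are assumptions for illustration, not a reproduction of the authors' Zemax model.

```python
import numpy as np

# Assumed values (illustrative; the paper does not list B explicitly):
B     = 0.0092             # quartz birefringence (ne - no), visible-band estimate
alpha = np.deg2rad(3.15)   # prism wedge angle
delta = np.deg2rad(6.3)    # small rotation of the prism assembly about z
pitch = 7.4e-6             # FPA pixel pitch [m]

nx, ny = 2048, 2048        # FPA format
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

# Rotate the OPD gradient by delta (sign convention assumed), then apply Eq. (1).
x_rot = X * np.cos(delta) - Y * np.sin(delta)
opd = 2.0 * B * np.tan(alpha) * x_rot      # OPD in meters

print("OPD step per pixel: %.2f nm" % (2.0 * B * np.tan(alpha) * pitch * 1e9))
print("OPD range across FPA: %.2f um" % ((opd.max() - opd.min()) * 1e6))
```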

The N × M sub-images are formed coincident with both the FPA and the localization plane, where N and M are the number of lenslets along y and x, respectively. An example of the OPD versus spatial position, relative to each sub-image, is depicted in Fig. 4(a).

Fig. 4. (a) Linear OPD (μm) as a function of FPA position in pixels (pix). (b) Illustration of the construction of the 3D datacube with dimensions (xi, yi, OPD) from the sampled sub-images.

Notable is the large slope in OPD along x, produced predominantly by the wedge angle α, and the relatively small slope along y, which is realized through the small rotation δ. The angle δ can be calculated by

$$\delta = \tan^{-1}(1/M). \tag{2}$$

This rotation enables each sub-image to view sequentially larger values of OPD. An analogy to the original MFTS, depicted previously in Fig. 1, is that the wedge angle, α, replicates the tilt θ of mirror M1 along x. Similarly, the rotation of the wedges, δ, replicates the tilt φ of mirror M1 along y. To emphasize this, the sub-images in Fig. 4(a) are numbered 1-16, with images 1 and 16 representing the most negative and positive OPD samples, respectively [23]. Consequently, each sub-image samples a different “slice” of the 3D interferogram cube, as depicted in Fig. 4(b), which has dimensions (xi, yi, OPD); here, xi and yi are the spatial coordinates within the sub-images. Thus, an interferogram, and its corresponding spectrum, can be calculated at each spatial location within the scene. Performing the required post-processing calculations produces the datacube (xi, yi, λ).
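The re-mapping of sub-images into an interferogram cube, and the subsequent per-pixel Fourier transform, can be sketched as follows. This is a simplified illustration only: it assumes ideal, already-registered sub-images laid out on a regular grid and a uniformly sampled OPD axis ordered by sub-image index, whereas in practice the centroid, registration, and OPD calibrations of Section 6 are required.

```python
import numpy as np

def build_datacube(frame, N, M, sub_h, sub_w):
    """Stack N x M registered sub-images into an (sub_h, sub_w, N*M) interferogram
    cube ordered from the most negative to the most positive OPD sample, then
    Fourier transform along the OPD axis to recover magnitude spectra."""
    cube = np.empty((sub_h, sub_w, N * M))
    for n in range(N):          # sub-image row index
        for m in range(M):      # sub-image column index
            sub = frame[n * sub_h:(n + 1) * sub_h, m * sub_w:(m + 1) * sub_w]
            cube[:, :, n * M + m] = sub   # OPD increases with sub-image index (assumed ordering)
    # Remove the DC bias, then transform along the OPD axis.
    interferogram = cube - cube.mean(axis=2, keepdims=True)
    spectra = np.abs(np.fft.rfft(interferogram, axis=2))
    return cube, spectra

# Example with synthetic data: a 10 x 10 lenslet array of 204 x 204 pixel sub-images
frame = np.random.rand(2040, 2040)
cube, spectra = build_datacube(frame, N=10, M=10, sub_h=204, sub_w=204)
print(cube.shape, spectra.shape)
```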

4. Theoretical modeling

In order to design and fabricate the SHIFT spectrometer, an interference model was developed in Zemax and Matlab using the Matlab-Zemax Direct Data Exchange (MZDDE) toolbox. This toolbox enabled polarization ray tracing to be conducted by Zemax. Matlab subsequently used Zemax’s output to calculate the interference pattern and shape of the fringe localization plane.

4.1 Fringe localization

For spatially and spectrally incoherent illumination, the localization plane of a Nomarski prism is located where the extraordinary (e) and ordinary (o) rays intersect [19]. Since the e and o rays change between NP1 and NP2, we will refer to the polarization states as either Ex or Ey. The basic interferometer’s Zemax configuration is depicted in Fig. 5, sans the lenslet array. It consists of two quartz Nomarski prisms, an ideal HWP, an ideal linear analyzer (A), and the FPA’s coverglass (CG). The CG’s glass type (BK7), its thickness, and the rear vertex distance to the FPA (γ = 0.69 mm) were obtained from the FPA’s datasheet (Kodak KAI-04022). Relocalizing the fringe localization plane [20] is achieved by tilting the fast axis of one prism, within each Nomarski prism, by an angle β with respect to the y axis.

Fig. 5. Side profile of the birefringent interferometer without the lenslet array. All dimensions are in mm and α = 3.15°.

Calculating α was accomplished with Eq. (1). Applying Nyquist-Shannon sampling theory at a cutoff wavelength of λc = 400 nm [24] yielded an optimal angle of α = 3.15°. Next, β was selected using the simulation. The Ex and Ey rays’ separation on the FPA (ΔExEy) versus β, for γ = 0.69 mm, was calculated. These results yielded ΔExEy = 0 mm for β = 16.2°.

4.2 SHIFT model

The SHIFT spectrometer can be modeled by incorporating a lenslet in front of the interferometer analyzed previously in Fig. 5. A two-dimensional side profile of the SHIFT spectrometer, from Zemax, is depicted in Fig. 6 for one lenslet in the array. The lenslet’s radius of curvature (R) is 2.4 mm, its clear aperture is 1 mm, and it is modeled with BK7 glass. This creates a 4.6 mm focal length lenslet with a focal ratio of F/4.6. Optimization is completed by focusing the lenslet onto the FPA through adjustment of the thickness t2. For the final design, t2 = 0.25 mm.

Fig. 6. Side profile of the SHIFT spectrometer for one lenslet in the array. The x and y polarization states are shown as blue and green lines, respectively, for the two Zemax configurations used in the simulations.

Since the interferometer's retardance varies as a function of incidence angle, it is expected that the fringe contrast will depend on the lenslet's focal-ratio. However, analysis of this effect is beyond the scope of the current paper.

4.3 Interference simulations

Using the aforementioned model, interference can be simulated by:

  • 1. Translating the lenslet to a given area on the prism and FPA. This is simulated by changing the thicknesses d1 and d2.
  • 2. Tracing rays for each pixel across a ±5° field of view. This is achieved by tracing one ray, from the entrance pupil to the image plane, for each pixel on the detector. Outputs from this ray trace are the complex electromagnetic field components along x (Ex) and y (Ey). Here, we only trace the chief ray because the error between the marginal and chief rays' calculated interference intensities is relatively small, at 2.8%.
  • 3. Calculating the intensity of the two-beam interference at each pixel using I = |Ex + Ey|².
  • 4. Repeating the process for all N × M lenslets and FPA pixels.

It should be mentioned that the linear polarization analyzer’s function is not incorporated into the Zemax model. Rather, Matlab is used to apply the linear polarizer’s Jones matrix (step 3) after the e and o rays’ complex fields have been acquired.
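A minimal sketch of steps 2-3, applying a 45° linear-analyzer Jones matrix to the ray-traced complex fields and forming the two-beam interference intensity, is shown below. The field arrays stand in for the Zemax polarization ray-trace output returned through MZDDE; their values here are synthetic.

```python
import numpy as np

def analyzed_intensity(Ex, Ey, analyzer_deg=45.0):
    """Apply an ideal linear-polarizer Jones matrix to per-pixel complex field
    components (Ex, Ey) and return the two-beam interference intensity.

    Ex, Ey: complex arrays from the polarization ray trace (synthetic here)."""
    t = np.deg2rad(analyzer_deg)
    # Jones matrix of an ideal linear polarizer oriented at angle t
    J = np.array([[np.cos(t)**2,          np.sin(t) * np.cos(t)],
                  [np.sin(t) * np.cos(t), np.sin(t)**2]])
    Ex_out = J[0, 0] * Ex + J[0, 1] * Ey
    Ey_out = J[1, 0] * Ex + J[1, 1] * Ey
    # Step 3: I = |Ex + Ey|^2 after the analyzer projects both onto a common axis
    return np.abs(Ex_out + Ey_out)**2

# Synthetic example: equal-amplitude orthogonal fields with a linear phase ramp (an OPD)
phase = np.linspace(0, 20 * np.pi, 512)
I = analyzed_intensity(np.ones_like(phase, dtype=complex), np.exp(1j * phase))
print(I.min(), I.max())   # unnormalized fringes oscillating between 0 and 4
```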

Simulating the interference pattern is of interest for three values of γ: (1) the nominal value, γ = 0.69 mm; (2) a small positive error, γ = 0.94 mm; and (3) a small negative error, γ = 0.44 mm. Note that the error in γ is within ±0.25 mm, which is equal to the tolerance of γ specified in the camera manufacturer’s datasheet. Simulating the interference for an N = 3, M = 3 lenslet array produces the data depicted, for each of the aforementioned cases, in Fig. 7.

Fig. 7. Simulated interference pattern for 3 × 3 lenslets. (a) γ = 0.44 mm, (b) γ = 0.69 mm, and (c) γ = 0.94 mm.

These data demonstrate that defocus error influences the fringes’ relative magnification within each of the array’s sub-images. This changes the fringes’ effective frequency, which creates a phase shift between adjacent sub-images. The phase shift is depicted more clearly in the x and y cross-sections provided in Fig. 8(a) and Fig. 8(b), respectively. Note that the axial position of each sub-image (i.e., xi = yi = 0) remains at a constant OPD; however, the off-axis OPD changes due to the magnification error. Furthermore, the magnification-induced frequency shift is linear as a function of defocus, as illustrated in the Zemax data provided in Fig. 8(c). Errors induced by the optical magnification are resolved using a least-squares fitting procedure. This phase shift was observed experimentally in our prototype sensor; its correction and calibration are discussed in further detail in Section 6.

Fig. 8. One-dimensional cross-sections of the two-dimensional data, depicted previously in Fig. 7, across the (a) columns and (b) rows. The light gray lines indicate boundaries between adjacent sub-images, and cross-sections were extracted along the red dashed lines per Fig. 7. (c) Spatial frequency, in cycles per mm, as a function of defocus.

5. Laboratory prototype of the SHIFT spectrometer

A schematic of the prototype SHIFT spectrometer is depicted in Fig. 9(a). Both prisms are fabricated from quartz birefringent crystal with α = 3.15° and β = 16.2°, and they have been rotated by an angle δ = 6.3°. The polymer achromatic HWP has a retardance variation of ±0.008 waves over 420-680 nm, and the analyzing and generating polarizers are polymer and wire-grid, respectively. Meanwhile, the 10 × 10 lenslet array, which is positioned behind a 10 × 10 aperture array, is made from silica glass; each lenslet has a 2.4 mm radius of curvature and a 4.6 mm focal length. The 50 mm focal length objective and collimating lenses are identical, with focal ratios of F/1.8, and the FPA is a Kodak Megaplus camera with a 2048 × 2048 element array and 7.4 × 7.4 μm pixels. A view of the sensor on the benchtop is depicted in Fig. 9(b).

Fig. 9. (a) The SHIFT spectrometer’s optical schematic. Acronyms: Objective Lens (OL), Collimating Lens (CL), Aperture Array (AA), and Lens Array (LA). (b) Benchtop image.

6. Calibration

Calibration consisted of the procedures detailed in Sections 6.1 through 6.5.

6.1 Flatfield calibration

The aperture array was fabricated from Delrin plastic with a CNC milling machine. Unfortunately, milling the part created burrs within some of the apertures. These burrs create obscurations within the pupil, causing some lenslets to produce a dimmer image than others. To remove intensity non-uniformity caused by mismatched aperture obscurations, cos⁴ roll-off, and vignetting, a flatfield normalization procedure was utilized. The flatfield was taken with the generating polarizer removed while the system stared into an integrating sphere. This enabled a uniform and unpolarized field to be viewed without interference fringes. Dividing all subsequent data by this flatfield image normalized the intensity.
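A minimal sketch of this normalization is shown below. The dark-corrected flatfield frame and the small guard constant against near-zero (dead) pixels are assumptions for illustration; the paper does not specify these details.

```python
import numpy as np

def flatfield_correct(data, flat, eps=1e-6):
    """Normalize raw frames by the unpolarized, fringe-free flatfield frame.

    data: raw measurement frame(s)
    flat: flatfield recorded while staring into the integrating sphere
    eps:  small constant to avoid dividing by near-zero (dead) pixels"""
    flat_norm = flat / flat.mean()           # preserve the overall signal level
    return data / np.maximum(flat_norm, eps)

# Example with synthetic frames
flat = np.random.uniform(0.5, 1.0, (2048, 2048))
raw  = flat * 1000.0                         # a perfectly uniform scene seen through the optics
corrected = flatfield_correct(raw, flat)
print(corrected.std() / corrected.mean())    # ~0 after correction
```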

6.2 Image registration

A target containing high spatial frequency information is viewed with the spectrometer in order to calculate the image registration coefficients. However, illuminating the target with white light produces white-light interference fringes across the sub-images. In order to remove these fringes for accurate calculation of the image registration coefficients, the generating polarizer (G) must be rotated by ±45°. This makes the incident polarization state parallel to one of the interferometer's eigenvectors and removes the polarization interference fringes. Each sub-image's centroid is then calculated using a correlation algorithm. These centroids are used to extract the sub-images into an unregistered array of images, which are then spatially registered to one of the sub-images near the center of the lenslet array. Image registration is accomplished with a sub-pixel spatial registration algorithm in Matlab [26]. Both the centroids and the image registration coefficients are applied to all subsequent measurements.
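The registration step can be sketched with a simple sub-pixel phase-correlation approach, shown below. Note that scikit-image's phase_cross_correlation is used here as a stand-in for the hierarchical model-based method of Ref. [26]; it estimates only translational offsets between each fringe-free sub-image and the reference sub-image near the center of the array.

```python
import numpy as np
from skimage.registration import phase_cross_correlation
from scipy.ndimage import shift as nd_shift

def register_subimages(subimages, ref_index):
    """Estimate sub-pixel translations of each fringe-free sub-image relative to
    a central reference, and return the shifted (registered) stack plus offsets.

    subimages: array of shape (K, H, W) extracted at the lenslet centroids
    ref_index: index of the reference sub-image near the array center"""
    ref = subimages[ref_index]
    registered, offsets = [], []
    for img in subimages:
        dyx, _, _ = phase_cross_correlation(ref, img, upsample_factor=20)
        offsets.append(dyx)
        registered.append(nd_shift(img, dyx, order=3))  # cubic resampling of the shift
    return np.stack(registered), np.array(offsets)

# Usage (hypothetical data): 100 extracted sub-images registered to a central lenslet
# reg, offs = register_subimages(subimages, ref_index=44)
```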

6.3 Optical path difference calibration

Each pixel’s OPD can be calculated by first filling the sensor’s FOV with the exit port of an integrating sphere. Two interferogram images are then recorded: one with the sphere illuminated by a helium neon (HeNe) laser and the other with a white-light tungsten halogen lamp. By recording the interference fringes generated by the white-light source, the zero-order fringe can be established and the actual tilt angle (δ) of the Nomarski prisms measured. The white-light source also yields the pixel locations of zero OPD, to which a linear polynomial is fitted. This polynomial enables the zero-OPD x-pixel location, x0, to be calculated at the y position that contains each lenslet’s centroid, as illustrated in Fig. 10(a).

Fig. 10. Interference fringes generated by viewing the exit port of an integrating sphere illuminated by (a) a white-light tungsten halogen lamp and (b) a HeNe laser.

The OPD as a function of the FPA's Cartesian coordinates, OPD2(x, y), can be described using the 2D functional form
$$\mathrm{OPD}_2(x,y)_{n,m} = 4B\tan(\alpha_e)\left[(x + x_0)\cos(\delta) - y\sin(\delta)\right] + \frac{n\,\Delta\phi_n + m\,\Delta\phi_m}{2\pi\sigma_c}, \tag{3}$$
where δ is the rotation or tilt angle of the interferometer, x0 is the x offset of the zero-OPD reference position, B is the quartz birefringence at the helium-neon laser's wavenumber of σc = 1.58 × 10⁶ m⁻¹, αe is the prism's effective apex angle, Δϕn and Δϕm denote the effective phase change between adjacent rows and columns, respectively, and n and m are integers denoting a sub-image's row and column indices, respectively. Note that (n, m) spans integers from (0, 0) to (N−1, M−1) and that (n, m) = (0, 0) is defined for the sub-image with the most negative OPD. A theoretical intensity pattern can be calculated using OPD2 by

$$I(x,y) = \tfrac{1}{2}\left[1 + \cos\!\left(2\pi\sigma_c\,\mathrm{OPD}_2(x,y)_{n,m}\right)\right]. \tag{4}$$

To calculate the OPD at each pixel, Eqs. (3) and (4) were used to generate a theoretical interferogram. This was then fit to the calibration laser’s interference image, per Fig. 10(b), using an iterative least-squares fitting procedure. Known quantities included B, σc, and x0, while Δϕn, Δϕm, δ, and αe were treated as fit parameters. Initial estimates of Δϕn and Δϕm were obtained by comparing fringes of equal OPD at adjacent sub-image boundaries, and an initial estimate of δ was obtained from the linear polynomial fitted to the zero-order fringe locations. Finally, an initial estimate of the fringe frequency was determined by setting αe to the prism’s nominal value α.

Performing the fitting procedure yielded Δϕn=84.1°, Δϕm=0.63°, δ=6.4°, and αe=2.76°. While αe is 0.41° lower than the nominal value of α, αe is implicitly compensating for the magnification factor. Since the fringes are magnified (i.e., γ exceeds the nominal value), the effective prism angle must be decreased to reduce the frequency. Once the OPD is established for each pixel on the FPA, a cubic interpolation is used on the extracted interferograms to re-sample the OPD onto a uniformly spaced sampling grid. This step is required as standard Fast Fourier Transform (FFT) algorithms assume uniformly sampled data.
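A condensed sketch of this fit and the subsequent resampling is given below, using scipy.optimize.least_squares for the iterative fit of Eqs. (3)-(4) and a cubic interpolant to place each extracted interferogram on a uniform OPD grid. The fringe image, the sub-image index maps (n_idx, m_idx), the assumed birefringence value, and the initial parameter vector are placeholders, and the sign convention inside Eq. (3) follows the reconstruction above.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.interpolate import interp1d

SIGMA_C = 1.58e6   # HeNe calibration wavenumber [1/m]
B_C     = 0.0092   # assumed quartz birefringence at SIGMA_C (not given in the paper)

def opd2(params, X, Y, n_idx, m_idx, x0):
    """Eq. (3): OPD over the FPA for candidate (alpha_e, delta, dphi_n, dphi_m)."""
    alpha_e, delta, dphi_n, dphi_m = params
    ramp = 4.0 * B_C * np.tan(alpha_e) * ((X + x0) * np.cos(delta) - Y * np.sin(delta))
    return ramp + (n_idx * dphi_n + m_idx * dphi_m) / (2.0 * np.pi * SIGMA_C)

def residuals(params, X, Y, n_idx, m_idx, x0, fringe_img):
    """Difference between Eq. (4) and the measured HeNe fringe image."""
    model = 0.5 * (1.0 + np.cos(2.0 * np.pi * SIGMA_C * opd2(params, X, Y, n_idx, m_idx, x0)))
    return (model - fringe_img).ravel()

def fit_opd(X, Y, n_idx, m_idx, x0, fringe_img, p0):
    """Iterative least-squares fit; p0 = initial (alpha_e, delta, dphi_n, dphi_m)."""
    return least_squares(residuals, p0, args=(X, Y, n_idx, m_idx, x0, fringe_img)).x

def resample_uniform(opd_samples, interferogram, n_out):
    """Cubic re-sampling of a non-uniformly sampled interferogram onto a uniform
    OPD grid, as required by a standard FFT."""
    grid = np.linspace(opd_samples.min(), opd_samples.max(), n_out)
    return grid, interp1d(opd_samples, interferogram, kind='cubic')(grid)
```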

6.4 Spectral calibration

For a birefringent Fourier transform spectrometer (FTS), dispersion in the birefringence must be compensated when performing the spectral calibration [27]. Spectral calibration first required that the spectral wavenumber axis, σ1, be calculated assuming that OPD2(x, y, σc) was achromatic. This is standard in all Fourier transform spectroscopy, as dictated by Nyquist-Shannon sampling theory [24]. However, σ1 requires adjustment due to the crystal’s birefringence dispersion. A birefringent prism’s OPD can be expressed as the product of an achromatic OPD and a birefringence ratio,

$$\mathrm{OPD} = \mathrm{OPD}_c\,\frac{B(\sigma_1)}{B(\sigma_c)}, \tag{5}$$
where OPDc is the optical path difference calculated at the calibration laser’s wavenumber. After Fourier transformation of the interferogram, the birefringence ratio remains as a multiplicative factor to the wavenumber σ1. This wavenumber axis can be considered an initial approximation to the actual wavenumber axis, σ2, where σ1 is only correct at σc. Therefore, σ1 must be multiplied by the birefringence ratio’s inverse to obtain
$$\sigma_2 = \sigma_1\,\frac{B(\sigma_c)}{B(\sigma_1)}. \tag{6}$$
This procedure removes the birefringence dispersion’s influence on the spectral calibration.
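A brief sketch of this correction follows. The quartz birefringence dispersion B(σ) is represented here by a simple placeholder polynomial chosen only to show the mechanics of Eq. (6); a real calibration would use published quartz dispersion data, which are not reproduced in this paper.

```python
import numpy as np

def quartz_birefringence(sigma):
    """Placeholder dispersion model B(sigma), sigma in 1/m. Illustrative only;
    substitute published quartz dispersion data in practice."""
    wl_um = 1e6 / sigma                        # wavelength in micrometers
    return 0.0090 + 0.0002 / wl_um**2          # assumed weak normal dispersion

def corrected_wavenumber_axis(n_samples, opd_step, sigma_c=1.58e6):
    """Build the achromatic FFT wavenumber axis sigma_1, then rescale it by
    B(sigma_c)/B(sigma_1) per Eq. (6)."""
    sigma_1 = np.fft.rfftfreq(n_samples, d=opd_step)   # cycles per meter of OPD
    sigma_1 = np.where(sigma_1 == 0, 1e-12, sigma_1)   # avoid divide-by-zero at DC
    return sigma_1 * quartz_birefringence(sigma_c) / quartz_birefringence(sigma_1)

# Example: 168 OPD samples at a 0.2 um spacing (Nyquist-sampled for a 400 nm cutoff)
sigma_axis = corrected_wavenumber_axis(168, 0.2e-6)
print(sigma_axis.shape, sigma_axis[-1])   # highest corrected wavenumber [1/m]
```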

6.5 Reconstruction algorithms in the graphics processing unit

The reconstruction algorithms, which were initially developed in Matlab, were converted to run on the graphics processing unit (GPU) using Nvidia CUDA. Reconstruction times were reduced from 2 minutes per datacube on an Intel Core 2 Duo T7700 CPU to 400 ms per datacube on an Nvidia GeForce 445M GPU. We estimate that the reconstruction time per datacube can be reduced to 82 ms (or 12 datacubes per second) by running the current code on a more powerful Nvidia Tesla M2090 GPU. Reducing the reconstruction time beyond this could be achieved by further optimizing the GPU-based reconstruction algorithms.
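The authors' CUDA implementation is not listed in the paper. As a rough illustration of how the per-pixel Fourier transforms map onto a GPU, the following sketch uses the CuPy library (an assumption, not the authors' toolchain) to batch the FFT of an interferogram cube in GPU memory.

```python
import numpy as np
import cupy as cp   # assumes a CUDA-capable GPU and an installed CuPy build

def reconstruct_on_gpu(cube_host):
    """Move an (H, W, n_opd) interferogram cube to the GPU, remove the DC bias,
    and compute magnitude spectra along the OPD axis with a batched FFT."""
    cube = cp.asarray(cube_host, dtype=cp.float32)
    cube -= cube.mean(axis=2, keepdims=True)       # suppress the DC term
    spectra = cp.abs(cp.fft.rfft(cube, axis=2))    # batched FFT over all pixels
    return cp.asnumpy(spectra)                     # copy the result back to the host

# Example: one 204 x 204 x 100 cube, as produced by the 10 x 10 lenslet array
spectra = reconstruct_on_gpu(np.random.rand(204, 204, 100).astype(np.float32))
print(spectra.shape)
```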

7. Results

Three emission sources were imaged to verify the SHIFT spectrometer’s spectral calibration. These included mercury (Hg), neon (Ne), and high-pressure sodium (Na) gas discharge lamps. Due to the sodium lamp’s high intensity, its light was viewed after reflecting from a Spectralon diffuser. The diffuser contained multiple panels with reflectivities of 99%, 80%, 60%, 50%, and 20%. The measured spectrally band-integrated image of the scene is depicted in Fig. 11.

Fig. 11. Spectrally band-integrated image of the scene, from the SHIFT spectrometer, containing the gas discharge lamps. The numbers identify the type of emission source as follows: 1 - Na; 2 - Ne; 3 - Hg.

Recovered spectra from the SHIFT spectrometer were compared to spectra acquired using an Ocean Optics USB2000 spectrometer with a full-width spectral resolution of approximately 25 cm⁻¹ at 633 nm. This comparison is depicted in Fig. 12(a), while spectral slices of the 2D scene are provided in Fig. 12(b). For the purposes of this comparison, the Ocean Optics spectrometer's spectra were down-sampled to a spectral resolution equal to that of the SHIFT spectrometer (600 cm⁻¹). The spectra were then normalized to their average intensity for qualitative comparison. While these data have not been corrected for the relative spectral response between the two spectrometers, the spectral locations of the discharge lamps' emission features demonstrate good agreement.

Fig. 12. (a) Spectral data from the SHIFT (blue solid line) and Ocean Optics (black dashed lines) spectrometers. (b) 2D spectral slices from the 3D datacube at various wavelengths (nm).

Additional test data from the SHIFT proof-of-concept sensor have been collected for outdoor targets. One outdoor scene, of several moving automobiles at a traffic light intersection, is depicted in Fig. 13(a). These data were acquired with an integration time of 1/40th of a second in the late afternoon on a clear day. Notable in the grass's spectrum is the increased vegetation signal past 714 nm due to chlorophyll.

Fig. 13. (a) Red, green, and blue composite image generated using the SHIFT spectrometer’s data. (b) Full-resolution spectra for a traffic light, car, and grass.

7.1 Noise comparison: SHIFT versus spectral filters

An FTS operating in the visible spectrum does not benefit from the multiplex advantage. Rather, for shot-noise-limited detectors, it suffers from a multiplex disadvantage [28]. This disadvantage means that the (white) photon noise is uniformly distributed throughout the bright (e.g., emission) and dark (e.g., absorption) features of a measured spectrum, whereas in a comparable grating spectrometer, photon noise is localized only to the spectral measurement's bright features. Therefore, an important question to answer is how the SHIFT spectrometer compares to other methods of spectral measurement.

While a thorough comparison is beyond the scope of the current publication, a simple experiment can be done by measuring the SHIFT spectrometer's signal-to-noise ratio (SNR) at λ = 633 nm. This can then be compared to a “fringeless” measurement, imaged through a 633 nm bandpass filter, of the same scene using the same illumination conditions, optics, and camera. Here, “fringeless” refers to acquiring an image with the generating polarizer removed from the system. The SHIFT spectrometer's SNR was measured by performing the following procedure:

  • 1. A uniformly illuminated scene was imaged. The scene was generated using the exit port of an integrating sphere, and the sphere was illuminated by a tungsten halogen lamp.
  • 2. Thirty time sequential images were acquired at an integration time of 80 ms. All frames were reconstructed using the aforementioned post-processing techniques. The full-width spectral resolution of the reconstructions is 600 cm−1.
  • 3. The average signal-to-noise ratio was calculated as SNR = m_SS/σ_SS, where m_SS and σ_SS are the mean value and standard deviation of the measurement versus time. Note that the SNR was calculated independently at 100 spatial locations within a 20 × 20 pixel area before the SNR results were averaged. This sampled region was centered in the middle of the spectrometer's field of view.

Meanwhile, acquisition of the 633 nm bandpass filter comparison data included:

  • 1. Imaging the same uniform scene as previously described in (1). The only difference is inclusion of the 633 nm bandpass filter in place of the generating polarizer (G) per Fig. 9(a).
  • 2. Thirty frames of data were measured time sequentially with the same integration time of 80 ms. The filter has a full-width spectral bandwidth of 20 nm, yielding a spectral resolution of 475 cm−1.
  • 3. The average signal-to-noise ratio was established by calculating the SNR at the central pixel behind each lenslet in the 10 × 10 lenslet array. Similar to step (3) above, the SNR was calculated as SNR = m_SS/σ_SS. This yielded 100 independent SNR measurements that were subsequently averaged (see the numerical sketch after this list).
  • 4. In a realistic filtered camera, the analyzing polarizer would not be present. Therefore, the SNR was divided by the analyzer's transmission of 40.5%. Furthermore, the spectral resolution was also scaled from 475 cm⁻¹ to 600 cm⁻¹ by multiplying the measured SNR by their ratio of 1.26.
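A numerical sketch of both SNR estimates, following the two procedures above, is given below. The frame stacks are placeholders for the thirty 80 ms exposures; the analyzer-transmission and bandwidth-ratio corrections are applied as described in step 4.

```python
import numpy as np

def temporal_snr(stack):
    """SNR = mean / standard deviation of each pixel's signal versus time.

    stack: array of shape (n_frames, n_locations) of repeated measurements."""
    return stack.mean(axis=0) / stack.std(axis=0, ddof=1)

# --- SHIFT measurement: 30 reconstructed 633 nm values at 100 central pixels (placeholder data) ---
shift_stack = np.random.normal(1000.0, 12.0, (30, 100))
snr_shift = temporal_snr(shift_stack).mean()

# --- Filtered-camera measurement: 30 frames, central pixel behind each lenslet (placeholder data) ---
bp_stack = np.random.normal(1000.0, 25.0, (30, 100))
snr_bp = temporal_snr(bp_stack).mean()
snr_bp /= 0.405            # remove the analyzer's 40.5% transmission (step 4)
snr_bp *= 600.0 / 475.0    # scale the 475 cm^-1 resolution to 600 cm^-1 (factor 1.26)

print("SNR_S  ~ %.1f" % snr_shift)
print("SNR_BP ~ %.1f" % snr_bp)
```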

Results from this procedure yielded an SNR for the SHIFT spectrometer, SNR_S, of 85.2, while the bandpass filter's SNR, SNR_BP, was 61.2. Therefore, the SHIFT spectrometer produces an SNR approximately 1.4 times greater than that of the filtered camera under these conditions. However, a shot-noise-limited Fourier transform spectrometer is at a disadvantage when measuring the depth of spectral absorption features, or weak spectral emission features outside of strong spectral emission lines. It is therefore expected that the performance of the system would be diminished in these situations, similar to Ref. [29]. Consequently, the primary advantages of this system include its inherent compactness; its ease of fabrication and assembly compared to filter arrays, image slicers, image mappers, and holograms; and its octave-exceeding free spectral range. Additionally, the multiplex advantage exists at longer wavelengths; therefore, an SNR advantage may exist for wavelengths spanning 8-12 μm.

8. Conclusion

The design, modeling, calibration, and validation of a Snapshot Hyperspectral Imaging Fourier Transform (SHIFT) spectrometer have been presented. Theoretical modeling was demonstrated using Zemax, in conjunction with Matlab, to perform interference simulations. The simulations demonstrated that a small deviation between the lenslet array’s focus and the fringe localization plane creates a phase shift between sub-images. This phase error was observed in our experimental data, and compensation was subsequently incorporated into our calibration procedure using a least-squares fit. Lastly, the calibration of the sensor was validated using sodium, mercury, and neon gas discharge lamps. Acquired spectra were compared to an independent measurement by an Ocean Optics USB2000 spectrometer, which demonstrated qualitative agreement. Applications of the SHIFT spectrometer can be found in surveillance from micro-unmanned air vehicles, target identification from ground vehicles, biomedical imaging, and endoscopy.

References and links

1. C. M. Biradar, P. S. Thenkabail, A. Platonov, X. Xiao, R. Geerken, P. Noojipady, H. Turral, and J. Vithanage, “Water productivity mapping methods using remote sensing,” J. Appl. Remote Sens. 2(1), 023544 (2008). [CrossRef]
2. R. M. Levenson and J. R. Mansfield, “Multispectral imaging in biology and medicine: slices of life,” Cytometry A 69(8), 748–758 (2006). [CrossRef] [PubMed]
3. J. S. Tyo, D. L. Goldstein, D. B. Chenault, and J. A. Shaw, “Review of passive imaging polarimetry for remote sensing applications,” Appl. Opt. 45(22), 5453–5469 (2006). [CrossRef] [PubMed]
4. R. G. Sellar and G. D. Boreman, “Classification of imaging spectrometers for remote sensing applications,” Opt. Eng. 44, 013602 (2004).
5. P. L. P. Dillon, D. M. Lewis, and F. G. Kaspar, “Color imaging system using a single CCD area array,” IEEE Trans. Electron. Dev. 25(2), 102–107 (1978). [CrossRef]
6. J. Mercier, T. Townsend, and R. Sundberg, “Utility assessment of a multispectral snapshot LWIR imager,” in 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), 1–5 (2010).
7. R. T. Kester, N. Bedard, L. Gao, and T. S. Tkaczyk, “Real-time snapshot hyperspectral imaging endoscope,” J. Biomed. Opt. 16(5), 056005 (2011). [CrossRef] [PubMed]
8. A. Bodkin, A. Sheinis, A. Norton, J. Daly, S. Beaven, and J. Weinheimer, “Snapshot hyperspectral imaging – the hyperpixel array camera,” Proc. SPIE 7334, 73340H (2009). [CrossRef]
9. A. A. Wagadarikar, N. P. Pitsianis, X. Sun, and D. J. Brady, “Video rate spectral imaging using a coded aperture snapshot spectral imager,” Opt. Express 17(8), 6368–6388 (2009). [CrossRef] [PubMed]
10. M. R. Descour and E. L. Dereniak, “Nonscanning no-moving-parts imaging spectrometer,” Proc. SPIE 2480, 48–64 (1995). [CrossRef]
11. M. R. Descour, C. E. Volin, E. L. Dereniak, T. M. Gleeson, M. F. Hopkins, D. W. Wilson, and P. D. Maker, “Demonstration of a computed-tomography imaging spectrometer using a computer-generated hologram disperser,” Appl. Opt. 36(16), 3694–3698 (1997). [CrossRef] [PubMed]
12. D. W. Fletcher-Holmes and A. R. Harvey, “Real-time imaging with a hyperspectral fovea,” J. Opt. A, Pure Appl. Opt. 7(6), S298–S302 (2005). [CrossRef]
13. N. Gat, G. Scriven, J. Garman, M. De Li, and J. Zhang, “Development of four-dimensional imaging spectrometer,” Proc. SPIE 6302, 63020M (2006). [CrossRef]
14. M. Kise, B. Park, K. C. Lawrence, and W. R. Windham, “Compact multi-spectral imaging system for contaminant detection on poultry carcass,” Proc. SPIE 6503, 650305 (2007). [CrossRef]
15. A. Gorman, D. W. Fletcher-Holmes, and A. R. Harvey, “Generalization of the Lyot filter and its application to snapshot spectral imaging,” Opt. Express 18(6), 5602–5608 (2010). [CrossRef] [PubMed]
16. M. W. Kudenov, M. E. L. Jungwirth, E. L. Dereniak, and G. R. Gerhart, “White-light Sagnac interferometer for snapshot multispectral imaging,” Appl. Opt. 49(21), 4067–4076 (2010). [CrossRef] [PubMed]
17. J. C. Ramella-Roman and S. A. Mathews, “Spectroscopic measurements of oxygen saturation in the retina,” IEEE J. Sel. Top. Quantum Electron. 13(6), 1697–1703 (2007). [CrossRef]
18. A. Hirai, T. Inoue, K. Itoh, and Y. Ichioka, “Application of multiple-image Fourier transform spectral imaging to measurement of fast phenomena,” Opt. Rev. 1(2), 205–207 (1994). [CrossRef]
19. M. Francon and S. Mallick, Polarization Interferometers (John Wiley & Sons Ltd., 1971).
20. C. C. Montarou and T. K. Gaylord, “Analysis and design of modified Wollaston prisms,” Appl. Opt. 38(31), 6604–6616 (1999). [CrossRef] [PubMed]
21. K. Oka and T. Kaneko, “Compact complete imaging polarimeter using birefringent wedge prisms,” Opt. Express 11(13), 1510–1519 (2003). [CrossRef] [PubMed]
22. J. Van Delden, U.S. Patent No. 6,674,532 B2, Jan. 6, 2004.
23. M. W. Kudenov and E. L. Dereniak, “Compact snapshot birefringent imaging Fourier transform spectrometer,” Proc. SPIE 7812, 781206 (2010). [CrossRef]
24. V. Saptari, Fourier-Transform Spectroscopy Instrumentation Engineering (SPIE Press, 2004).
25. J. Renau, P. K. Cheo, and H. G. Cooper, “Depolarization of linearly polarized EM waves backscattered from rough metals and inhomogeneous dielectrics,” J. Opt. Soc. Am. 57(4), 459–466 (1967). [CrossRef] [PubMed]
26. J. R. Bergen, P. Anandan, K. J. Hanna, and R. Hingorani, “Hierarchical model-based motion estimation,” in Proceedings of the Second European Conference on Computer Vision, Springer-Verlag 588, 237–252 (1992).
27. A. R. Harvey and D. W. Fletcher-Holmes, “Birefringent Fourier-transform imaging spectrometer,” Opt. Express 12(22), 5368–5374 (2004). [CrossRef] [PubMed]
28. E. Voigtman and J. D. Winefordner, “The multiplex disadvantage and excess low-frequency noise,” Appl. Spectrosc. 41(7), 1182–1184 (1987). [CrossRef]
29. E. A. Stubley and G. Horlick, “Measurement of inductively coupled plasma emission spectra using a Fourier transform spectrometer,” Appl. Spectrosc. 39(5), 805–810 (1985). [CrossRef]

OCIS Codes
(120.3180) Instrumentation, measurement, and metrology : Interferometry
(260.1440) Physical optics : Birefringence
(260.5430) Physical optics : Polarization
(300.6300) Spectroscopy : Spectroscopy, Fourier transforms
(110.4234) Imaging systems : Multispectral and hyperspectral imaging

ToC Category:
Imaging Systems

History
Original Manuscript: April 12, 2012
Revised Manuscript: May 31, 2012
Manuscript Accepted: July 11, 2012
Published: July 23, 2012

Citation
Michael W. Kudenov and Eustace L. Dereniak, "Compact real-time birefringent imaging spectrometer," Opt. Express 20, 17973-17986 (2012)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-20-16-17973



