Structured illumination microscopy for in-vivo human retinal imaging: a theoretical assessment

Sabah Chetty and Steve Gruppetta


Optics Express, Vol. 20, Issue 23, pp. 25700-25710 (2012)
http://dx.doi.org/10.1364/OE.20.025700


Abstract

Structured illumination microscopy applied to in-vivo retinal imaging has the potential to provide a low-cost and powerful diagnostic tool for retinal disease. In this paper the key parameters that affect performance in structured illumination ophthalmoscopy are studied theoretically. These include the number of images that need to be acquired in order to generate a sectioned image, which is affected by the non-stationary nature of the retina during acquisition, the choice of spatial frequency of the illuminating sinusoid, the effect of typical ocular aberrations on axial resolution and the nature of the sinusoidal pattern produced by the illumination system. The results indicate that structured illumination ophthalmoscopy can be a robust technique for achieving axial sectioning in retinal imaging without the need for complex optical systems.

© 2012 OSA

1. Introduction

Structured illumination is a microscopy technique first proposed by Neil et al. [1] in 1997 as a method which delivers enhanced axial resolution compared to conventional imaging. Structured illumination requires projection of structured light, in the form of a sinusoidal intensity grating, onto the sample to be imaged, achieved either by using a physical grid which is imaged onto the sample or via interference of coherent light to produce fringes [1–3]. Structured illumination can also be used to improve lateral resolution beyond the diffraction limit [4]. This paper focusses on the enhancement of axial resolution through structured illumination, which enables imaging of resolved slices of the sample, allowing a three-dimensional reconstruction with reduced light contributions from neighbouring planes.

Structured illumination has been used to obtain sectioned images of various three-dimensional samples, including live cells [1–3, 5]. We have recently proposed the use of this technique applied to imaging the retina in the living human eye using a Structured Illumination Ophthalmoscope (SIO) [6]. The SIO uses a modified, incoherent fringe projection technique to produce the structured illumination, using the interference pattern generated by a Michelson interferometer setup [6, 7]. The application of structured illumination to ophthalmoscopy has the potential to provide a low-cost, three-dimensional retinal imaging device that offers significant advantages over existing retinal imaging techniques [6].

The axial sectioning capability in structured illumination microscopy and ophthalmoscopy depends on a number of parameters, which include the spatial frequency, relative phase and modulation of the illuminating sinusoidal pattern, and the aberrations present within the imaging system. The latter cannot be ignored in the case of ophthalmoscopy, in which the aberrations introduced by the human eye are non-negligible. In this paper the influence of these parameters on the imaging capability of the SIO is studied.

A simulation model developed in MATLAB (The MathWorks, Inc.), in which the image formation process of the SIO is simulated, was used to investigate the parameter space. Following a brief review of the imaging theory of structured illumination microscopy in Section 2, we present a modified method more suited for retinal imaging applications in Section 3 with emphasis on the effect of random phase shifts of the illuminating sinusoidal patterns. In Section 4, the choice of spatial frequency of the illuminating sinusoid is discussed. Section 5 investigates the effect of adding aberrations that are representative of human eye aberrations on the axial sectioning capabilities of the SIO. The final section explores how changes in modulation of the illuminating sinusoidal pattern with axial distance affect the imaging performance. Empirical data obtained from an SIO prototype were used to determine the decrease in modulation with distance from focus of the SIO.

2. Theoretical background of the Structured Illumination Ophthalmoscope

The theory of structured illumination microscopy is well documented elsewhere [1–3, 8, 9] and the adaptation to ophthalmoscopy has been described by the authors in Ref. [6]. In summary, the sample to be imaged is illuminated by a sinusoidally-varying intensity pattern given by:

\[ I_{\mathrm{illumination}}(t_o, w_o) = 1 + \mu\cos(\nu t_o + \phi), \qquad (1) \]
where μ is the modulation, ϕ is the phase and ν is the normalised spatial frequency defined as ν = ν′λ/NA, with ν′ being the actual spatial frequency, λ the wavelength and NA the numerical aperture. (t_o, w_o) are the normalised lateral optical coordinates at the object plane that relate to the actual lateral coordinates (x_o, y_o) via the equation (t_o, w_o) = k(x_o, y_o)n sin α, as defined by Wilson and Sheppard [10], where k = 2π/λ and n sin α is the numerical aperture. For simplicity, the modulation is fixed for all simulations at μ = 1, with the exception of the simulations in Section 6.
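
As a simple illustration of Eq. (1), the following Python sketch (a minimal example in this paper's notation, not the MATLAB model used for the simulations; the grid extent and sampling are arbitrary choices) evaluates the illumination pattern on a grid of normalised coordinates (t_o, w_o):

    import numpy as np

    def illumination(t_o, mu=1.0, nu=1.0, phi=0.0):
        """Eq. (1): sinusoidal illumination 1 + mu*cos(nu*t_o + phi);
        the pattern is independent of the w_o coordinate."""
        return 1.0 + mu * np.cos(nu * t_o + phi)

    # Normalised lateral optical coordinates; the range is illustrative only.
    t = np.linspace(-50.0, 50.0, 512)
    t_o, w_o = np.meshgrid(t, t)
    pattern = illumination(t_o, mu=1.0, nu=1.0, phi=0.0)
    print(pattern.min(), pattern.max())   # ranges from ~0 to ~2 for mu = 1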

It can be shown that the intensity distribution of the image produced at the image plane (t_i, w_i) is given by [6]:

\[ I(t_i, w_i) = I_0(t_i, w_i) + \frac{\mu}{2} e^{i\phi} I_{\nu}(t_i, w_i) + \frac{\mu}{2} e^{-i\phi} I_{-\nu}(t_i, w_i), \qquad (2) \]

where

\[ I_0(t_i, w_i) = \iint \rho(t_o, w_o)\,\lvert h(t_i + t_o, w_i + w_o)\rvert^2 \,\mathrm{d}t_o\,\mathrm{d}w_o, \qquad (3) \]

\[ I_{\nu}(t_i, w_i) = \iint e^{i\nu t_o}\,\rho(t_o, w_o)\,\lvert h(t_i + t_o, w_i + w_o)\rvert^2 \,\mathrm{d}t_o\,\mathrm{d}w_o, \qquad (4) \]

\[ I_{-\nu}(t_i, w_i) = \iint e^{-i\nu t_o}\,\rho(t_o, w_o)\,\lvert h(t_i + t_o, w_i + w_o)\rvert^2 \,\mathrm{d}t_o\,\mathrm{d}w_o, \qquad (5) \]

and ρ(t_o, w_o) is the intensity reflectance at the object plane and h is the amplitude point spread function of the objective.

The image described by Eq. (2) is composed of three distinct parts. The first is a conventional image, I0, which does not exhibit axial sectioning [1, 10]. The other two components, Iν and I−ν, are the higher-frequency terms that do exhibit axial sectioning.

The original technique proposed by Neil et al. [1] relies on acquiring three successive images in which the sinusoidal pattern has distinct phases ϕ1 = 0, ϕ2 = 2π/3, ϕ3 = −2π/3. These three images can be combined to produce a final image, given by |I±ν|, which eliminates the conventional image term I0 and results in an image that exhibits axial sectioning [1, 6]; this means that the overall intensity of the image decreases with increasing distance from the focal plane. Throughout this paper the raw images, given by Eq. (2) and which have a sinusoidal modulation, will be referred to as sub-images to distinguish them from the final image obtained, |I±ν|.
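
For the fixed-phase case, the combination described by Neil et al. [1] amounts to taking the square root of the sum of squared pairwise differences of the three sub-images, which cancels the conventional term I0 and leaves a quantity proportional to |I±ν|. A minimal Python sketch of that demodulation step is shown below (synthetic, in-focus sub-images with an arbitrary fringe frequency; this illustrates the principle rather than reproducing the SIO simulation):

    import numpy as np

    def section_from_three(i1, i2, i3):
        """Three-phase combination: removes the conventional term I0 and
        returns a quantity proportional to |I_(+/-nu)|."""
        return np.sqrt((i1 - i2)**2 + (i2 - i3)**2 + (i3 - i1)**2)

    # Synthetic test: the same in-focus object under fringe phases 0, +/-2*pi/3.
    theta = np.linspace(0.0, 2 * np.pi * 8, 256)        # 8 fringe periods across the field
    obj = np.random.rand(256, 256)                      # arbitrary reflectance map
    subs = [obj * (1 + np.cos(theta[None, :] + p)) for p in (0.0, 2*np.pi/3, -2*np.pi/3)]
    final = section_from_three(*subs)
    # 'final' is proportional to obj, with the fringes removed.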

3. Structured illumination with randomly-shifted illumination patterns

The restriction imposed by having to use three fixed phases for the collection of the sub-images is much easier to satisfy with a stationary target, as is the case for a conventional microscope using structured illumination. In the case of a moving, non-stationary target such as the living human retina, it is possible to maintain a fixed sinusoidal pattern and use eye movements themselves to provide the phase difference between the sub-images collected. The phases are then random and need to be calculated in post-processing. An alternative approach to extract the final image |I±ν| from the sub-images has been proposed by Shroff et al. [11, 12], in which the acquired sub-images are expressed as a system of simultaneous linear equations. We propose a modified approach to that of Shroff et al. by defining the sets of equations in the spatial domain rather than the Fourier domain, as shown below.

It is possible to rewrite Eq. (2) for N sub-images, corresponding to N random phases of the sinusoidal illumination pattern, as a series of linear equations in matrix notation:

\[ \mathbf{b} = \mathbf{A}\mathbf{x}, \qquad (6) \]

where

\[ \mathbf{b} = \begin{bmatrix} I_1 \\ I_2 \\ \vdots \\ I_N \end{bmatrix}_{N \times 1}, \qquad (7) \]

\[ \mathbf{A} = \begin{bmatrix} 1 & \frac{\mu}{2} e^{i\phi_1} & \frac{\mu}{2} e^{-i\phi_1} \\ 1 & \frac{\mu}{2} e^{i\phi_2} & \frac{\mu}{2} e^{-i\phi_2} \\ \vdots & \vdots & \vdots \\ 1 & \frac{\mu}{2} e^{i\phi_N} & \frac{\mu}{2} e^{-i\phi_N} \end{bmatrix}_{N \times 3}, \qquad (8) \]

\[ \mathbf{x} = \begin{bmatrix} I_0 \\ I_{\nu} \\ I_{-\nu} \end{bmatrix}_{3 \times 1}. \qquad (9) \]

By finding the pseudo-inverse of matrix A using singular value decomposition (SVD) it is possible to solve for x. Since x consists of three terms, a minimum of three sub-images is required. The case where N = 3 and ϕ1 = 0, ϕ2 = 2π/3, ϕ3 = −2π/3 corresponds to the analytical case described above and is the optimum solution. For phases that differ from this set, more sub-images will be required to yield a good solution, as discussed below. This method thus makes it possible to collect a sequence of sub-images from a moving target, which results in random phase shifts between sub-images.
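
A short Python/NumPy sketch of this step is given below (illustrative only). Note that the constant μ/2 factor of Eq. (8) is absorbed into the unknowns here, a column scaling that does not change the structure of the problem and makes the ideal three-phase set give a condition number of 1, consistent with the ideal case discussed below:

    import numpy as np

    def build_A(phases):
        """System matrix of Eq. (8), with the constant mu/2 factor absorbed
        into the unknowns, i.e. x = [I0, (mu/2)I_nu, (mu/2)I_-nu]."""
        phases = np.asarray(phases, dtype=float)
        return np.column_stack([np.ones(len(phases), dtype=complex),
                                np.exp(1j * phases),
                                np.exp(-1j * phases)])

    def recover_components(sub_images, phases):
        """Solve b = A x pixel-wise using the SVD-based pseudo-inverse."""
        A = build_A(phases)
        kappa = np.linalg.cond(A)           # largest / smallest singular value
        b = np.stack([im.ravel() for im in sub_images]).astype(complex)
        x = np.linalg.pinv(A) @ b           # rows: I0, (mu/2)I_nu, (mu/2)I_-nu
        shape = sub_images[0].shape
        # the sectioned image is np.abs() of the second (or third) component
        return [row.reshape(shape) for row in x], kappa

    print(np.linalg.cond(build_A([0.0, 2*np.pi/3, -2*np.pi/3])))   # ideal case: 1.0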

The validity of the solution in this case will depend on the condition number κ(A) of matrix A, defined as the ratio of the largest singular value of the matrix to the smallest [13]. The ideal case where N = 3 and ϕ1 = 0, ϕ2 = 2π/3, ϕ3 = −2π/3 yields a condition number κ = 1, corresponding to the exact solution provided by this scenario. In general κ ≥ 1; the higher the condition number, the less stable the inversion of the matrix, yielding a poor solution in the presence of noise and measurement error.

In order to investigate the effect of relying on random phase shifts to extract the final image from N sub-images, 100 sets were generated, each containing three randomly selected phases. These were input into matrix A (Eq. (8)) and the condition number κ(A) was calculated for each set. The procedure was then repeated for sets of N random phases with N = 4, 5, ..., 9. Figure 1 shows the plot of the median condition number κ(A) obtained for each value of N. Although a random selection of three phases is likely to yield a high κ(A), resulting in an unstable solution, even the addition of a fourth sub-image is sufficient to bring the median value of κ(A) down considerably. The addition of further sub-images has a diminishing effect on κ(A), indicating that, on average, about 5 to 7 sub-images would be sufficient to yield a good condition number: this is the point where the median value is close to its asymptotic value and the variability (illustrated by the 25th and 75th centiles in Fig. 1) is low, making a stable solution likely.

Fig. 1 Plot of median condition number κ(A) (blue line) for sets of 100 matrices with N random phases, N = 3, 4,...,9. The dotted green lines indicate the 25th and 75th centiles (75th centile for N = 3 is 22.9) and the dashed black line indicates the floor for this plot given by the ideal case κ(A) = 1.
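
The numerical experiment summarised in Fig. 1 can be sketched as follows (a Python re-implementation of the procedure described above rather than the original MATLAB code; the exact medians depend on the random draw, but the rapid fall and subsequent flattening of the median κ(A) should be reproduced):

    import numpy as np

    rng = np.random.default_rng(0)

    def build_A(phases):
        """Eq. (8) with the constant mu/2 factor absorbed into the unknowns."""
        phases = np.asarray(phases, dtype=float)
        return np.column_stack([np.ones(len(phases), dtype=complex),
                                np.exp(1j * phases),
                                np.exp(-1j * phases)])

    for N in range(3, 10):
        # 100 sets of N uniformly distributed random phases, as in the text.
        kappas = [np.linalg.cond(build_A(rng.uniform(0.0, 2*np.pi, N)))
                  for _ in range(100)]
        print(N, np.median(kappas),
              np.percentile(kappas, 25), np.percentile(kappas, 75))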

As discussed above, high condition numbers lead to unstable solutions which are very sensitive to small errors in determining the exact relative phases ϕn. The threshold value for κ(A) above which solutions are unacceptable depends on numerous factors, including the spatial frequency ν of the sinusoidal illumination (the higher the frequency, and hence the smaller the period of the sinusoid, the larger the error in determining the phase) and the noise within the images acquired. A practical implementation of this method would also need to take into account the condition number κ(A) of each individual set of N sub-images. The 100 sets generated with N = 5, which yielded the median value of κ(A) = 2.2 shown in Fig. 1, ranged from a minimum κ(A) = 1.2 to a maximum κ(A) = 58.8; in the latter case, further sub-images would be required to give a stable solution. An implementation of this method would consist of acquiring a sequence of sub-images of the moving sample, registering the images, extracting the relative phase of the sinusoidal patterns from the Fourier transform of the sub-images, then calculating the condition number for the first three sub-images and adding further sub-images until the desired condition number is achieved. Within such an implementation, a two-second acquisition at 20 Hz would yield 40 frames (sub-images) from which multiple processed images can be calculated. A short acquisition time is preferable to maximise light exposure.
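
The phase-extraction step mentioned above can be illustrated with a short sketch: when the fringe frequency falls on a discrete Fourier transform bin, the phase of the pattern is the argument of the 2-D FFT of the sub-image at that bin (a simplified, noise-free illustration; in practice registration errors, object structure and noise limit the accuracy, as discussed in the text):

    import numpy as np

    def fringe_phase(image, k_fringe):
        """Phase of a horizontal sinusoidal fringe with k_fringe periods
        across the image width, read from the 2-D Fourier transform."""
        F = np.fft.fft2(image)
        return np.angle(F[0, k_fringe])

    # Synthetic sub-image: fringes along x with a known phase.
    n, k_fringe, true_phase = 256, 8, 0.7
    x = np.arange(n)
    img = np.tile(1 + np.cos(2 * np.pi * k_fringe * x / n + true_phase), (n, 1))
    print(fringe_phase(img, k_fringe), true_phase)      # the two values agree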

4. Effect of frequency ν of the sinusoidal illumination pattern on axial resolution

There are many metrics used to assess imaging systems, but one in particular is relevant when discussing axial sectioning, namely the full width at half maximum (FWHM) of the plot of normalised integrated intensity with defocus [10], which represents the variation of overall light intensity (irradiance) at the image plane with axial distance. This plot will be referred to as the integrated intensity plot throughout this paper. The normalised axial coordinate u will be used, which relates to the actual axial coordinate z through the equation u = 4knz sin²(α/2) [10]. In the human eye, for a wavelength of 600 nm and a pupil diameter of 5 mm, 1 unit of u is equivalent to 4.6 μm at the retina.

Figure 2 compares the integrated intensity plots for several imaging systems. The conventional microscope (green plot) does not offer any axial resolving ability: each plane contributes the same amount of energy to the final image, irrespective of axial position. The plots in Fig. 2 provide evidence that the imaging performance of the SIO (blue plot, theoretical response; red plot, simulation) is comparable to that of the Scanning Laser Ophthalmoscope (SLO, dotted plot). There are many potential advantages of the SIO compared with the SLO, which are discussed in depth by the authors in Ref. [6].

Fig. 2 Comparison of the integrated intensity curves for the conventional imaging system (green), the Scanning Laser Ophthalmoscope (SLO, dotted), and the SIO (red and blue). The blue plot represents the theoretical performance of the SIO as given by the Stokseth approximation [2] while the red plot represents the simulation results.

4.1. Assessing axial resolution directly from the MTF: an intuitive interpretation of axial sectioning within structured illumination microscopy

The multiplication of the sinusoidal illumination pattern with the reflectance of the sample (which corresponds to a convolution in the Fourier domain) has the effect of creating copies of the object Fourier transform centred at ±ν in the Fourier domain. The transmission of the information present in these first-order terms is therefore determined by the value of the Modulation Transfer Function (MTF) of the optical system at that frequency, given by |ℋ(ν)|, where ℋ is the Optical Transfer Function (OTF). If we now consider the defocused MTF, given by |ℋ(ν; u)|, we note that its magnitude falls at different rates for different values of ν: |ℋ(ν = 0; u)| is a constant, which is the reason why conventional imaging does not exhibit axial sectioning, as shown in Fig. 2, whereas |ℋ(ν ≠ 0; u)| varies at a rate that depends on the value of ν. The movie in Fig. 3 (Media 1) shows |ℋ(ν; u = uk)| for a number of different values of uk.

Fig. 3 Modulation transfer function at various normalised units of defocus (Media 1). The red dot represents the optimum frequency ν = 1.

Following this interpretation of structured illumination imaging, it is therefore possible to derive the normalised integrated intensity plot directly from |ℋ(ν; u)|. The movie in Fig. 3 (Media 1) highlights the point representing ν = 1 with a red dot, and a plot of |ℋ(ν = 1; u)| therefore yields the same integrated intensity plot shown in Fig. 2. This approach provides a powerful tool to assess the axial sectioning capability of structured illumination imaging systems, especially when dealing with aberrated pupil functions, as discussed in Section 5.
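
The defocused MTF needed for this construction can be computed numerically as the modulus of the normalised autocorrelation of the defocused pupil function, writing the defocus phase as (u/2)ρ² in the normalised coordinates used here (the usual paraxial convention). The Python sketch below (our own numerical illustration, not the authors' MATLAB model or the Stokseth approximation used for the theoretical curve in Fig. 2; grid size and sampling are arbitrary) evaluates |ℋ(ν = 1; u)| over a range of u, i.e. the SIO integrated intensity curve:

    import numpy as np

    def defocused_mtf(nu, u, R=64):
        """|H(nu; u)|: normalised autocorrelation of the defocused pupil
        P(rho)*exp(i*(u/2)*rho^2), rho <= 1, at normalised frequency nu
        (cut-off at nu = 2)."""
        n = 4 * R                                    # zero-padded grid
        y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
        rho2 = (x**2 + y**2) / R**2
        pupil = np.where(rho2 <= 1.0, np.exp(0.5j * u * rho2), 0.0)
        s = int(round(nu * R))                       # pixel shift for frequency nu
        if s == 0:
            return 1.0
        num = np.sum(pupil[:, s:] * np.conj(pupil[:, :-s]))
        return np.abs(num) / np.sum(np.abs(pupil)**2)

    # Integrated intensity curve of the SIO at nu = 1.
    u_axis = np.linspace(0.0, 20.0, 81)
    curve = np.array([defocused_mtf(1.0, u) for u in u_axis])
    curve /= curve[0]                                # normalise to the in-focus value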

4.2. Axial resolution as a function of spatial frequency

Neil et al. [1] showed that the spatial frequency that gives the best axial resolution in structured illumination imaging is ν = 1 or, equivalently, ν′ = NA/λ. This can be explained qualitatively by looking at the movie in Fig. 3 (Media 1) and noting that the rate of change of |ℋ| with respect to u is largest at ν = 1. This means that the contribution to the final image falls rapidly with defocus.

By plotting the value of |ℋ(ν = νk; u)| for different values of νk it is therefore possible to obtain integrated intensity curves for spatial frequencies across the whole range (0 ≤ ν ≤ 2; ν = 2 is the cut-off frequency).

The FWHM, which is a measure of axial resolution, was extracted from these generated integrated intensity plots. Figure 4 shows a plot of the FWHM against normalised spatial frequency ν showing that the minimum of the FWHM curve occurs at ν = 1. Figure 4 also shows that the FWHM does not change appreciably for a considerable range around the optimal frequency of ν = 1. The FWHM at ν = 0.8 is only 4% higher than the minimum FWHM, and at ν = 0.7 it is 6% higher. This result is important as there is a trade-off to be made between axial resolution and contrast of the sinusoidal pattern in the sub-images acquired. The relatively low loss of axial resolution brought about by choosing a spatial frequency of, for example, ν = 0.7 instead of ν = 1 would result in an increased contrast owing to the higher MTF value at this frequency. The increased contrast decreases the error in estimating the phase of the sinusoidal pattern leading to more stability in the pseudo-inversion of matrix A (Eq. (8)).

Fig. 4 A plot of the FWHM of the integrated intensity curves as a function of spatial frequency ν. The FWHM is a measure of axial resolution.
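
The FWHM values plotted in Fig. 4 can be extracted from the integrated intensity curves with a simple helper that locates the half-maximum crossings by linear interpolation, as sketched below (an illustrative implementation; it assumes the curve is sampled over a symmetric range of u and falls below half-maximum on both sides of its peak). Applying it to |ℋ(νk; u)| curves computed as in Section 4.1 for a range of νk reproduces the trend of Fig. 4:

    import numpy as np

    def fwhm(x, y):
        """FWHM of a single-peaked curve y(x); the half-maximum crossings
        are located by linear interpolation on either side of the peak."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        half = y.max() / 2.0
        p = int(np.argmax(y))
        r = p + np.argmax(y[p:] < half)          # first point below half, right of peak
        l = p - np.argmax(y[p::-1] < half)       # first point below half, left of peak
        x_r = x[r-1] + (x[r] - x[r-1]) * (y[r-1] - half) / (y[r-1] - y[r])
        x_l = x[l+1] - (x[l+1] - x[l]) * (y[l+1] - half) / (y[l+1] - y[l])
        return x_r - x_l

    # Quick check against a Gaussian, whose FWHM is 2*sqrt(2*ln 2) ~ 2.355.
    x = np.linspace(-10.0, 10.0, 2001)
    print(fwhm(x, np.exp(-x**2 / 2)))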

5. Effect of ocular aberrations on axial resolution

Aberrations and their effect on imaging play a more important role in ophthalmoscopy than in other microscopy applications, since the optics of the eye, which have relatively poor optical quality when compared to manufactured optics, must act as an objective lens for the imaging system. In order to investigate the effect of ocular aberrations on the performance of the SIO, a set of 100 wavefront aberration maps representative of ocular aberrations was generated using the distribution of Zernike coefficients of a population of eyes published by Thibos et al. [14]. The generated wavefront maps consisted of the first 14 Zernike terms, ranging from the first to the fourth radial orders. Figure 5 shows the distribution of Zernike coefficients for the reference set of 100 virtual eyes which were used in the aberration simulations.

Fig. 5 Distribution of Zernike coefficients for 100 virtual eyes. Each data point shows the mean coefficient and the error bars represent 2 standard deviations. Both single- and double-index notations are used to label Zernike terms.

The defocus term Z₂⁰ present in each generated wavefront map was removed, since it simply represents a shift in the focal plane, and was then varied systematically to represent the axial scanning required to investigate the axial resolution of the SIO. The MTF for each wavefront map and for each focus position was then calculated at ν = 1 and used to generate the integrated intensity plot for each individual wavefront map, as discussed in Section 4.1. The wavefront maps were ranked based on the FWHM of the integrated intensity plots they produced, and the 5th and 95th percentiles were used to represent the range of these axial resolution values in order to reduce the effect of any outliers in the set of 100 wavefront maps. Figure 6 shows integrated intensity plots for the wavefront maps representing the 5th, 50th and 95th percentile FWHM. The plot in blue is the integrated intensity curve in the absence of aberrations, showing that the axial resolution is improved for most of the aberration maps within the set. This result may seem counterintuitive, but, referring back to the discussion in Section 4.1, it can be explained by the fact that the rate of change of |ℋ| with respect to u increases for most aberrated MTFs. This means that the contribution of axial planes to the final image falls off more rapidly with defocus. It should be pointed out that the contrast of the sinusoidal illumination in the sub-images would also be reduced, leading to decreased accuracy in estimating the phase of the illuminating pattern. Lateral resolution will still be affected adversely by aberrations, as in conventional imaging.

Fig. 6 Integrated intensity plots corresponding to aberrated wavefronts (dashed lines) and the unaberrated case (solid blue line). The three plots shown for the aberrated cases correspond to those which yield the 5th, 50th and 95th percentile FWHM. Therefore 90% of all aberrated wavefronts generated would produce integrated intensity curves that fall within the limits shown.
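
A hedged sketch of this procedure is given below. It builds an aberrated pupil from a handful of Zernike terms (the coefficient values are purely illustrative and the term list is abbreviated; the study itself used the first 14 terms with coefficients drawn from the Thibos et al. statistics [14]), applies defocus as the (u/2)ρ² phase, and evaluates |ℋ(ν = 1; u)| with the same pupil-autocorrelation approach as in Section 4.1:

    import numpy as np

    def aberrated_mtf_nu1(zernike_um, u, wavelength_um=0.6, R=64):
        """|H(nu=1; u)| for a pupil carrying a few Zernike terms.
        zernike_um: dict {(n, m): coefficient in micrometres} for the
        terms defined below; defocus is applied via the (u/2)*rho^2 phase."""
        grid = 4 * R
        y, x = np.mgrid[-grid//2:grid//2, -grid//2:grid//2]
        rho = np.sqrt(x**2 + y**2) / R
        theta = np.arctan2(y, x)
        # A few normalised Zernike polynomials (astigmatism, coma, trefoil, spherical).
        Z = {(2, 2):  np.sqrt(6) * rho**2 * np.cos(2*theta),
             (2, -2): np.sqrt(6) * rho**2 * np.sin(2*theta),
             (3, 1):  np.sqrt(8) * (3*rho**3 - 2*rho) * np.cos(theta),
             (3, -1): np.sqrt(8) * (3*rho**3 - 2*rho) * np.sin(theta),
             (3, 3):  np.sqrt(8) * rho**3 * np.cos(3*theta),
             (4, 0):  np.sqrt(5) * (6*rho**4 - 6*rho**2 + 1)}
        wavefront_um = sum(c * Z[k] for k, c in zernike_um.items())
        phase = 2*np.pi / wavelength_um * wavefront_um + 0.5 * u * rho**2
        pupil = np.where(rho <= 1.0, np.exp(1j * phase), 0.0)
        s = R                                        # pixel shift corresponding to nu = 1
        num = np.sum(pupil[:, s:] * np.conj(pupil[:, :-s]))
        return np.abs(num) / np.sum(np.abs(pupil)**2)

    # Illustrative coefficients only (micrometres), not the Thibos statistics.
    coeffs = {(2, 2): 0.15, (3, -1): 0.08, (4, 0): 0.10}
    u_axis = np.linspace(-20.0, 20.0, 161)
    curve = np.array([aberrated_mtf_nu1(coeffs, u) for u in u_axis])
    curve /= curve.max()                             # normalised integrated intensity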

6. Reduction of modulation μ of the sinusoidal illumination with defocus

In the two standard methods described in the literature for providing structured illumination, namely fringe projection and grid projection, the modulation of the sinusoidal pattern varies in a different manner with defocus [2, 3]. In the former, the fringe pattern is generated through interference of coherent, shifted light beams and therefore the modulation is constant for all values of defocus u within the sample. In the latter method, an incoherent grid pattern is imaged onto the sample and hence the modulation falls with defocus at a rate that depends on the frequency of the sinusoidal pattern and the MTF of the imaging system. In the SIO, a third illumination method has been proposed that is better suited to the application of structured illumination in ophthalmoscopy, in which the sinusoidal illumination is the result of Fizeau fringes produced by an incoherent source through a Michelson interferometer set-up [6]. Such Fizeau fringes are localised at a plane and therefore their modulation will also decrease with defocus, but an analytical solution that characterises the modulation as a function of defocus is not available for the general case [15].

In order to investigate the effect of a reduction in modulation of the sinusoidal pattern with defocus on the axial sectioning capabilities of the SIO, data were collected experimentally using a bench-based prototype of the SIO. The full details of this prototype will be described elsewhere; the illumination system used to collect the data is described briefly here. An LED light source (630 nm, Linos HighLED) was used with a Michelson interferometer consisting of a 50:50 beamsplitter and two mirrors. The common path illuminates the sample via an objective lens such that Fizeau fringes are formed at a plane that is conjugate to the plane containing the Michelson mirrors. In the SIO, the sample is the retina and the objective is the optics of the eye. However, for the purposes of collecting data on the modulation of the sinusoidal pattern as a function of defocus, a lens (f′ = +200 mm) was used with a CMOS camera (Thorlabs, 1280 × 1024 pixels) at the retinal plane. Images of the sinusoidal pattern were acquired as the camera was translated axially, and the modulation of these sinusoidal patterns was extracted from the Fourier transform of the images. The variation of modulation with distance from focus is plotted in Fig. 7.

Fig. 7 Experimentally determined modulation change with defocus in the SIO prototype using the modified Fizeau fringe projection illumination technique [6].
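
The modulation extraction used for Fig. 7 can be sketched as follows: for a fringe whose frequency falls on a Fourier transform bin, μ is estimated from the ratio of the fringe peak to the DC term of the image's Fourier transform. The snippet below is a simplified, noise-free illustration with a synthetic frame (the prototype analysis operates on real camera images and must first locate the fringe peak):

    import numpy as np

    def fringe_modulation(image, k_fringe):
        """Estimate mu from the Fourier transform of a fringe image:
        mu ~ 2*|F(k_fringe)| / |F(0)| for fringes on a uniform background."""
        F = np.fft.fft2(image)
        return 2 * np.abs(F[0, k_fringe]) / np.abs(F[0, 0])

    # Synthetic check: 8 fringe periods across the frame, mu = 0.6.
    n, k, mu = 256, 8, 0.6
    x = np.arange(n)
    img = np.tile(1 + mu * np.cos(2 * np.pi * k * x / n), (n, 1))
    print(fringe_modulation(img, k))                 # ~0.6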

These data were input into the SIO simulation, with the results shown in Fig. 8. The plot shows that a decrease in modulation (red plot in Fig. 8) results in a small improvement in axial resolution (smaller FWHM of the integrated intensity curve) when compared with the case of constant modulation (blue plot). More marked is the absence of the side lobe that is present under constant modulation.

Fig. 8 Integrated intensity plot for constant modulation (μ = 1, blue) and for the variable modulation in Fig. 7 (red).

7. Conclusion

The methods used to implement structured illumination in microscopy cannot be used directly for retinal imaging owing to the various restrictions imposed by the human eye [6, 11]. To address these issues, a new illumination system for structured illumination has been proposed by the authors [6, 7], and an algorithm to extract the final sectioned image from sub-images with illumination patterns having random phase shifts was introduced by Shroff et al. [11, 12], with a modified approach presented here in Section 3. In this paper, parameters that affect axial resolution in structured illumination imaging have been studied from the perspective of the retinal imaging application.

It has been shown that the non-stationarity of the sample in the retinal imaging case means that more than three sub-images will in general be needed; however, on average, five to seven sub-images will be sufficient to give a stable solution, as discussed in Section 3 and illustrated in Fig. 1. This means that dynamic imaging will still be possible for applications that require time series of retinal images. The stability of this algorithm will still depend on the error in measuring the phase shifts of the sinusoidal illumination patterns between frames. Lower spatial frequencies will in general provide a better estimate of the phase for two reasons. Firstly, the accuracy in measuring the phase of the sinusoidal pattern is limited by the pixel size of the camera used to acquire the images in relation to the period of the sinusoid; this error will be a smaller fraction of the period for lower-frequency sinusoids. Secondly, the modulation of the imaged sinusoidal pattern will be larger for lower frequencies since the MTF will have a higher value (this is always true for the unaberrated case and also true for most aberrated cases). The results in Section 4 show that it is possible to use frequencies lower than the theoretically optimum frequency of ν = 1 without appreciable loss of axial resolution.

Aberrations typical of human eyes have been used to investigate the effect on the axial resolution of the SIO, showing that, contrary to other imaging modalities, aberrations in general do not affect the axial resolution adversely. A modest improvement in axial resolution is noted, although the lower modulation of the imaged sinusoidal pattern in the aberrated case will lead to an increase in the error of estimating its phase. The variable modulation imposed by the illumination system also improves imaging performance by decreasing the contribution of out-of-focus planes even further.

The interdependence of the various parameters is complex. Of particular interest is the effect of aberrations on the optimal frequency and on the rate of change of axial resolution with frequency (shown in Fig. 4 for the unaberrated case). Future work will study this interdependence further, both theoretically and experimentally.

Acknowledgments

This work is supported by a grant from the Engineering and Physical Sciences Research Council (EPSRC), UK, EP/H017933/1.

References and links

1. M. A. A. Neil, R. Juskaitis, and T. Wilson, “Method of obtaining optical sectioning by using structured light in a conventional microscope,” Optics Letters 22(24), 1905–1907 (1997). [CrossRef]
2. D. Karadaglić, “Image formation in conventional brightfield reflection microscopes with optical sectioning via structured illumination,” Micron 39, 302–310 (2008). [CrossRef]
3. D. Karadaglić and T. Wilson, “Image formation in structured illumination wide-field fluorescence microscopy,” Micron 39, 808–818 (2008). [CrossRef]
4. M. G. L. Gustafsson, “Nonlinear structured-illumination microscopy: Wide-field fluorescence imaging with theoretically unlimited resolution,” Proceedings of the National Academy of Sciences 102(37), 13081–13086 (2005). [CrossRef]
5. P. Kner, B. B. Chhun, E. R. Griffis, L. Winoto, and M. G. L. Gustafsson, “Super-resolution video microscopy of live cells by structured illumination,” Nature Methods 6, 339–342 (2009). [CrossRef] [PubMed]
6. S. Gruppetta and S. Chetty, “Theoretical study of multispectral structured illumination for depth resolved imaging of non-stationary objects: focus on retinal imaging,” Biomedical Optics Express 2(2), 255–263 (2011). [CrossRef] [PubMed]
7. S. Gruppetta, “Optical Imaging System,” WO Patent WO/2012/059564, PCT/EP2011/069375 (2011).
8. D. Karadaglić, “Wide-field Optical Sectioning Microscopy using Structured Illumination,” Ph.D. thesis, University of Oxford (2004).
9. N. Hagan, L. Gao, and T. S. Tkaczyk, “Quantitative sectioning and noise analysis for structured illumination microscopy,” Optics Express 20(1), 403–413 (2012). [CrossRef]
10. T. Wilson and C. Sheppard, eds., Theory and Practice of Scanning Optical Microscopy (Academic Press, 1984).
11. S. A. Shroff, J. R. Fienup, and D. R. Williams, “Phase-shift estimation in sinusoidally illuminated images for lateral superresolution,” J. Opt. Soc. Am. A 26(2), 413–424 (2009). [CrossRef]
12. S. Shroff, J. Fienup, and D. Williams, “Lateral superresolution using a posteriori phase shift estimation for a moving object: experimental results,” J. Opt. Soc. Am. A 27(8), 1770–1782 (2010). [CrossRef] [PubMed]
13. G. J. Gbur, Mathematical Methods for Optical Physics and Engineering (Cambridge University Press, 2011).
14. L. N. Thibos, A. Bradley, and X. Hong, “A statistical model of the aberration structure of normal, well-corrected eyes,” Ophthal. Physiol. Opt. 22, 427–433 (2002). [CrossRef]
15. M. Born and E. Wolf, Principles of Optics, seventh ed. (Cambridge University Press, 1999).

OCIS Codes
(110.0180) Imaging systems : Microscopy
(170.4460) Medical optics and biotechnology : Ophthalmic optics and devices
(170.4470) Medical optics and biotechnology : Ophthalmology
(170.6900) Medical optics and biotechnology : Three-dimensional microscopy
(180.6900) Microscopy : Three-dimensional microscopy

ToC Category:
Medical Optics and Biotechnology

History
Original Manuscript: September 5, 2012
Revised Manuscript: October 15, 2012
Manuscript Accepted: October 17, 2012
Published: October 29, 2012

Virtual Issues
Vol. 7, Iss. 12 Virtual Journal for Biomedical Optics

Citation
Sabah Chetty and Steve Gruppetta, "Structured illumination microscopy for in-vivo human retinal imaging: a theoretical assessment," Opt. Express 20, 25700-25710 (2012)
http://www.opticsinfobase.org/vjbo/abstract.cfm?URI=oe-20-23-25700



Supplementary Material


» Media 1: MOV (4411 KB)     
