Biomedical Optics Express

  • Editor: Joseph A. Izatt
  • Vol. 3, Iss. 2 — Feb. 1, 2012
  • pp: 225–239

Adaptive optics with pupil tracking for high resolution retinal imaging

Betul Sahin, Barbara Lamory, Xavier Levecq, Fabrice Harms, and Chris Dainty


http://dx.doi.org/10.1364/BOE.3.000225



Abstract

Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics.

© 2012 OSA

1. Introduction

Imaging the human retina at high resolution may have an impact on diverse areas of research such as the treatment of retinal diseases, human cognition, the nervous system and metabolism, as the highly transparent retina is an extension of the brain and includes blood vessels. Lower order aberrations of the eye, i.e., myopia and hyperopia, can be corrected by spectacles, whereas the rapidly changing higher order (i.e., more irregular) aberrations, which exist in all human eyes in low magnitudes, cannot be corrected efficiently by conventional refractive optics.

Adaptive optics is the assembly of auxiliary tools used to compensate for changing aberrations in real time in an optical system. A wavefront corrector (reflective or refractive) is essential in such a system, while the algorithm that controls the wavefront reshaping process can be based on wavefront sensor measurements or on another relevant parameter such as image quality [1]. Adaptive optics was first used in astronomical telescopes and was adapted to ophthalmology by Liang et al. [2], where the higher order aberrations of the eye were corrected in an open loop using a static deformable mirror and wavefront sensing. This was followed by dynamic corrections which used the deformable mirror and the wavefront sensor in a closed loop and resulted in greatly improved resolution of retinal images [3].

Higher order aberrations of the eye are usually attributed to the shape and position of the lens and the surface of the cornea [4]; among the major causes of the rapid changes of the aberrations with respect to the wavefront sensor are head and fixational eye movements, crystalline lens fluctuations and changes in the thickness of the tear film. Fixational eye movements, i.e., tremors, drifts and micro saccades, are an important part of vision: our nervous system relies on visual adaptation, and if our eyes were to stay still, the world would fade from view [5]. Tremor is a wavelike motion of the eyes with small amplitudes (approximately the diameter of a cone in the fovea), while micro saccades are fast (10–100 deg/s), jerky eye movements that correct for the displacements caused by drifts (0.5 deg/s), which carry the eye away from the fixation target. Changes in the higher order aberrations of the eye can be as fast as 80 Hz, as measured by a 300 Hz bandwidth wavefront sensor [6], which is reasonable considering that high frequency components of eye movements, i.e., tremors, have been measured at ∼88 Hz [7].

Tracking the fixational eye movements in real time to estimate and compensate for the change of aberrations caused by the translations of the pupil with respect to the wavefront sensor can be an efficient and cost effective way of improving the resolution of retinal imaging systems designed for clinical research. In the following sections we will first briefly describe the pupil tracking system developed for this purpose, then introduce the method of its integration into the adaptive optics of a compact retinal imaging system and finally demonstrate the experimental results, where higher order aberrations of a moving model eye and three human subjects in vivo were compensated for in real time with the contribution of pupil tracking.

2. Methods

2.1. Pupil tracking

The pupil tracker measures the position of the center of the pupil in real time in a video of eye images, assuming all displacements are horizontal and vertical translations within the central ±10° of the visual field [8]. The pupil tracking algorithm is based on thresholding the histogram of homogeneously illuminated eye images produced by near infrared LED arrays (950 nm), in which the pupil is the darkest part; see Fig. 1.

Fig. 1 The near infrared eye image showing the superimposed parabolic fits to the pupil borders, f_right(x, y) and f_left(x, y); also indicated are their calculated minima or maxima, (x_right, y_right) and (x_left, y_left).

After the right and left pupil borders are detected by thresholding, they are filtered and parabolic fits are made to estimate their minima or maxima. First the horizontal positions of the minimum or maximum of the parabolas of the pupil borders are derived, followed by the vertical positions. Calculation of the coordinates of the right and left pupil border peaks, (x_right, y_right) and (x_left, y_left), is followed by the calculation of the center and the diameter of the pupil by Eq. (1),

(x_center, y_center, D) = ((x_right + x_left)/2, (y_right + y_left)/2, x_right − x_left).   (1)

The diameter of the pupil, which is not directly used in the adaptive optics control algorithm based on pupil tracking, is also estimated by the pupil tracker, as it provides useful insight into the pupil dynamics.
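Eq. (1) translates directly into code; the function below is a straightforward transcription (the names are ours):

```python
def pupil_center_and_diameter(x_right, y_right, x_left, y_left):
    """Pupil center and diameter from the peaks of the parabolic fits to
    the right and left pupil borders, following Eq. (1)."""
    x_center = (x_right + x_left) / 2.0
    y_center = (y_right + y_left) / 2.0
    diameter = x_right - x_left      # horizontal distance between the peaks
    return x_center, y_center, diameter

# Example: border peaks at (320, 240) and (120, 238) pixels.
center_x, center_y, d = pupil_center_and_diameter(320, 240, 120, 238)
```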

When a model eye was used, measurements of the position of a pupil moving within the central ±1 mm of the eye image (visual field of ±3°) and of a pupil close to the edge of the eye image had different accuracies, 6 ± 2 μm and 11 ± 8 μm respectively, because the pupil borders were truncated when the pupil was close to the edge. Pupil diameter measurements had a precision of 1 μm for the model eye and 20 μm in vivo, assuming all the deviations in the measured diameters were due to tracking after the subjects’ pupils were temporarily paralysed (Tropicamide 1%). The precision of the position of the pupil center in vivo on the x axis can be estimated indirectly as 10 μm, i.e., half of the precision of the pupil diameter in vivo, based on the calculation of the pupil diameter in Eq. (1). Misalignment of the eye with respect to the system or a defocused image of the eye (which is normally corrected by the operator immediately) decreases the sharpness of the pupil-iris border and contributes to the error of the pupil center estimation. The accuracy of the pupil tracker when the model eye (focused well at the center of the image) was moved by 5 mm forwards and backwards was 15 ± 4 μm [9].

The pupil tracker could follow all the drifts and most micro saccades (with speeds up to 50 deg/s on the retina; 25 mm/s on the pupil plane) with its default accuracy, as estimated using a rotating model eye. Its ability to follow fast eye movements was inversely proportional to the exposure time of the tracking camera, as a shorter exposure time means less motion blur in the acquired image. As a result of an unexpected software discrepancy, the pupil tracker, which worked at ∼85 Hz in continuous mode, worked at only ∼20 Hz when triggered after being integrated into the retinal imaging system. The reduction in the execution rate of the tracker was due to the processes after the camera exposure; therefore the accuracy of measurements taken during fast eye movements was not affected. But because of the reduced rate, the pupil tracker could not notify the adaptive optics control algorithm fast enough to compensate for those movements in time, the consequences of which will be discussed in the following sections.

Commercial eye trackers have moderate accuracies (0.5°, i.e., approximately 150 microns on the pupil plane) and span a wide field of view (40°–50°) at high rates (500–1000 Hz), but at high cost; in contrast, the pupil tracker described above aims for high accuracy over the short range of fixational eye movements at a low price. Although the response time of the pupil tracker needs to be improved, given the present exposure times of the retinal imaging camera very fast tracking rates do not seem necessary, as most of the retinal images acquired during fast eye movements suffer from serious motion blur and are eliminated.

2.2. Adaptive optics

The adaptive optics fundus camera designed for clinical research (rtx1, Imagine Eyes, France) is a compact system that can produce 4° × 4° high resolution images of the retina, especially of the cone photoreceptor mosaic [10, 11]. Its adaptive optics comprises a 52-actuator magnetic membrane deformable mirror, a Shack-Hartmann wavefront sensor with 32 × 40 lenslets (mirao 52-e and HASO 32-eye, both from Imagine Eyes, France) and a superluminescent diode of 750 nm central wavelength that serves the sensor.

In a classical adaptive optics correction based on wavefront sensor measurements, the control algorithm calculates the command vector v to be applied to the deformable mirror at each loop, using the measured slopes vector s of the wavefront, as shown in Eq. (2),

v = I⁺ × s,   (2)

where I⁺ is the pseudo-inverse of the interaction matrix I that was recorded beforehand.
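A minimal sketch of Eq. (2), using a randomly generated interaction matrix in place of a calibrated one (the slope count assumes one x and one y slope per lenslet of the 32 × 40 array; the real matrix is recorded by poking each actuator):

```python
import numpy as np

n_actuators = 52                 # mirao 52-e actuator count
n_slopes = 2 * 32 * 40           # x and y slope per lenslet (assumed layout)
rng = np.random.default_rng(0)

# Interaction matrix I: slope response of the sensor to a unit poke of
# each actuator, recorded once during calibration (random here, for
# illustration only).
I = rng.standard_normal((n_slopes, n_actuators))
I_plus = np.linalg.pinv(I)       # Moore-Penrose pseudo-inverse, I^+

# Eq. (2): mirror command vector from a measured slopes vector s.
s = rng.standard_normal(n_slopes)
v = I_plus @ s
```

Because the system is heavily overdetermined (many more slopes than actuators), the pseudo-inverse gives the least-squares command for the measured slopes.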

We developed a new adaptive optics control algorithm based on pupil tracking measurements so that the deformable mirror could also be driven by pupil tracking; see Sahin et al. [12, 13]. Figure 2 shows the basic schematic of the final adaptive optics part of the retinal imaging system, which has control algorithms based on both wavefront sensing and pupil tracking. The two can be used separately or synchronously in the same loop.

Fig. 2 Adaptive optics system for retinal imaging: the light sources for wavefront sensing (dashed), imaging (solid) and pupil tracking (dotted) that are reflected off the eye are selectively filtered by two dichroic beam splitters (BS1 and BS2). The first and second control algorithms are called based on wavefront sensor and pupil tracking measurements respectively to calculate the commands for the desired shape so that the deformable mirror reshapes the imaging beam (the paths showing light sources entering the eye and optics necessary to conjugate the wavefront sensor and the deformable mirror to the pupil of the eye are not shown for simplicity).

2.3. Adaptive optics with pupil tracking

Fig. 3 (top row) Plots of the polynomial representing coma, f(x, y) = 3y^3 + 3x^2y − 2y, and its derivative on the x axis, f_x = 6xy; (bottom row) subtraction of 3 × f_x from f(x, y) yields f(x, y) − 3 × f_x = 3y^3 + 3x^2y − 18xy − 2y, which is equivalent to f(x − 3, y) except for the tilt term.

Figure 3 describes the basic idea of the method behind the control algorithm based on pupil tracking, using the Zernike polynomial that represents pure coma aberration, f(x, y), its derivative on the x axis, f_x, and the coma shifted by three units on the x axis, f(x − 3, y), which is equal to f(x, y) − 3 × f_x except for the tilt; all are plotted in Cartesian coordinates using Matlab 7.5.
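The identity illustrated in Fig. 3 can be verified numerically; the sketch below checks that the derivative-based estimate of the shifted coma differs from the true shifted polynomial only by a pure tilt term:

```python
import numpy as np

# Coma-like polynomial of Fig. 3 and its derivative on the x axis.
f  = lambda x, y: 3 * y**3 + 3 * x**2 * y - 2 * y
fx = lambda x, y: 6 * x * y

x, y = np.meshgrid(np.linspace(-5, 5, 101), np.linspace(-5, 5, 101))

shifted   = f(x - 3, y)             # wavefront after a 3-unit pupil shift
estimated = f(x, y) - 3 * fx(x, y)  # estimate built from the derivative

# The difference is exactly the tilt term 27*y, which does not degrade
# the image and can be ignored by the correction.
residual = shifted - estimated
```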

2.4. Software development

After the control algorithms were programmed in both C++ and Labview, the adaptive optics of the imaging system comprised five elements: the deformable mirror, the wavefront sensor, the control algorithm based on wavefront sensing, the pupil tracker and the control algorithm based on pupil tracking. Using these five elements, four different types of loops (i.e., repeating measurements of their kind) were constructed in C++ and Labview; see Fig. 4.

Fig. 4 Four loops named WFPT, AOPT, WFPTa and AOPTL designed for the experiments all of which worked at ∼8.4 Hz. The deformable mirror (DM), the wavefront sensor (WFS), the pupil tracker (PT), the control algorithm based on wavefront sensing (CA1) and the control algorithm based on pupil tracking (CA2) are shown accordingly. The solid line linking two elements means that there is a feedback mechanism controlled by one of the algorithms. The deformable mirror outside the WFPTa loop means that it is not updated at each loop: it is static.

The first and simplest loop is WFPT, which incorporates the wavefront sensor and the pupil tracker only. There is no adaptive optics correction in this loop, as it is intended to provide information on the aberration dynamics of the eye in relation to pupil displacements. Second, the AOPT loop incorporates the wavefront sensor, the deformable mirror, the pupil tracker and the control algorithm based on wavefront sensing. This loop provides the classical adaptive optics correction in real time based on wavefront sensing, while pupil tracking is also active without contributing to the correction. Third, the WFPTa loop incorporates the wavefront sensor, the pupil tracker and the deformable mirror, which corrects for the aberrations statically. This loop is carried out by keeping the deformable mirror at its last correction shape after the aberrations of the eye are corrected by the AOPT loop; it thus provides insight into the correlation between eye movements and induced aberrations. Lastly, the AOPTL loop incorporates the wavefront sensor, the deformable mirror, the pupil tracker and the control algorithm based on pupil tracking. This loop corrects for the aberrations of the eye in real time based on pupil tracking measurements, while the wavefront sensor is also active although it does not contribute to the correction in real time.
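The four loop configurations can be summarized as element sets (our own shorthand, not code from the paper; abbreviations as in Fig. 4):

```python
# DM: deformable mirror, WFS: wavefront sensor, PT: pupil tracker,
# CA1/CA2: control algorithms based on wavefront sensing / pupil tracking.
LOOPS = {
    "WFPT":  {"WFS", "PT"},               # measurement only, no correction
    "AOPT":  {"WFS", "PT", "DM", "CA1"},  # classical closed-loop correction
    "WFPTa": {"WFS", "PT", "DM(static)"}, # DM frozen at its last AOPT shape
    "AOPTL": {"WFS", "PT", "DM", "CA2"},  # correction driven by pupil tracking
}

# Only AOPT and AOPTL update the mirror in real time.
realtime = {name for name, parts in LOOPS.items() if "DM" in parts}
```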

All of the loops provided recordings of the pupil position and diameter and of the wavefront slopes of the aberrations of the eye. Although the retinal camera was not engaged in these first experiments, as they aimed to assess the wavefront aberration correction only, Table 1 gives the exposure and acquisition times of all the cameras in the system. The exposure and acquisition times not only define the execution rate of the adaptive optics loop but also influence the outcome drastically, as will be discussed in the following sections. The cameras were synchronised in time, and all of the loops were executed at ∼8.4 Hz.

Table 1. Exposure and acquisition times for the cameras in the retinal imaging system and calculation times for the control algorithms.


3. Results and discussions

3.1. Experiments with a model eye

The first experiments were carried out using a model eye with spherical aberration, which had a pupil of 7 mm diameter and was simply a rod lens with a convex top and a reflective, diffusing surface attached at the other end. Unlike the human eye, the model eye had no intrinsic factors that would result in rapid changes in aberrations; therefore the measured aberration changes were expected to be purely dependent on motion. This enabled us to evaluate the ability of the control algorithm based on pupil tracking to compensate for the aberrations in real time. The model eye was attached to a robust and stable stage positioned in front of the retinal imaging system, and the loops were executed while the stage was pushed and pulled away mechanically by hand, resulting in a quasi-periodic motion on a micron scale; see Fig. 5.

Fig. 5 Wavefront RMS of the WFPT and residual wavefront RMSs of the AOPT, WFPTa and AOPTL loop experiments with a model eye along with the simulation of the WFPTa loop and its residual error RMS; also shown in the same color is the respective pupil positions (P.) of the loops. The data represents only one session performed for each type of loop and it is not an average of several sessions.

Fig. 6 Correlation of the RMS of the measured and the simulated wavefronts of the moving eye in an open loop (WFPTa). The data was categorized into two: the green colored data indicates that the model eye was moving away from the initial position, and the purple-grey colored data was taken while the eye was returning back to its initial position.

A careful examination of the simulation (blue dotted line) of the WFPTa loop (blue solid line) shows that there is a time lag between the pupil tracker and the wavefront sensor. In the correlation plot of the measured and simulated wavefronts there was hysteresis: the plot was elliptical, not linear; see Fig. 6. During the experiments the model eye was pushed and pulled away from its initial position by hand, and this was immediately followed by a come-back because of the robustness of the setup. The disturbance (going away) took a longer time than the restoration (coming back) to the initial position, as can be clearly seen by comparing the numbers of green and purple-grey colored data points, which represent the model eye going away and coming back towards the resting position respectively, in Fig. 6. When the eye was going away, the control algorithm based on pupil tracking underestimated the aberrations, and when the eye was coming back it overestimated them, because the pupil tracker was always measuring the position of the pupil with a slight error, i.e., it was following behind.

This systematic error originated mainly from the synchronisation of the exposures of the wavefront sensor and pupil tracking cameras in time. During the experiments the exposures of the wavefront sensor and pupil tracker cameras started at the same time, while their durations were different: 30 ms and 10 ms respectively; see Table 1. Therefore, after the pupil tracker completed its exposure, the wavefront sensor continued for another 20 ms, which became a source of error in the estimation of the position of the wavefront. A pupil tracker camera that started its exposure 10 ms after the wavefront sensor camera might have given a better estimate of the position of the measured wavefront.
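The effect of such an exposure mismatch can be reproduced with a toy simulation (illustrative only, not the authors' data): lagging one of two otherwise identical sinusoidal traces by 20 ms lowers their correlation below unity, which is what opens a measured-versus-simulated scatter plot into an ellipse.

```python
import numpy as np

t = np.linspace(0, 2, 2001)                      # 2 s sampled at 1 kHz
motion = np.sin(2 * np.pi * 1.5 * t)             # hypothetical pupil motion
lagged = np.sin(2 * np.pi * 1.5 * (t - 0.020))   # same motion seen 20 ms late

# With no lag the traces would correlate perfectly (r = 1); the 20 ms lag
# lowers r below 1 and makes the scatter of one against the other elliptical.
r = np.corrcoef(motion, lagged)[0, 1]
```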

The following in vivo measurements were taken before the synchronisation was updated; therefore this time lag might have been responsible for the large spikes of RMS error during the aberration correction based on pupil tracking, especially when the eye made fast and large movements.

3.2. Experiments in vivo

The experiments were performed in vivo with healthy volunteers (the system presents no potential hazard under ISO 15004-2:2007 [11]) positioned in front of the adaptive optics retinal camera, with their heads stabilized by a standard ophthalmic chinrest. They were asked to fixate on the dim red point image of the superluminescent diode light source of the wavefront sensor. No pupil dilating medication or other solutions were applied. All of the aberration corrections were carried out by the deformable mirror; correcting lenses or the Badal were not used. Figure 7 displays the results of Subject 1, aged 22, for the WFPT, AOPT and WFPTa loops and the simulation for the WFPTa loop.

Fig. 7 Subject 1 : (a) Wavefront RMS of the WFPT and residual wavefront RMSs of the (b) AOPT, (c) WFPTa loop experiments in vivo with Zernike orders up to five (tilt or defocus terms are not included in the total RMS or in the Zernike coefficients). The square and diagonal markers indicate a discontinuity in the wavefront sensor and pupil tracker measurements respectively. (d) The simulated wavefront for the WFPTa loop and its error RMS and (e) its correlation with the measured wavefront RMS are also shown. The data represents only one session performed for each type of loop and it is not an average of several sessions.

The mean RMS of the wavefront measurements for the WFPT loop, i.e., the average wavefront aberrations of the subject’s eye (excluding defocus), was 2.16 ± 0.11 μm; see Fig. 7(a). The most dominant component of the aberrations was the second order Zernike, i.e., astigmatism, followed by the third order, i.e., coma. In the course of the AOPT loop, during which the pupil moved 51 ± 67 μm on average between measurements, the mean residual wavefront RMS was measured to be 0.12 ± 0.05 μm; see Fig. 7(b). The mean residual RMS was 0.39 ± 0.09 μm for the WFPTa loop, while the pupil moved 67 ± 72 μm on average between measurements; see Fig. 7(c). After the WFPTa experiment, the changes in the wavefront aberrations were simulated using the pupil tracking data, the first wavefront slopes measurement and a reference wavefront, yielding a mean wavefront RMS of 0.35 ± 0.10 μm (the mean error RMS of the simulations was 0.21 ± 0.07 μm); see Fig. 7(d). The measured and simulated wavefronts were correlated with a correlation coefficient of 0.9, confirming that the algorithm was able to simulate the changes in the aberrations due to motion; see Fig. 7(e).

Finally the AOPTL loop, i.e., correction for the aberrations in real time based on pupil tracking, was performed giving a mean residual wavefront RMS of 0.21 ± 0.08 μm; see Figs. 8 and 9.

Fig. 8 Subject 1: Residual wavefront RMS and Zernike orders up to five for the AOPTL loop in vivo (tilt or defocus terms are not included in the total RMS or in the Zernike coefficients). The square and diagonal markers indicate a discontinuity following that moment in the wavefront sensor and pupil tracker measurements respectively. The data represents only one session performed for each type of loop and it is not an average of several sessions.
Fig. 9 Subject 1: Wavefront RMS of the WFPT and residual wavefront RMSs of the AOPT, WFPTa and AOPTL loop experiments in vivo along with the residual error of the WFPTa simulations and their respective pupil positions (P.) shown in the same color.

3.3. Discussions

Table 2 gives a summary of the experiments done with the model eye and experiments in vivo for three subjects.

Table 2. Measurement data for the model eye, Subject 1 (aged 22), Subject 2 (aged 27) and Subject 3 (aged 40); PS is the mean pupil shift at each loop; PA is the area within which the pupil center normally stayed (2σx × 2σy); NL is the number of lenslets used for the wavefront sensor measurements; DP is the mean diameter of the pupil of the eye during the measurements; WFPTa Sim is the simulated wavefront; WFPTa Err is the residual wavefront error of the simulations. All units are microns, except for PA (μm²) and NL (lenslets).


Subject 2 (aged 27) had the same mean total aberration RMS as the model eye, with a smaller pupil and larger eye movements (the WFPT loop). One would expect the correction based on pupil tracking, i.e., the AOPTL loop, to be more efficient with a model eye, as it had no factors other than motion that could cause aberration changes. However, Subject 2 and the model eye had similar mean residual wavefront error RMSs for the AOPTL loop. While Subject 2 of course had a varying pupil size, their mean pupil diameters and mean pupil displacements were also similar. In the case of Subject 2, the intrinsic factors that caused rapid changes in the higher order aberrations were minor. Simulations for Subject 2 had a large residual error RMS (0.19 ± 0.03 μm), probably due to a wrong choice of reference wavefront that did not represent the aberrations of the eye well. The choice of the reference wavefront is crucial for the algorithm to estimate the displaced wavefronts. Subject 3 (aged 40) had equal mean residual aberration RMSs for the AOPT and AOPTL loops. In this case the performance of the AOPT loop, i.e., the correction based on wavefront sensing, was below average, probably due to a wrong wavefront sensor measurement at the beginning of the iterations.

Although the defocus term is not included in the calculated RMSs, and there are not enough data for a statistical analysis that would establish what percentage of the aberration changes can be corrected based on pupil tracking, the data presented are promising. The proposed method is not superior to the classical correction when used alone, but adaptive optics may be improved when pupil tracking is used in collaboration with wavefront sensing. This is the major purpose of this research and is detailed in the following section.

3.3.1. Adaptive optics with wavefront sensing and pupil tracking

Figure 10 shows the basic schematic of the loop configuration by which pupil tracking is intended to enhance the correction based on wavefront sensing.

Fig. 10 The adaptive optics loop which incorporates all the active elements: the deformable mirror (DM), the wavefront sensor (WFS), the pupil tracker (PT), the control algorithm based on wavefront sensing (CA1) and the control algorithm based on pupil tracking (CA2), where the pupil tracker and the deformable mirror are called twice in a loop. The solid lines linking the elements mean that there is a feedback mechanism controlled by one of the algorithms.

In this loop the deformable mirror is called twice in order to compensate for the shift of the eye that takes place during the exposure and acquisition of the wavefront sensing camera and the calculations done by the control algorithm. In wavefront sensing there is a compromise between speed and precision, and fast wavefront sensing also comes at a high cost. High precision at a reasonable cost may lead to slow wavefront sensing: e.g., a large number of lenslets necessitates a longer calculation time for the control algorithm and longer exposure times, as the light beam is shared between more lenslets. We propose that pupil tracking can be used, at no extra cost, to compensate for the increased sensing time.
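One iteration of this combined loop might be sketched as follows (an illustration of the idea only; the callables standing in for the hardware interfaces are hypothetical):

```python
def combined_loop_iteration(read_pupil, measure_slopes, apply_dm,
                            ca1_commands, ca2_commands):
    """One iteration of the combined loop of Fig. 10 (sketch only).

    The deformable mirror is driven twice: first from the slow wavefront
    sensor measurement (CA1), then from a fresh pupil position (CA2) to
    compensate for the eye movement that occurred while the wavefront
    measurement was being acquired and processed."""
    pupil_before = read_pupil()           # pupil position at sensing time
    slopes = measure_slopes()             # slow: exposure + acquisition
    apply_dm(ca1_commands(slopes))        # first DM update (wavefront sensing)
    shift = read_pupil() - pupil_before   # eye movement during sensing
    apply_dm(ca2_commands(shift))         # second DM update (pupil tracking)

# Toy run: the pupil drifts 0.05 mm while the slopes are being measured.
positions = iter([1.00, 1.05])
commands = []
combined_loop_iteration(
    read_pupil=lambda: next(positions),
    measure_slopes=lambda: [0.2, -0.1],
    apply_dm=commands.append,
    ca1_commands=lambda s: ("wfs", s),
    ca2_commands=lambda d: ("pt", round(d, 6)),
)
```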

3.3.2. Power spectra

Fig. 11 Power spectra of the WFPT loop (70 s) and pupil tracking (214 s) data showing a 1/f^α like trend (several recordings were concatenated for a longer sequence).

Recently Hampson et al. [17] applied wavelet based fractal analysis, which is best suited when the goal is to analyse self similarity in time series [20], and confirmed the multifractal (containing more than one self similar process) nature of the aberration dynamics of the eye. It is not only the movements: the branching of the neurons and of the small vessels in the retina also has fractal properties [21]. Taking the fractal properties of the aberration dynamics into account may lead to better control algorithms in the future.

4. Conclusions

Using an adaptive optics retinal camera developed for clinical research, it was shown that changes in the higher order aberrations of the eye, including astigmatism, were highly correlated with pupil displacements. Based on this fact, it was possible to correct for the aberrations of the eye in real time using a reference wavefront measurement, a pupil tracker and a deformable mirror, without the real time contribution of wavefront sensor measurements.

Although it could not be tested, a better and more stable adaptive optics correction might be achieved at no extra cost in a configuration where a fast pupil tracker works in collaboration with the wavefront sensor and the deformable mirror is called more than once. A moving phase plate (specific to the individual’s ocular aberrations) and a pupil tracker could also be used to upgrade an old fashioned retinal camera to an adaptive optics retinal camera at little cost.

A better correction of the ocular aberrations as measured by the wavefront sensor does not always ensure high resolution retinal images, even with healthy eyes. Overcoming the challenges of imaging all types of eyes at high resolution and developing a modality suitable for clinical use seem to require a better understanding of visual processes, especially the function and importance of eye movements in vision.


References and links

1. H. Hofer, N. Sredar, H. Queener, C. Li, and J. Porter, “Wavefront sensorless adaptive optics ophthalmoscopy in the human eye,” Opt. Express 19(21), 14160–14171 (2011). [CrossRef] [PubMed]

2. J. Liang, D. R. Williams, and D. T. Miller, “Supernormal vision and high-resolution retinal imaging through adaptive optics,” J. Opt. Soc. Am. A 14(11), 2884–2892 (1997). [CrossRef]

3. H. Hofer, P. Artal, B. Singer, J. L. Aragon, and D. R. Williams, “Dynamics of the eye’s wave aberration,” J. Opt. Soc. Am. A 18(3), 497–506 (2001). [CrossRef]

4. M. Zhu, M. Collins, and D. R. Iskander, “Microfluctuations of wavefront aberrations of the eye,” Ophthal. Physiol. Opt. 24(6), 562–571 (2004). [CrossRef]

5. S. Martinez-Conde, S. L. Macknik, and D. Hubel, “The role of fixational eye movements in visual perception,” Nat. Rev. Neurosci. 5, 229–240 (2004). [CrossRef] [PubMed]

6. T. Nirmaier, G. Pudasaini, and J. Bille, “Very fast wave-front measurements at the human eye with a custom CMOS-based Hartmann–Shack sensor,” Opt. Express 11(21), 2704–2716 (2003). [CrossRef] [PubMed]

7. N. Collins, M. alKalbani, G. Boyle, C. Baily, D. Kilmartin, and D. Coakley, “Characterisation of the tremor component of fixational eye movements,” Special issue Conference Abstracts, 14th European Conference on Eye Movements, J. Eye Movem. Res. 1 (ECEM2007 Abstracts), 54 (2007).

8. B. Sahin, F. Harms, B. Lamory, and L. Vabre, “A pupil tracking system for adaptive optics retinal imaging,” Proc. SPIE 6991, 69910G (2008). [CrossRef]

9. B. Sahin, F. Harms, and B. Lamory, “Performance assessment of a pupil tracking system for adaptive optics retinal imaging,” Proc. SPIE 7139, 713911 (2008). [CrossRef]

10. M. Zacharria, B. Lamory, and N. Château, “Biomedical imaging: new view of the eye,” Nat. Photon. 5(1), 24–26 (2011). [CrossRef]

11. C. Viard, K. Nakashima, B. Lamory, M. Pâques, X. Levecq, and N. Château, “Imaging microscopic structures in pathological retinas using a flood-illumination adaptive optics retinal camera,” Proc. SPIE 7885, 788509 (2011). [CrossRef]

12. B. Sahin, B. Lamory, X. Levecq, L. Vabre, and C. Dainty, “Retinal imaging system with adaptive optics enhanced with pupil tracking,” Proc. SPIE 7885, 788517 (2011). [CrossRef]

13. B. Sahin, “Correction of the aberrations of the eye using adaptive optics with pupil tracking,” Ph.D. thesis (School of Physics, National University of Ireland, Galway, 2011); http://optics.nuigalway.ie/theses.

14. G.-M. Dai, Wavefront Optics for Vision Correction (SPIE, 2008). [CrossRef]

15. A. Guirao, I. G. Cox, and D. R. Williams, “Effect of rotation and translation on the expected benefit of an ideal method to correct the eye’s higher order aberrations,” J. Opt. Soc. Am. A 18(5), 1003–1015 (2001). [CrossRef]

16. L. Diaz-Santana, C. Torti, I. Munro, P. Gasson, and C. Dainty, “Benefit of higher closed-loop bandwidths in ocular adaptive optics,” Opt. Express 11(20), 2597–2605 (2003). [CrossRef] [PubMed]

17. K. M. Hampson and E. H. Mallen, “Multifractal nature of ocular aberration dynamics of the human eye,” Biomed. Opt. Express 2(3), 464–477 (2011). [CrossRef] [PubMed]

18. W. H. Press, “Flicker noises in astronomy and elsewhere,” Comments Astrophys. 7(4), 103–119 (1978).

19. D. Aks, G. J. Zelinsky, and J. C. Sprott, “Memory across eye-movements: 1/f dynamic in visual search,” Nonlinear Dynam. Psychol. Life Sci. 6(1), 1–25 (2002). [CrossRef]

20. A. L. Goldberger, L. A. N. Amaral, J. M. Hausdorff, P. C. Ivanov, C.-K. Peng, and H. E. Stanley, “Fractal dynamics in physiology: alterations with disease and aging,” Proc. Natl. Acad. Sci. U.S.A. 99(3), 2466–2472 (2002). [CrossRef] [PubMed]

21. V. I. H. Kwa and O. L. Lopez, “Fractal analysis of retinal vessels: Peeping at the tree of life?” Neurology 74(14), 1088–1089 (2010). [CrossRef] [PubMed]

OCIS Codes
(170.3890) Medical optics and biotechnology : Medical optics instrumentation
(170.4460) Medical optics and biotechnology : Ophthalmic optics and devices
(100.4999) Image processing : Pattern recognition, target tracking
(110.1080) Imaging systems : Active or adaptive optics

ToC Category:
Active and Adaptive Optics

History
Original Manuscript: October 24, 2011
Revised Manuscript: December 19, 2011
Manuscript Accepted: December 21, 2011
Published: January 3, 2012

Citation
Betul Sahin, Barbara Lamory, Xavier Levecq, Fabrice Harms, and Chris Dainty, "Adaptive optics with pupil tracking for high resolution retinal imaging," Biomed. Opt. Express 3, 225-239 (2012)
http://www.opticsinfobase.org/boe/abstract.cfm?URI=boe-3-2-225

