## Marginal blind deconvolution of adaptive optics retinal images

Optics Express, Vol. 19, Issue 23, pp. 23227-23239 (2011)

http://dx.doi.org/10.1364/OE.19.023227


### Abstract

Adaptive-optics-corrected flood imaging of the retina has been in use for more than a decade and is now a well-developed technique. Nevertheless, raw AO flood images are usually of poor contrast because of the three-dimensional nature of the imaging: the image contains information coming from both the in-focus plane and the out-of-focus planes of the object, which also leads to a loss in resolution. Interpretation of such images is therefore difficult without appropriate post-processing, which typically includes image deconvolution. The deconvolution of retinal images is difficult because the point spread function (PSF) is not well known, a problem known as blind deconvolution. We present an image model for the problem of imaging a 3D object with a 2D conventional imager, in which the recorded 2D image is the convolution of an invariant 2D object with a linear combination of 2D PSFs. The blind deconvolution problem then boils down to estimating the coefficients of this linear combination. We show that the conventional method of joint estimation fails even for a small number of coefficients. We derive a marginal estimation of the unknown parameters (PSF coefficients, object power spectral density and noise level), followed by a MAP estimation of the object. We show that the marginal estimation has good statistical convergence properties, and we present results on simulated and experimental data.

© 2011 OSA

## 1. Introduction


- imaging is fundamentally 3D whereas we only record 2D images; this must be taken into account in the image model in order to enable a high-quality deconvolution;
- the point spread function (PSF) is not well known, therefore we must estimate the PSF together with the object, a problem known as blind deconvolution.

Our approach relies on the hypothesis that the object is essentially invariant over the depth of focus of the instrument (*i.e.*, the photoreceptor size does not vary significantly over the depth of focus of the instrument and the photoreceptors are more or less parallel to the optical axis). We show that this hypothesis, although it is a simplifying one, is very effective on experimental AO retinal images, with a visible and measurable effect on the lateral resolution of the images. Section 2 presents the imaging model and the PSF parameterization we will use. In Section 3, we describe the joint estimation of the object and the PSF before showing, both on simulations and theoretically, that it is not suited for our problem. In Section 4, we derive a marginal estimator and show its performance on simulations. In Section 5, we show results of blind marginal deconvolution of experimental *in vivo* retinal images. Section 6 summarizes the results.

## 2. Imaging model and PSF parameterization

If we describe the recorded data as a stack **i**_{3D} of 2D images focused at different depths in the object, a reasonable image formation model, after background subtraction, can be written as a 3D convolution:

**i**_{3D} = **h**_{3D} *_{3D} **o**_{3D} + **n**,

where **i**_{3D} is the 3D image, **o**_{3D} is the 3D object, *_{3D} denotes the 3D convolution operator, **h**_{3D} is the 3D PSF and **n** is the noise. *α*(*z*) is the normalized flux emitted by the plane at depth *z* (∫ *α*(*z*) d*z* = 1).

Only the planes located within the depth of focus of the instrument contribute useful high-resolution information (… *μ*m for an AO flood imager, ≈ 10–15 *μ*m for a confocal imager). Indeed, planes farther than the depth of focus from the image plane contribute to the image with a PSF that has a very narrow spectrum, so their contribution is almost a constant background.

Recording only the 2D image focused at *z* = 0 for instance, we call *h*_{2D} an *effective* 2D PSF, which depends on the longitudinal brightness distribution of the object *α*(*z*) and on the 3D PSF. The image *i*(*x, y*) at the focal plane of the instrument is then the 2D convolution of a 2D object with a global PSF *h*, which is the linear combination of the individual 2D PSFs (each one conjugated with a different plane of the object) weighted by the back-scattered flux at each plane:

*h*(*x, y*) = Σ_{j} *α*_{j} *h*_{j}(*x, y*),

where we note *h*_{j}(*x, y*) ≜ *h*_{3D}(*x, y, z*_{j}) the 2D lateral PSF at depth *z*_{j}, and *α*_{j} = *α*(*z*_{j}) Δ*z*_{j}, where Δ*z*_{j} is the effective thickness of the *j*th layer. We define *α* = {*α*_{j}} as the vector of unknowns that parameterize the PSF. *α* is normalized (Σ_{j} *α*_{j} = 1) and each parameter is positive (*α*_{j} ≥ 0). We search for *h*_{2D} as a linear combination of a basis of PSFs, each corresponding to a given plane. In the following, we consider short-exposure diffractive PSFs, so that each *h*_{j} can be computed from the residual aberrations measured with a wavefront sensor (WFS) and the knowledge of the defocus of plane *z*_{j}.
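The PSF parameterization above can be sketched numerically. The following is a minimal Python sketch, not the authors' code: it assumes a circular pupil on a hypothetical 128-pixel grid and a pure-defocus Zernike phase for each layer; the names `pupil_psf`, `global_psf` and `image_model` are illustrative.

```python
import numpy as np

def pupil_psf(n=128, pupil_radius=32, defocus_rms=0.0):
    """Short-exposure diffractive PSF for a circular pupil carrying a
    pure defocus aberration of the given RMS amplitude (radians)."""
    y, x = np.indices((n, n)) - n // 2
    r2 = (x**2 + y**2) / float(pupil_radius)**2
    pupil = (r2 <= 1.0).astype(float)
    # Zernike defocus sqrt(3)*(2 r^2 - 1) has unit RMS over the unit pupil.
    phase = defocus_rms * np.sqrt(3.0) * (2.0 * r2 - 1.0)
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field))))**2
    return psf / psf.sum()                     # each h_j has unit sum

def global_psf(alphas, defoci, n=128):
    """Global PSF h = sum_j alpha_j h_j, with alpha_j >= 0, sum alpha_j = 1."""
    alphas = np.asarray(alphas, dtype=float)
    assert np.all(alphas >= 0.0) and abs(alphas.sum() - 1.0) < 1e-9
    return sum(a * pupil_psf(n, defocus_rms=d) for a, d in zip(alphas, defoci))

def image_model(obj, h, sigma, rng):
    """i = h * o + n: 2D (circular) convolution plus white Gaussian noise."""
    blurred = np.real(np.fft.ifft2(np.fft.fft2(obj) *
                                   np.fft.fft2(np.fft.ifftshift(h))))
    return blurred + rng.normal(0.0, sigma, obj.shape)
```

Since each basis PSF is normalized to unit sum, the global PSF built from normalized coefficients is automatically normalized as well.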

## 3. Joint estimation

Blind deconvolution has been widely studied (see, *e.g.*, Blanc-Féraud [5]). The conventional blind deconvolution approach is to perform an estimation of both the object and the PSF, jointly (see, *e.g.*, Ayers [6] or Mugnier [7]).

### 3.1. Method

This leads to the joint maximum *a posteriori* (JMAP) estimator:

(**ô**, *α̂*)_{jmap} = arg max_{(**o**, *α*)} *p*(**i**, **o**, *α*; *θ*) = arg max_{(**o**, *α*)} *p*(**i** | **o**, *α*; *θ*) *p*(**o**; *θ*) *p*(*α*; *θ*),

where *p*(**i**, **o**, *α*; *θ*) is the joint probability density of the data (**i**), of the 2D object (**o**) and of the PSF decomposition coefficients (*α*); it may depend on a set of regularization parameters, or hyperparameters (*θ*). *p*(**i** | **o**, *α*; *θ*) is the likelihood of the data **i**, *p*(**o**; *θ*) is the *a priori* probability density function of the object **o**, and *p*(*α*; *θ*) is the *a priori* probability density function of the coefficients *α*. In the following, we do not use any regularization on the set of coefficients *α*, because we do not have a probability law for the PSF coefficients; however, since we only need to estimate a small number of these coefficients, this is not a problem.

We assume that the noise is stationary, white and Gaussian with variance *σ*^{2}. For the object, we choose a stationary Gaussian prior probability distribution with a mean value **o**_{m} and a covariance matrix **R**_{o}. The set of hyperparameters is therefore *θ* = (*σ*^{2}, **o**_{m}, **R**_{o}). Under these assumptions, we have:

*p*(**i** | **o**, *α*; *θ*) = (2*πσ*^{2})^{−*N*^{2}/2} exp(−‖**i** − **Ho**‖^{2}/2*σ*^{2}),

*p*(**o**; *θ*) = (2*π*)^{−*N*^{2}/2} det(**R**_{o})^{−1/2} exp(−(**o** − **o**_{m})^{t} **R**_{o}^{−1} (**o** − **o**_{m})/2),

where **H** is the operator performing the convolution by the PSF **h**, det(*x*) is the determinant of matrix *x* and *N*^{2} is the number of pixels in the image.

The estimates **ô** and *α̂* can therefore be defined as the object and coefficients that minimize a criterion *J*(**o**, *α*) = *J*_{i}(**o**, *α*) + *J*_{o}(**o**), where *J*_{i}(**o**, *α*) = −ln *p*(**i** | **o**, *α*; *θ*) is a data-fidelity term and *J*_{o}(**o**) = −ln *p*(**o**; *θ*) is a regularization term. The criterion to be minimized reads:

*J*(**o**, *α*) = ‖**i** − **Ho**‖^{2}/2*σ*^{2} + (**o** − **o**_{m})^{t} **R**_{o}^{−1} (**o** − **o**_{m})/2 + C,

where C is a constant. By cancelling the derivative of *J*(**o**, *α*) with respect to the object, we obtain an analytical expression of the object **ô**(*α*; *θ*) that minimizes the criterion for a given (*α*; *θ*). Since the matrices **H** (convolution operator) and **R**_{o} (covariance matrix of an object with a stationary probability density) are Toeplitz-block-Toeplitz, we can write the joint criterion *J*_{jmap} and the analytical expression of the object **ô**(*α*, *θ*) in the Fourier domain with a circulant approximation:

*õ̂*(*ν*) = [*h̃*^{*}(*ν*) *ĩ*(*ν*) + (*S*_{n}/*S*_{o})(*ν*) *õ*_{m}(*ν*)] / [|*h̃*(*ν*)|^{2} + (*S*_{n}/*S*_{o})(*ν*)],

where *S*_{n} is the noise power spectral density (PSD), *S*_{o} is the object PSD (the new set of hyperparameters in the Fourier domain is {*S*_{n}, *S*_{o}}), *ν* is the spatial frequency and *x̃* denotes the two-dimensional Fast Fourier Transform of *x*. Substituting *õ̂* back into the criterion yields

*J*′_{jmap}(*α*; *θ*) = Σ_{ν} |*ĩ*(*ν*) − *h̃*(*ν*) *õ*_{m}(*ν*)|^{2} / [2 (|*h̃*(*ν*)|^{2} *S*_{o}(*ν*) + *S*_{n}(*ν*))] + C,

which depends only on the data **i** and is easily computed.
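The closed-form Fourier-domain object estimate described above can be sketched in a few lines. This is an illustrative Python sketch, not the authors' code; it assumes the circulant approximation (periodic boundary conditions), and the name `map_object_estimate` and its signature are hypothetical.

```python
import numpy as np

def map_object_estimate(img, h, S_o, S_n, o_mean=None):
    """Closed-form MAP (Wiener) object estimate in the Fourier domain:
        o_hat~ = (conj(h~) i~ + (S_n/S_o) o_m~) / (|h~|^2 + S_n/S_o),
    under the circulant approximation (periodic boundary conditions).
    h is a centered, unit-sum PSF; S_o and S_n are PSD maps (or scalars)."""
    h_f = np.fft.fft2(np.fft.ifftshift(h))
    i_f = np.fft.fft2(img)
    om_f = np.fft.fft2(o_mean) if o_mean is not None else 0.0
    reg = S_n / S_o                   # frequency-by-frequency regularization
    o_f = (np.conj(h_f) * i_f + reg * om_f) / (np.abs(h_f)**2 + reg)
    return np.real(np.fft.ifft2(o_f))
```

With a delta PSF and vanishing noise PSD the filter degenerates to the identity, which is a convenient sanity check of the implementation.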

### 3.2. Simulation results

- the global PSF is the sum of only two weighted PSFs, the first one **h**_{foc} being focused and the second one **h**_{defoc} defocused. We assume that the focused PSF has no aberration (AO correction is perfect), and the defocus is equal to *π* radian RMS;
- the object is a 128×128 pixel portion of an experimental AO image obtained with the XV-XX retinal imager developed by the Observatoire de Paris [2];
- the noise **n** is stationary Gaussian with a standard deviation *σ* = 0.01 × max(**o**), corresponding roughly to photon noise for an average of 10000 photons/pixel;
- the true PSF coefficient is *α* = 0.3.

In a first simulation, we assume that the object PSD *S*_{o} and the noise PSD *S*_{n} are known, although this is not the case in practice. We therefore perform a so-called “supervised” estimation of *α*: we compute the joint criterion *J*_{jmap}(*α*; *S*_{o}, *S*_{n}) (see Eq. (13)) for values of *α* ranging from 0 to 1, in order to find the value of *α* that minimizes the joint criterion. Figure 3 shows the result of such a computation.

The criterion is minimal for *α* = 1 whereas the real value of *α* is 0.3: the joint estimation fails to retrieve the actual value even in this very simple case (two point spread functions, known hyperparameters). Figure 4 shows the restored object for the value of *α* that minimizes *J*_{jmap}(*α*; *S*_{o}, *S*_{n}). The image is poorly deconvolved, since the estimated PSF is perfectly focused whereas the actual global PSF is only 30% focused.

This failure is not due to a poor choice of *S*_{n} and *S*_{o}. Actually, a close look at Eq. (13) helps us understand why the joint estimator is degenerate in this case: if, for instance, the mean object used to compute the joint criterion is constant (**o**_{m} = *β*), then, since the PSF and the set of parameters are both normalized, the numerator does not depend on the set of parameters *α*. Minimizing *J*′_{jmap} is then equivalent to maximizing the denominator, *i.e.*, to choosing the PSF with the highest MTF |*h̃*|, which is the most focused PSF.


## 4. Marginal estimation

The idea of the marginal estimator [13] is to integrate the object **o** out of the problem, *i.e.*, to marginalize the posterior likelihood [5]: we integrate the joint probability of the object **o** and the PSF parameters *α* over all the possible values of the object **o**. Once the PSF parameters *α* have been estimated, the object is restored by Wiener filtering of the image with the estimated global PSF and hyperparameters.

### 4.1. Marginal criterion

We keep the same hypotheses as before: stationary white Gaussian noise with variance *σ*^{2}, and a stationary Gaussian prior probability distribution with mean value **o**_{m} and covariance matrix **R**_{o} for the object. Since **i** is a linear combination of a Gaussian object and a Gaussian noise, it is also Gaussian. Its associated probability density reads:

*p*(**i** | *α*; *θ*) = A det(**R**_{i})^{−1/2} exp(−(**i** − **i**_{m})^{t} **R**_{i}^{−1} (**i** − **i**_{m})/2),

where A is a constant, **R**_{i} is the image covariance matrix and **i**_{m} = **Ho**_{m}. Since we only need to estimate a small number of parameters, there is no need to regularize the solution over *α*; we therefore use a Maximum Likelihood (ML) estimator rather than a Maximum *A Posteriori* (MAP) estimator.

Maximizing *p*(**i** | *α*; *θ*) is equivalent to minimizing the opposite of its logarithm:

*J*_{ML}(*α*; *θ*) = ½ ln det(**R**_{i}) + ½ (**i** − **i**_{m})^{t} **R**_{i}^{−1} (**i** − **i**_{m}) + *B*,

where *B* is a constant and **R**_{i} = **HR**_{o}**H**^{t} + *σ*^{2}**I**_{d} (**I**_{d} is the identity matrix). With the same circulant approximation as above, the marginal criterion can be written in the Fourier domain [13] as:

*J*_{ML}(*α*; *θ*) = ½ Σ_{ν} ln[|*h̃*(*ν*)|^{2} *S*_{o}(*ν*) + *S*_{n}(*ν*)] + ½ Σ_{ν} |*ĩ*(*ν*) − *h̃*(*ν*) *õ*_{m}(*ν*)|^{2} / [|*h̃*(*ν*)|^{2} *S*_{o}(*ν*) + *S*_{n}(*ν*)] + *B*.
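The marginal criterion lends itself to a simple numerical experiment. The sketch below is illustrative only (not the authors' code): it assumes a zero-mean object (*õ*_{m} = 0), uses hypothetical Gaussian-shaped OTFs as stand-ins for the focused and defocused PSFs, and simulates the data directly in the Fourier domain so that the Gaussian model holds exactly.

```python
import numpy as np

def marginal_criterion(i_f2, otf, S_o, S_n):
    """Marginal neg-log-likelihood (zero-mean object, up to a constant):
        J_ML = sum_nu [ ln(|h~|^2 S_o + S_n) + |i~|^2 / (|h~|^2 S_o + S_n) ]"""
    S_i = np.abs(otf)**2 * S_o + S_n      # image PSD predicted by the model
    return float(np.sum(np.log(S_i) + i_f2 / S_i))

# Toy two-PSF experiment, simulated directly in the Fourier domain.
n = 64
rng = np.random.default_rng(0)
fx, fy = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n))
f2 = fx**2 + fy**2
otf_foc = np.exp(-f2 / (2 * 0.2**2))      # stand-in "focused" OTF (wide)
otf_defoc = np.exp(-f2 / (2 * 0.05**2))   # stand-in "defocused" OTF (narrow)
S_o = 1.0 / (1.0 + f2 / 0.01)             # assumed object PSD
S_n = 1e-3                                # assumed known noise PSD

alpha_true = 0.3
otf_true = alpha_true * otf_foc + (1.0 - alpha_true) * otf_defoc
o_f = np.sqrt(S_o / 2) * (rng.standard_normal((n, n))
                          + 1j * rng.standard_normal((n, n)))
n_f = np.sqrt(S_n / 2) * (rng.standard_normal((n, n))
                          + 1j * rng.standard_normal((n, n)))
i_f2 = np.abs(otf_true * o_f + n_f)**2    # |i~|^2 of the simulated image

alphas = np.linspace(0.0, 1.0, 101)
J = [marginal_criterion(i_f2, a * otf_foc + (1 - a) * otf_defoc, S_o, S_n)
     for a in alphas]
alpha_hat = alphas[int(np.argmin(J))]     # minimum lies near alpha_true
```

Unlike the joint criterion, this scan exhibits a minimum close to the true mixing coefficient, which is the behavior reported for the marginal estimator.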

### 4.2. Marginal estimation results

The image was simulated in the same conditions as for the joint estimation (**i** = (*α* **h**_{foc} + (1 − *α*) **h**_{defoc}) * **o** + **n**, with *α* = 0.3). The marginal criterion of Eq. (19) was computed for 0 ≤ *α* ≤ 1 with known hyperparameters, and the object was restored by Wiener filtering. The computed criterion is shown on Figure 5 and the restored object on Figure 6.

The criterion is minimal for *α* = 0.3, which is the true value of *α* used in the simulation: the marginal estimator accurately estimates the parameter of interest. As a result, the restored object, shown in Figure 6, is much sharper than the simulated image and much closer to the actual object used in the simulation than the object restored with the joint estimation.

### 4.3. Hyperparameter estimation

The marginal estimator also allows us to estimate the hyperparameters *θ* (actually the object PSD *S*_{o} and the noise PSD *S*_{n}) together with the PSF coefficients, in an automatic manner; this is called unsupervised estimation. In order to reduce the number of hyperparameters to be estimated, we model the object PSD *S*_{o} in the following way [15]:

*S*_{o}(*ν*) = *k* / [1 + (|*ν*|/*ν*_{o})^{*p*}],

and take *S*_{n} = constant. The criterion *J*_{ML}(*α*) becomes *J*_{ML}(*α*, *S*_{n}, *k*, *ν*_{o}, *p*) and must now be minimized versus the PSF coefficients *α* and the hyperparameters *S*_{n}, *k*, *ν*_{o} and *p*.

Defining *μ* = *S*_{n}/*k* and cancelling the derivative of the criterion with respect to *k*, we obtain an analytical expression for *k̂*(*α*, *μ*, *ν*_{o}, *p*) that minimizes the criterion for a given value of the other parameters; therefore only four hyperparameters remain: *μ̂*, *ν̂*_{0}, *p̂* and *Ŝ*_{n} [13].
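The parametric PSD model above can be written as a one-line helper (illustrative sketch; the function name is hypothetical). It makes explicit how the object hyperparameters reduce to the triplet (*k*, *ν*_{o}, *p*).

```python
import numpy as np

def object_psd(nu, k, nu_o, p):
    """Parametric object PSD model S_o(nu) = k / (1 + (nu/nu_o)^p):
    flat at low frequencies, power-law decay beyond the cut-off nu_o."""
    return k / (1.0 + (np.asarray(nu, dtype=float) / nu_o)**p)
```

The model equals *k* at zero frequency and decreases monotonically with |*ν*|, which is the qualitative behavior expected of natural-object power spectra.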

### 4.4. Asymptotic properties

The ML estimator is known to be consistent, *i.e.*, it tends toward the actual values of the estimated parameters as the noise tends toward zero or as the size of the data tends toward infinity. It is also known to be asymptotically normal [12], so that the neg-log-likelihood is asymptotically quadratic, thus convex.

The convergence of the estimator was studied on simulated data, in the following conditions:

- the image is **i** = (*α* **h**_{foc} + (1 − *α*) **h**_{defoc}) * **o** + **n**, with *α* = 0.3;
- the noise RMS varies from 1% to 20% of the maximum value of the image;
- 50 noise realizations were computed for each noise RMS value;
- the simulation was performed on 3 subimages of different sizes: a 32×32 pixel central region of image **i**, a 64×64 pixel central region of image **i**, and the whole 128×128 pixel image **i**.

The estimation error decreases (*i.e.*, the estimated parameter *α* tends towards the exact value) when the noise decreases. Even more interestingly, for a given noise level, the error tends towards zero as the size of the data increases. In particular, for a 128 × 128 pixel image and a noise of *σ* = 5% of the maximum value of the image, the RMS error on the estimated PSF coefficient *α* is less than 3%. For a noise RMS value of 1% of the image maximum, the unsupervised estimator shows basically the same performance as the supervised estimation.

## 5. Preliminary experimental results

We have applied the unsupervised marginal deconvolution method to experimental *in vivo* retinal images, in the following conditions:

- the experimental image (Figure 8) is a 256 × 256 pixel image recorded on a healthy subject with the AO eye-fundus imager of the Center for Clinical Investigation of the Quinze-Vingts Hospital in Paris, developed by the Observatoire de Paris-Meudon [2];
- we model the global PSF as a linear combination of 3 PSFs, the first one being focused, the second one being defocused by *ϕ*_{1} = *π*/2 rad RMS and the third one being defocused by *ϕ*_{2} = *π* rad RMS;
- we assume that the adaptive optics has perfectly corrected the wavefront, so that the focused PSF is an Airy disk.

The set of unknown PSF coefficients is therefore *α* = {*α*_{1}, *α*_{2}, *α*_{3}}. The estimated coefficients are *α̂* = {0.24, 0.22, 0.54}; the resulting estimated PSF is shown on Figure 9. For this image, the main contribution (more than half of the energy) comes from the most out-of-focus plane, and only a little less than 25% of the energy comes from the in-focus plane.

The deconvolution notably enhances the spatial frequency corresponding to the cone photoreceptor size and separation. This frequency enhancement is clearly visible on Figure 12, which shows, as a solid line, the estimated Optical Transfer Function (OTF) of our instrument (AO flood imager + eye), as well as the deconvolution transfer function (dotted line) and the global (instrument + deconvolution) transfer function (dashed line). The deconvolution restores the spatial frequencies damped by the instrument transfer function up to 300 cycles/mm, a frequency beyond the spatial frequency of the cone photoreceptors in our image. These preliminary results show that our image model and the marginal estimator are well adapted to the deconvolution of adaptive-optics-corrected photoreceptor images. Motion artifacts due to eye movement during image acquisition, which result in blurred images, could be addressed by changing the PSF basis to include motion-induced PSFs and not only purely diffractive PSFs.

## 6. Conclusion

We have presented an image model in which the recorded 2D image is the convolution of an invariant 2D object with a global PSF that is an unknown linear combination of known 2D PSFs. We have shown that the conventional joint estimation of the object and the PSF coefficients fails even in a simple two-PSF case, and we have derived a marginal estimator that additionally estimates the object PSD and the noise level, *i.e.*, to perform an unsupervised estimation. We have shown on simulations that this estimator is capable of restoring the PSF accurately even in the unsupervised case. The good statistical properties of the unsupervised marginal estimation have also been demonstrated.

## Acknowledgments


**OCIS Codes**

(010.1080) Atmospheric and oceanic optics : Active or adaptive optics

(100.3190) Image processing : Inverse problems

(100.6890) Image processing : Three-dimensional image processing

(170.4470) Medical optics and biotechnology : Ophthalmology

(100.1455) Image processing : Blind deconvolution

**ToC Category:**

Adaptive Optics

**History**

Original Manuscript: July 12, 2011

Revised Manuscript: September 16, 2011

Manuscript Accepted: September 17, 2011

Published: November 1, 2011

**Virtual Issues**

Vol. 7, Iss. 1 *Virtual Journal for Biomedical Optics*

**Citation**

L. Blanco and L. M. Mugnier, "Marginal blind deconvolution of adaptive optics retinal images," Opt. Express **19**, 23227-23239 (2011)

http://www.opticsinfobase.org/vjbo/abstract.cfm?URI=oe-19-23-23227


## References and links

1. J. Liang, D. R. Williams, and D. T. Miller, “Supernormal vision and high-resolution retinal imaging through adaptive optics,” J. Opt. Soc. Am. A **14**, 2884–2892 (1997).
2. M. Glanc, E. Gendron, F. Lacombe, D. Lafaille, J.-F. Le Gargasson, and P. Léna, “Towards wide-field retinal imaging with adaptive optics,” Opt. Commun. **230**, 225–238 (2004).
3. J. Rha, R. S. Jonnal, K. E. Thorn, J. Qu, Y. Zhang, and D. T. Miller, “Adaptive optics flood-illumination camera for high speed retinal imaging,” Opt. Express **14**, 4552–4569 (2006).
4. A. Roorda, F. Romero-Borja, I. William Donnelly, H. Queener, T. Hebert, and M. Campbell, “Adaptive optics scanning laser ophthalmoscopy,” Opt. Express **10**, 405–412 (2002).
5. L. Blanc-Féraud, L. Mugnier, and A. Jalobeanu, “Blind image deconvolution,” in *Inverse Problems in Vision and 3D Tomography*, A. Mohammad-Djafari, ed. (ISTE / John Wiley, London, 2010), chap. 3, pp. 97–121.
6. G. R. Ayers and J. C. Dainty, “Iterative blind deconvolution and its applications,” Opt. Lett. **13**, 547–549 (1988).
7. L. M. Mugnier, T. Fusco, and J.-M. Conan, “MISTRAL: a myopic edge-preserving image restoration method, with application to astronomical adaptive-optics-corrected long-exposure images,” J. Opt. Soc. Am. A **21**, 1841–1854 (2004).
8. R. J. A. Little and D. B. Rubin, “On jointly estimating parameters and missing data by maximizing the complete-data likelihood,” The American Statistician **37**, 218–220 (1983).
9. J. C. Christou, A. Roorda, and D. R. Williams, “Deconvolution of adaptive optics retinal images,” J. Opt. Soc. Am. A **21**, 1393–1401 (2004).
10. G. Harikumar and Y. Bresler, “Perfect blind restoration of images blurred by multiple filters: theory and efficient algorithms,” IEEE Trans. Image Process. **8**, 202–219 (1999).
11. J. Idier, L. Mugnier, and A. Blanc, “Statistical behavior of joint least square estimation in the phase diversity context,” IEEE Trans. Image Process. **14**, 2107–2116 (2005).
12. E. Lehmann, *Theory of Point Estimation* (John Wiley, New York, NY, 1983).
13. A. Blanc, L. M. Mugnier, and J. Idier, “Marginal estimation of aberrations and image restoration by use of phase diversity,” J. Opt. Soc. Am. A **20**, 1035–1045 (2003).
14. Y. Goussard, G. Demoment, and J. Idier, “A new algorithm for iterative deconvolution of sparse spike,” in Proc. ICASSP (1990), pp. 1547–1550.
15. J.-M. Conan, L. M. Mugnier, T. Fusco, V. Michau, and G. Rousset, “Myopic deconvolution of adaptive optics images by use of object and point-spread function power spectra,” Appl. Opt. **37**, 4614–4622 (1998).
16. É. Thiébaut, “Optimization issues in blind deconvolution algorithms,” in *Astronomical Data Analysis II*, J.-L. Starck and F. D. Murtagh, eds., Proc. SPIE **4847**, 174–183 (2002).
