Multiplexed coded illumination for Fourier Ptychography with an LED array microscope

Lei Tian, Xiao Li, Kannan Ramchandran, and Laura Waller


Biomedical Optics Express, Vol. 5, Issue 7, pp. 2376-2389 (2014)
http://dx.doi.org/10.1364/BOE.5.002376



Abstract

Fourier Ptychography is a new computational microscopy technique that achieves gigapixel images with both wide field of view and high resolution in both phase and amplitude. The hardware setup involves a simple replacement of the microscope’s illumination unit with a programmable LED array, allowing one to flexibly pattern illumination angles without any moving parts. In previous work, a series of low-resolution images was taken by sequentially turning on each single LED in the array, and the data were then combined to recover a bandwidth much higher than the one allowed by the original imaging system. Here, we demonstrate a multiplexed illumination strategy in which multiple randomly selected LEDs are turned on for each image. Since each LED corresponds to a different area of Fourier space, the total number of images can be significantly reduced, without sacrificing image quality. We demonstrate this method experimentally in a modified commercial microscope. Compared to sequential scanning, our multiplexed strategy achieves similar results with approximately an order of magnitude reduction in both acquisition time and data capture requirements.

© 2014 Optical Society of America

1. Introduction

The LED array microscope is a powerful new platform for computational microscopy in which a wide range of capabilities are enabled by a single hardware modification to a traditional brightfield microscope - the replacement of the source with a programmable LED array [1, 2] (Fig. 1). This simple, inexpensive hardware modification allows patterning of the illumination at the Fourier plane of the sample (assuming Köhler geometry). Thus, each LED in the array corresponds to illumination of the sample by a unique angle. Conveniently, the range of illumination angles that can be patterned is much larger than the range of angles that pass through the objective [set by its numerical aperture (NA)]. This means that illumination by the central LEDs produces brightfield images, whereas illumination by the outer LEDs (outside the NA of the objective) produces dark field images [1]. Alternatively, by sequentially taking a pair of images with either half of the source on, we obtain phase derivative measurements by differential phase contrast (DPC) [3–5]. Finally, a full sequential scan of the 2D array of LEDs (angles), while taking 2D images at each angle, captures a 4D dataset similar to a light field [6] or phase space measurement [7]. This enables all the computational processing of light field imaging. For example, angular information can be traded for depth by using digital refocusing algorithms to get 3D intensity [1] or 3D phase contrast [3]. When the sample is thin, angular information can instead be used to improve resolution by computationally recovering a larger synthetic NA, limited only by the largest illumination angle of the LED array [2]. This method, named Fourier Ptychography, enables one to use a low NA objective, having a very large field of view (FoV), but still obtain high resolution across the entire image, resulting in gigapixel images. The aberrations in the imaging system can also be estimated without separate characterization [8]. All of these imaging modalities are achieved in the same optical setup, with no moving parts, simply by choosing the appropriate LEDs to turn on.

Fig. 1 Summary of Fourier Ptychography (FP) in an LED array microscope. (a) A sample is illuminated from different angles by turning on different LEDs of an array. (b) Our experimental setup on a Nikon TE300 microscope. (c) Images taken with different LEDs contain information from different spatial frequency areas of the sample. The central (brightfield) LED fills an area defined by the NA (0.1) of the objective. Images taken with top and left (dark field) LEDs result in accentuated edges along the corresponding orientations. (d) FP reconstructs a high resolution image from many LEDs, while simultaneously estimating aberrations.

The main limitations of Fourier Ptychography are the large amount of data captured and the long acquisition times required. An image must be collected while scanning through each of the LEDs in the array, leading to hundreds of images in each dataset. This is compounded by the fact that each LED has limited intensity, requiring long exposures. The multiplexed illumination scheme that we propose here is capable of reducing both acquisition time and the number of images required by orders of magnitude.

We demonstrate two different multiplexing schemes in which multiple LEDs from the array are turned on for each captured image. First, we describe the case where we take the same number of images as in the sequential scan, but reduce the exposure time for each, since turning on more LEDs provides more light throughput. As long as the random patterns are linearly independent, the resulting images can be interpreted as a linear combination of images from each of the LEDs, implying that the data contains the same information as in the sequential scan. The more interesting multiplexing situation involves reducing the total number of images. In this second scheme, we show that a random coding strategy is capable of significantly reducing the data requirements, since each image now contains information from multiple areas of the sample’s Fourier space. To solve the inverse problem, we develop a modified Fourier Ptychography algorithm that applies to both multiplexing situations.

2. Theory and method

2.1. Fourier Ptychography

In microscopy, one must generally choose between large FoV and high resolution. Where both are needed (e.g. in digital pathology [9]), lateral mechanical scanning of the sample is the most common solution. Fourier Ptychography (FP) [2] is a new computational illumination technique that achieves the same space-bandwidth product, but by scanning of the source in Fourier space with a programmable LED array. The setup involves no moving parts, so can be made fast. FP involves an embedded phase retrieval algorithm and so also produces high resolution, large FoV quantitative phase images [10]. In addition, FP uses a low NA objective, which is less expensive and provides a longer working distance and larger depth of field than high NA objectives.

The process of sequential FP is summarized in Fig. 1. For each image captured, the sample is illuminated from a unique angle by turning on a single LED. In Fourier space, this corresponds to a shift proportional to the angle of illumination. Thus, for each LED, a different area of Fourier space passes through the pupil. An example is given in Fig. 1(c), showing the Fourier space areas covered by three LEDs. The goal of the FP algorithm is to stitch together the images from different areas of Fourier space in order to achieve a large effective NA. The final NA is the sum of the objective NA and the illumination NA. The caveat of using Fourier space stitching is that phase information is required. Typically, this is done with coherent methods by measuring phase for each angle via synthetic aperture methods [11, 12]. In FP, however, phase is inferred by a phase retrieval optimization [13] based on angular diversity, analogous to Ptychography based on spatial translation [14–17]. For such algorithms to converge, significant overlap (> 60%) is required between the Fourier space areas of neighboring LEDs [2, 18] (see Section 2.4).
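For concreteness, using the experimental parameters reported later in Section 3 (a 0.1 NA objective with LEDs providing illumination angles up to an illumination NA of roughly 0.5), this sum gives

$$ \mathrm{NA}_{\mathrm{final}} = \mathrm{NA}_{\mathrm{obj}} + \mathrm{NA}_{\mathrm{illum}} \approx 0.1 + 0.5 = 0.6. $$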

Both mechanical scanning and Fourier scanning result in huge datasets, on the order of gigapixels, presenting data storage and manipulation challenges. In FP, the overlap requirement means that sequential Fourier scanning requires much more data than mechanical scanning. However, by using the multiplexing methods described here, data capture requirements become comparable to or even lower than those of mechanical scanning. Analogous multiplexing in the spatial domain for traditional Ptychography would be very difficult to achieve.

2.2. Coding strategies for multiplexed Fourier Ptychography

There are many choices of coding strategies for multiplexed measurements. In computational photography, multiplexed illumination has been evaluated for reflectance images [19, 20]; however, our system differs in that FP involves a phase retrieval algorithm which is nonlinear. This makes it difficult to analytically derive the optimal coding strategy.

By turning on M LEDs for each image, we can linearly reduce the exposure time by a factor of M while maintaining the same photon budget. Since each image covers M times more area of Fourier space, we may also be able to reduce the total number of images by a factor of M. Thus, the possible reduction in total acquisition time is M², as long as the algorithm does not break. Clearly, we cannot go to the extreme case of turning on all the LEDs at once and capturing only one image, since there will be no diversity for providing phase contrast. Intuitively, we would like each pattern to turn on LEDs which are far away from each other, such that they represent distinct (non-overlapping) areas of Fourier space. However, we still need overlap across all the images captured, so later coded images should cover the overlapping areas.

Here, we choose a general random coding scheme in which the number of LEDs on is fixed, but their location varies randomly for each captured image, subject to some simple rules. We need to use each LED at least once in each dataset, even when we reduce the number of images. Thus, the first image in the set chooses which M LEDs to turn on randomly (with uniform probability), but later images will exclude those already used LEDs in choosing which to turn on. This process generates a set of NLED/M patterns which fully cover all LEDs. To generate the additional images needed for matching the number of images to the sequential scan, we repeat this process M times. This scheme achieves good mixing of information and balanced coverage of Fourier space for each image. As an example, illumination patterns designed according to our random codes and their resulting images are shown in Fig. 2.
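To make the coding scheme concrete, the following is a minimal sketch (our own illustration, not the authors' code; the function name, the use of NumPy, and the handling of remainders are assumptions) of how such a set of random patterns could be generated as rows of the binary coding matrix A defined in Section 2.3. When NLED is not an exact multiple of M, a few groups contain one fewer LED.

```python
import numpy as np

def random_led_patterns(n_led, m, n_passes=1, seed=0):
    """Sketch of the random coding scheme: each pass uses every LED exactly once,
    splitting a random permutation of the LED indices into groups of (roughly) M.
    n_passes=1 gives the reduced ~N_LED/M measurement set; n_passes=M matches the
    number of images in a full sequential scan. Names here are illustrative only."""
    rng = np.random.default_rng(seed)
    patterns = []
    n_groups = int(np.ceil(n_led / m))
    for _ in range(n_passes):
        order = rng.permutation(n_led)          # later groups exclude already-used LEDs
        for group in np.array_split(order, n_groups):
            row = np.zeros(n_led, dtype=bool)
            row[group] = True                   # turn on the ~M LEDs in this group
            patterns.append(row)
    return np.asarray(patterns)                 # rows form the binary coding matrix A

# Example: 293 LEDs with 4 on per image, reduced acquisition -> 74 patterns
A_reduced = random_led_patterns(293, 4)
```

A single pass gives the reduced set of roughly NLED/M patterns; repeating the pass M times reproduces the number of images used in the sequential scan.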

Fig. 2 Sample datasets for multiplexed illumination coding in Fourier Ptychography. (Top) Four randomly chosen LEDs are turned on for each measurement. (Middle) The captured images corresponding to each LED pattern. (Bottom) Fourier coverage of the sample’s Fourier space for each of the LED patterns (drawn to scale). Turning on multiple well-separated LEDs allows information from multiple areas of Fourier space to pass through the system simultaneously. The center unshaded circle represents the NA of the objective lens.

2.3. Forward problem for multiplexed Fourier Ptychography

Consider a thin sample described by the complex transmission function o(r), where r = (x, y) denotes the lateral coordinates at the sample plane. Each LED generates illumination at the sample that is treated as a (spatially coherent) local plane wave with a unique spatial frequency k_m = (k_xm, k_ym) for LEDs m = 1,···, NLED, where NLED is the total number of LEDs in the array. The exit wave from the sample is the multiplication of the two: u(r) = o(r)exp(ik_m · r). Thus, the sample’s spectrum O(k) is shifted to be centered around k_m = (sinθ_xm, sinθ_ym)/λ, where (θ_xm, θ_ym) define the illumination angle for the mth LED and λ is the wavelength.
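As an illustration, given the planar array geometry reported in Section 3 (4 mm LED pitch, 67.5 mm from the sample, 629 nm central wavelength), each k_m follows from simple trigonometry. This is a sketch with assumed variable names, an array centered on the optical axis, and a particular choice of units (cycles per mm); array tilt or lateral offset is ignored.

```python
import numpy as np

def led_spatial_frequencies(nx, ny, pitch_mm=4.0, height_mm=67.5, wavelength_um=0.629):
    """Spatial frequencies k_m = (sin(theta_x), sin(theta_y)) / lambda for a planar
    LED array centered on the optical axis (illustrative sketch only)."""
    ix = (np.arange(nx) - (nx - 1) / 2) * pitch_mm   # lateral LED positions in mm
    iy = (np.arange(ny) - (ny - 1) / 2) * pitch_mm
    X, Y = np.meshgrid(ix, iy)
    R = np.sqrt(X**2 + Y**2 + height_mm**2)          # distance from each LED to the sample
    sin_tx, sin_ty = X / R, Y / R                    # direction cosines of each LED
    wavelength_mm = wavelength_um * 1e-3
    return sin_tx / wavelength_mm, sin_ty / wavelength_mm   # cycles per mm
```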

At the pupil plane, the field corresponding to the Fourier transform of the exit wave, O(k − k_m), is low-pass filtered by the pupil function P(k). Therefore, the intensity at the image plane resulting from a single LED illumination (neglecting magnification and noise) is

$$ i_m(\mathbf{r}) = \left| \mathcal{F}\!\left[ O(\mathbf{k}-\mathbf{k}_m)\, P(\mathbf{k}) \right](\mathbf{r}) \right|^{2}, \tag{1} $$

where ℱ[(·)](r) denotes the 2D Fourier transform.

For multiplexed images, the sample is illuminated by different sets of LEDs according to a coding scheme. The pth image turns on the LEDs whose indices are in the set S_p, chosen from {1,···, NLED}. When multiple LEDs are on at once, the illumination must be considered partially coherent, with each LED being mutually incoherent with all others, representing a single coherent mode. The total intensity of the pth multiplexed image I_p(r) is the sum of intensities from each:

$$ I_p(\mathbf{r}) = \sum_{m \in S_p} i_m(\mathbf{r}) = \sum_{m \in S_p} \left| \mathcal{F}\!\left[ O(\mathbf{k}-\mathbf{k}_m)\, P(\mathbf{k}) \right](\mathbf{r}) \right|^{2}, \tag{2} $$

where the symbol ∈ denotes that m is an element of the set S_p.

Assuming that the entire multiplexed FP acquisition captures a total of Nimg intensity images, the multiplexing scheme can be described by an Nimg × NLED binary coding matrix A = [A_{p,m}], whose elements are defined by

$$ A_{p,m} = \begin{cases} 1, & m \in S_p \\ 0, & \text{otherwise} \end{cases} \qquad m = 1,\ldots,N_{\mathrm{LED}}, \quad p = 1,\ldots,N_{\mathrm{img}}. \tag{3} $$

Any coding matrix should satisfy the following general requirements: (1) every column of A should contain at least one non-zero element, meaning that each LED has to be turned on at least once; (2) every row of A should contain at least one non-zero element, excluding the trivial case in which no LEDs are turned on for a measurement; (3) all the row vectors should be linearly independent of each other, so that every new multiplexed image is non-redundant.
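As a sanity check (our own sketch, not part of the published method), these three requirements can be verified numerically for any candidate coding matrix:

```python
import numpy as np

def satisfies_coding_requirements(A):
    """Check the three conditions on a binary coding matrix A (shape N_img x N_LED):
    every LED used at least once, no all-zero measurement, and linearly
    independent rows. Illustrative sketch only."""
    A = np.asarray(A, dtype=float)
    cols_ok = A.any(axis=0).all()                        # (1) each LED turned on at least once
    rows_ok = A.any(axis=1).all()                        # (2) no empty measurement
    rank_ok = np.linalg.matrix_rank(A) == A.shape[0]     # (3) rows linearly independent
    return bool(cols_ok and rows_ok and rank_ok)
```

A single pass of the random pattern generator in Section 2.2 produces rows with disjoint supports, so all three conditions hold automatically; note that condition (3) also implies Nimg ≤ NLED.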

2.4. Inverse problem formulation

For the multiplexed illumination case, we develop a new algorithm to handle multi-LED illumination. Our algorithm is similar in concept to that of [8], which jointly recovers the sample’s Fourier space O(k) and the unknown pupil function P(k). We split the FoV into patches whose area is on the order of the spatial coherence area of a single LED illumination and incorporate our multi-LED forward model in the optimization procedure. To improve robustness, we also add new procedures for background estimation and regularization for tuning noise performance.

The least squares formulation for reconstruction is a non-convex optimization problem. We minimize the square of the difference between the actual and estimated measurements, based on the forward model [Eq. (2)], with an additional term describing the background offset b_p,

$$ \min_{O(\mathbf{k}),\, P(\mathbf{k}),\, \{b_p\}_{p=1}^{N_{\mathrm{img}}}} \; \sum_{p=1}^{N_{\mathrm{img}}} \sum_{\mathbf{r}} \left| I_p(\mathbf{r}) - \left( \sum_{m \in S_p} \left| \mathcal{F}\!\left[ O(\mathbf{k}-\mathbf{k}_m) P(\mathbf{k}) \right](\mathbf{r}) \right|^{2} + b_p \right) \right|^{2}. \tag{4} $$

Since there are multiple variables involved in the optimization, we take a divide-and-conquer approach which optimizes for each variable sequentially. First, we estimate the background in a single step for each image p and subtract it to produce the corrected intensity image

$$ \hat{I}_p(\mathbf{r}) = I_p(\mathbf{r}) - \hat{b}_p. \tag{5} $$

Next, we start an iterative process which estimates both the object and the pupil functions simultaneously. We initialize O(k) to be the Fourier transform of the square root of any of the images which contain a brightfield LED, and initialize P(k) to be a binary circle whose radius is determined by the NA. We introduce the auxiliary function Ψ_m(k) defined as

$$ \Psi_m(\mathbf{k}) = O(\mathbf{k}-\mathbf{k}_m)\, P(\mathbf{k}), \tag{6} $$

which is the field immediately after the pupil from the mth-LED illumination. Then, using the corrected intensity image Î_p(r), we iteratively update Ψ_m(k) for all m, along with the estimates of O(k) and P(k). The algorithm is derived in Appendix A and summarized below and in Fig. 3. The basic structure of the reconstruction algorithm is to update the auxiliary function incrementally from image p = 1 to Nimg in each iteration, and then to repeat the same process iteratively until the value of the merit function [the quantity being minimized in Eq. (4)] falls below a certain tolerance. Each incremental update consists of the following two steps (a minimal numerical sketch is given after Fig. 3).

  1. Update the auxiliary function Φ_m^{(i)}(k) using the pth intensity image:

    Let the estimate of the mth auxiliary function in the ith iteration be Ψ_m^{(i)}(k) = O^{(i)}(k − k_m)P^{(i)}(k). First, compute the real space representation of the mth auxiliary function

    $$ \psi_m^{(i)}(\mathbf{r}) = \mathcal{F}\!\left[ \Psi_m^{(i)}(\mathbf{k}) \right](\mathbf{r}), \qquad m \in S_p. \tag{7} $$

    Then, perform a projection procedure similar to the Gerchberg-Saxton-Fienup type of update by rescaling the real space auxiliary function by an optimal intensity factor and returning back to the Fourier space representation [13, 21–23]:

    $$ \Phi_m^{(i)}(\mathbf{k}) = \mathcal{F}^{-1}\!\left[ \phi_m^{(i)}(\mathbf{r}) \right](\mathbf{k}), \quad \text{with} \quad \phi_m^{(i)}(\mathbf{r}) = \sqrt{\frac{\hat{I}_p(\mathbf{r})}{\sum_{m \in S_p} \left| \psi_m^{(i)}(\mathbf{r}) \right|^{2}}}\; \psi_m^{(i)}(\mathbf{r}), \qquad m \in S_p. \tag{8} $$

  2. Update the sample spectrum O^{(i+1)}(k) and the pupil function P^{(i+1)}(k):

    $$ O^{(i+1)}(\mathbf{k}) = O^{(i)}(\mathbf{k}) + \frac{\sum_{m \in S_p} \left| P^{(i)}(\mathbf{k}+\mathbf{k}_m) \right| \left[ P^{(i)}(\mathbf{k}+\mathbf{k}_m) \right]^{*} \left[ \Phi_m^{(i)}(\mathbf{k}+\mathbf{k}_m) - O^{(i)}(\mathbf{k})\, P^{(i)}(\mathbf{k}+\mathbf{k}_m) \right]}{\left| P^{(i)}(\mathbf{k}) \right|_{\max} \left( \sum_{m \in S_p} \left| P^{(i)}(\mathbf{k}+\mathbf{k}_m) \right|^{2} + \delta_1 \right)} \tag{9} $$

    $$ P^{(i+1)}(\mathbf{k}) = P^{(i)}(\mathbf{k}) + \frac{\sum_{m \in S_p} \left| O^{(i)}(\mathbf{k}-\mathbf{k}_m) \right| \left[ O^{(i)}(\mathbf{k}-\mathbf{k}_m) \right]^{*} \left[ \Phi_m^{(i)}(\mathbf{k}) - O^{(i)}(\mathbf{k}-\mathbf{k}_m)\, P^{(i)}(\mathbf{k}) \right]}{\left| O^{(i)}(\mathbf{k}) \right|_{\max} \left( \sum_{m \in S_p} \left| O^{(i)}(\mathbf{k}-\mathbf{k}_m) \right|^{2} + \delta_2 \right)}, \tag{10} $$

    where δ_1 and δ_2 are regularization constants to ensure numerical stability, which is equivalent to an ℓ2-norm/Tikhonov regularization on O(k) and P(k). The particular choice of updating step size, determined by the ratio between |P^{(i)}(k)| and its maximum (respectively |O^{(i)}(k)| and its maximum), is shown to be robust [14].
Fig. 3 Flow chart of the reconstruction algorithm.
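The following is a minimal numerical sketch of the two-step update above, written in NumPy. It is our own illustration, not the authors' implementation: grid sizes, variable names, integer-pixel shifts k_m, the FFT conventions, and the per-window form of the updates (a reasonable approximation of Eqs. (9)-(10) when the Fourier windows of the LEDs within one pattern barely overlap, as the random coding encourages) are all assumptions.

```python
import numpy as np

def fp_multiplexed_reconstruct(images, led_sets, shifts, pupil0, n_hi,
                               n_iter=10, delta1=1.0, delta2=1000.0):
    """Sketch of the multiplexed FP reconstruction loop (Eqs. (6)-(10)), illustrative only.

    images:   background-corrected low-res intensity images, each (n_lo, n_lo)
    led_sets: list of LED index sets S_p, one per image
    shifts:   per-LED Fourier-plane shifts (row, col) in high-res pixels (integers here)
    pupil0:   initial pupil P(k), e.g. a binary NA circle, shape (n_lo, n_lo)
    n_hi:     size of the high-resolution spectrum grid
    """
    # FFT pair chosen for internal consistency (centered transforms)
    F  = lambda x: np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(x)))
    iF = lambda x: np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(x)))

    n_lo = pupil0.shape[0]
    P = pupil0.astype(complex)
    c_hi, c_lo = n_hi // 2, n_lo // 2

    # Initialize O(k) from the sqrt of a brightfield image, embedded at the center
    # (assumes the first pattern contains a brightfield LED)
    O = np.zeros((n_hi, n_hi), dtype=complex)
    O[c_hi - c_lo:c_hi - c_lo + n_lo, c_hi - c_lo:c_hi - c_lo + n_lo] = F(np.sqrt(images[0]))

    def window(m):
        # Low-res window of the spectrum associated with the m-th LED, i.e. O(k - k_m)
        r0 = c_hi + int(shifts[m][0]) - c_lo
        c0 = c_hi + int(shifts[m][1]) - c_lo
        return (slice(r0, r0 + n_lo), slice(c0, c0 + n_lo))

    for _ in range(n_iter):
        for I_p, S_p in zip(images, led_sets):
            wins = [window(m) for m in S_p]
            O_m  = [O[w].copy() for w in wins]            # O^(i)(k - k_m) patches
            psi  = [iF(Om * P) for Om in O_m]             # Eqs. (6)-(7): image-plane fields
            total = sum(np.abs(ps) ** 2 for ps in psi) + 1e-16
            scale = np.sqrt(I_p / total)                  # Eq. (8): intensity projection
            Phi  = [F(scale * ps) for ps in psi]
            dP = np.zeros_like(P)
            for w, Om, Ph in zip(wins, O_m, Phi):         # Eqs. (9)-(10), per window
                resid = Ph - Om * P
                O[w] += np.abs(P) * np.conj(P) * resid / (
                        (np.abs(P).max() + 1e-16) * (np.abs(P) ** 2 + delta1))
                dP   += np.abs(Om) * np.conj(Om) * resid / (
                        (np.abs(Om).max() + 1e-16) * (np.abs(Om) ** 2 + delta2))
            P = P + dP
    return iF(O), P    # high-resolution complex object estimate and recovered pupil
```

A full field-of-view reconstruction would run this routine patch by patch, as described at the start of Section 2.4, passing in the background-corrected images Î_p from Eq. (5).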

3. Experimental results

The experimental setup is shown in Fig. 1. All samples are imaged with a 4× 0.1 NA objective and a scientific CMOS camera (PCO.edge). A programmable 32×32 LED array (Adafruit, 4 mm spacing, controlled by an Arduino) is placed 67.5 mm above the sample to replace the light source on a Nikon TE300 inverted microscope. The central 293 red LEDs (629 nm central wavelength, 20 nm bandwidth) are used for all the experiments reported here, resulting in a final synthetic NA of 0.6. In principle, our LED array could provide larger NA improvements, but it is practically limited by noise in the dark field images from high angle LEDs. To bypass these limitations, a better geometry would be a dome of LEDs, all pointed at the sample and covering the full hemisphere. Such an illumination unit was built in [24], but was not programmable and so not directly amenable to FP.

We first image a resolution target, whose low resolution image (from the central LED) is shown in Fig. 4(a), with a zoom-in in Fig. 4(b) showing that the smallest group of features (corresponding to 0.5 NA) is not resolvable. In Fig. 4(c–e), we show our recovered high-resolution images under three different coding strategies. The first, sequential scanning of a single LED across the full array, was taken with a 2s exposure time, resulting in a total acquisition time of T=586s. The coding matrix for sequential scanning is written as the NLED × NLED identity matrix. Next, we use random multiplexed illumination with 4 LEDs on for each image (i.e. M = 4) and a shorter exposure time (1s). Sample illumination patterns and images are shown in Fig. 2. The reconstruction result with the same number of images as the sequential scan, Nimg = 293, is shown in Fig. 4(d). As expected, the same resolution enhancement is achieved with half the acquisition time. Finally, we test our partial measurement scheme in which the total number of images is reduced by a factor of 4. This cuts the number of images used in the reconstruction to 74 and the total time to 74s, without sacrificing the quality of the result [see Fig. 4(e)].

Fig. 4 Experimental results for multiplexed illumination of a resolution target. (a) The original low resolution image from a 4 × 0.1 NA objective taken with only the central LED on. (b) A zoom-in on the smallest features. (c) Reconstruction result from sequential FP with Nimg = 293 single LED images having a total acquisition time of T=586s. (d) Multiplexing 4 LEDs for each image while preserving the number of measurements Nimg = 293 reduces the total acquisition time to T=293s. (e) The multiplexed illumination also allows reduction of the number of measurements to Nimg = 74 with T=74s.

The same multiplexing scheme was also tested on a stained biological sample (Fig. 5). During post-processing, the final image was computed in 200×200 pixel patches. The background is estimated for each dark field image by taking the average intensity from a uniform region of the sample. FP reconstructions are compared with the different illumination schemes in Fig. 5(b) and 5(c). The sequential scan is the same as previously used, but the multiplexed measurement now uses 8 LEDs (i.e. M = 8). The reduced measurement is demonstrated in the second case with only 40 images used, corresponding to approximately 1/8 of the data size in a full sequential scan, and reducing the total acquisition time by a factor of 14.7.

Fig. 5 Experimental results for multiplexed illumination of a stained dog stomach cardiac region sample. (a) The original low resolution image from a 4× 0.1 NA objective taken with only the central LED on. (b1, c1) Zoom-in of the regions denoted by red and green squares. (b2, c2) Amplitude and (b5, c5) phase reconstructions from sequential FP with Nimg = 293 single LED images and a total acquisition time of T=586s. (b3, c3) Amplitude and (b6, c6) phase reconstructions from multiplexing 8 LEDs with Nimg = 293 and T=293s. (b4, c4) Amplitude and (b7, c7) phase reconstructions for multiplexing 8 LEDs with Nimg = 40 and T=40s.

4. Discussion

To summarize, by exploiting illumination multiplexing, we experimentally demonstrated that both the acquisition time and data size requirements in Fourier Ptychography are significantly reduced. We achieved ∼ 2mm field of view with 0.5μm resolution, with data acquisition times reduced from ∼10 minutes (for sequential scanning) to less than 1 minute. By making the LEDs in the array brighter, we expect to be able to achieve sub-second data acquisition with our multiplexed scheme. Our data capture was reduced from 293 images to only 40 images.

Appendix A: derivations of the reconstruction algorithm

Since the original optimization in Eq. (4) involves all Nimg images, each containing a massive number of pixels, an efficient way is to use the incremental method from optimization theory, which updates Ψ_m(k) “incrementally” for m ∈ S_p in a sequential manner for each p = 1 to Nimg.

Acknowledgments

References and links

  1. G. Zheng, C. Kolner, and C. Yang, “Microscopy refocusing and dark-field imaging by using a simple LED array,” Opt. Lett. 36, 3987–3989 (2011).
  2. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier Ptychographic microscopy,” Nat. Photonics 7, 739–745 (2013).
  3. L. Tian, J. Wang, and L. Waller, “3D differential phase-contrast microscopy with computational illumination using an LED array,” Opt. Lett. 39, 1326–1329 (2014).
  4. D. Hamilton and C. Sheppard, “Differential phase contrast in scanning optical microscopy,” J. Microscopy 133, 27–39 (1984).
  5. T. N. Ford, K. K. Chu, and J. Mertz, “Phase-gradient microscopy in thick tissue with oblique back-illumination,” Nature Methods 9, 1195–1197 (2012).
  6. M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microscopy 235, 144–162 (2009).
  7. L. Waller, G. Situ, and J. Fleischer, “Phase-space measurement and coherence synthesis of optical beams,” Nat. Photonics 6, 474–479 (2012).
  8. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22, 4960–4972 (2014).
  9. L. Pantanowitz, “Digital images and the future of digital pathology,” J. Pathology Informatics 1, 15 (2010).
  10. X. Ou, R. Horstmeyer, C. Yang, and G. Zheng, “Quantitative phase imaging via Fourier ptychographic microscopy,” Opt. Lett. 38, 4845–4848 (2013).
  11. S. A. Alexandrov, T. R. Hillman, T. Gutzler, and D. D. Sampson, “Synthetic aperture Fourier holographic optical microscopy,” Phys. Rev. Lett. 97, 168102 (2006).
  12. T. R. Hillman, T. Gutzler, S. A. Alexandrov, and D. D. Sampson, “High-resolution, wide-field object reconstruction with synthetic aperture Fourier holographic optical microscopy,” Opt. Express 17, 7873–7892 (2009).
  13. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21, 2758–2769 (1982).
  14. J. M. Rodenburg and H. M. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85, 4795–4797 (2004).
  15. M. Guizar-Sicairos and J. R. Fienup, “Phase retrieval with transverse translation diversity: a nonlinear optimization approach,” Opt. Express 16, 7264–7278 (2008).
  16. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109, 1256–1262 (2009).
  17. P. Thibault, M. Dierolf, O. Bunk, A. Menzel, and F. Pfeiffer, “Probe retrieval in ptychographic coherent diffractive imaging,” Ultramicroscopy 109, 338–343 (2009).
  18. O. Bunk, M. Dierolf, S. Kynde, I. Johnson, O. Marti, and F. Pfeiffer, “Influence of the overlap parameter on the convergence of the ptychographical iterative engine,” Ultramicroscopy 108, 481–487 (2008).
  19. Y. Y. Schechner, S. K. Nayar, and P. N. Belhumeur, “Multiplexing for optimal lighting,” IEEE Trans. Pattern Analysis Machine Intelligence 29, 1339–1354 (2007).
  20. N. Ratner and Y. Y. Schechner, “Illumination multiplexing within fundamental limits,” in Computer Vision and Pattern Recognition (IEEE, 2007), pp. 1–8.
  21. R. Gerchberg and W. Saxton, “Phase determination for image and diffraction plane pictures in the electron microscope,” Optik 34, 275–284 (1971).
  22. C. Rydberg and J. Bengtsson, “Numerical algorithm for the retrieval of spatial coherence properties of partially coherent beams from transverse intensity measurements,” Opt. Express 15, 13613–13623 (2007).
  23. P. Thibault and A. Menzel, “Reconstructing state mixtures from diffraction measurements,” Nature 494, 68–71 (2013).
  24. D. Dominguez, L. Molina, D. B. Desai, T. O’Loughlin, A. A. Bernussi, and L. G. de Peralta, “Hemispherical digital optical condensers with no lenses, mirrors, or moving parts,” Opt. Express 22, 6948–6957 (2014).
  25. L. Tian, J. Lee, S. B. Oh, and G. Barbastathis, “Experimental compressive phase space tomography,” Opt. Express 20, 8296–8308 (2012).
  26. Y. Shechtman, A. Beck, and Y. Eldar, “GESPAR: Efficient phase retrieval of sparse signals,” IEEE Trans. Sig. Processing 62, 928–938 (2014).

OCIS Codes
(100.5070) Image processing : Phase retrieval
(170.0180) Medical optics and biotechnology : Microscopy
(170.1630) Medical optics and biotechnology : Coded aperture imaging
(110.1758) Imaging systems : Computational imaging
(110.3010) Imaging systems : Image reconstruction techniques

ToC Category:
Microscopy

History
Original Manuscript: May 9, 2014
Revised Manuscript: June 12, 2014
Manuscript Accepted: June 13, 2014
Published: June 19, 2014

Citation
Lei Tian, Xiao Li, Kannan Ramchandran, and Laura Waller, "Multiplexed coded illumination for Fourier Ptychography with an LED array microscope," Biomed. Opt. Express 5, 2376-2389 (2014)
http://www.opticsinfobase.org/boe/abstract.cfm?URI=boe-5-7-2376


