Optics Express

Editor: Andrew M. Weiner
Vol. 22, Iss. 11 — Jun. 2, 2014, pp. 13586–13599
Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging

Siyuan Dong, Roarke Horstmeyer, Radhika Shiradkar, Kaikai Guo, Xiaoze Ou, Zichao Bian, Huolin Xin, and Guoan Zheng


Optics Express, Vol. 22, Issue 11, pp. 13586-13599 (2014)
http://dx.doi.org/10.1364/OE.22.013586




Abstract

We report an imaging scheme, termed aperture-scanning Fourier ptychography, for 3D refocusing and super-resolution macroscopic imaging. The reported scheme scans an aperture at the Fourier plane of an optical system and acquires the corresponding intensity images of the object. The acquired images are then synthesized in the frequency domain to recover a high-resolution complex sample wavefront; no phase information is needed in the recovery process. We demonstrate two applications of the reported scheme. In the first example, we use an aperture-scanning Fourier ptychography platform to recover the complex hologram of extended objects. The recovered hologram is then digitally propagated into different planes along the optical axis to examine the 3D structure of the object. We also demonstrate a reconstruction resolution better than the detector pixel limit (i.e., pixel super-resolution). In the second example, we develop a camera-scanning Fourier ptychography platform for super-resolution macroscopic imaging. By simply scanning the camera over different positions, we bypass the diffraction limit of the photographic lens and recover a super-resolution image of an object placed at the far field. This platform’s maximum achievable resolution is ultimately determined by the camera’s traveling range, not the aperture size of the lens. The FP scheme reported in this work may find applications in 3D object tracking, synthetic aperture imaging, remote sensing, and optical/electron/X-ray microscopy.

© 2014 Optical Society of America

1. Introduction

Fourier ptychography (FP) is a phase retrieval technique that uses the concept of angular diversity to recover high-resolution complex sample images [1–3]. Similar to other phase retrieval techniques [4–14], the recovery process of FP consists of alternating enforcement of the known sample information in the spatial domain and a fixed constraint in the Fourier domain. In particular, the recovery process of FP shares its roots with ptychography [15–27], a lensless imaging approach that applies translational diversity (i.e., shifting the sample laterally) to recover its complex image. FP, instead, imposes the panning spectrum constraint in the Fourier domain to simultaneously expand the Fourier passband and recover the complex sample image.

[1] G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013).
[3] S. Dong, R. Shiradkar, P. Nanda, and G. Zheng, "Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging," Biomed. Opt. Express 5(6), 1757–1767 (2014).
[4] R. Gerchberg, "A practical algorithm for the determination of phase from image and diffraction plane pictures," Optik 35, 237 (1972).
[14] B. H. Dean and C. W. Bowers, "Diversity selection for phase-diverse phase retrieval," J. Opt. Soc. Am. A 20(8), 1490–1504 (2003).
[15] H. M. L. Faulkner and J. M. Rodenburg, "Movable aperture lensless transmission microscopy: a novel phase retrieval algorithm," Phys. Rev. Lett. 93(2), 023903 (2004).
[27] J. M. Rodenburg and R. H. T. Bates, "The theory of super-resolution electron microscopy via Wigner-distribution deconvolution," Philos. Trans. R. Soc. A 339(1655), 521–553 (1992).

Current FP platforms are mainly developed for microscopy applications. In these platforms, an LED array is used to provide angle-varied illumination, and the corresponding intensity images of the sample are captured using a low numerical aperture (NA) objective lens. These images are then iteratively stitched together in the Fourier domain to recover a high-resolution, complex sample image. Prior work has demonstrated FP's microscopic imaging capabilities well beyond the cutoff frequency defined by the objective lens [1], its acquisition of quantitative phase [2], and its spectral multiplexing capability [3]. Recent reviews on the FP approach can be found in [28, 29].

[2] X. Ou, R. Horstmeyer, C. Yang, and G. Zheng, "Quantitative phase imaging via Fourier ptychographic microscopy," Opt. Lett. 38(22), 4845–4848 (2013).
[28] G. Zheng, "Fourier ptychographic imaging," IEEE Photonics J. 6, April issue (2014).
[29] G. Zheng, X. Ou, R. Horstmeyer, J. Chung, and C. Yang, "Fourier ptychographic microscopy: a gigapixel superscope for biomedicine," Opt. Photon. News 25, 26–33 (2014).

Despite these successful demonstrations of Fourier ptychography, they still share a major limitation: their imaged samples must be thin [1]. Only under this assumption will the low-resolution images obtained at different incident angles uniquely map to different passbands of the 2D sample spectrum, allowing the FP algorithm to accurately impose the panning spectrum constraint to recover a high-resolution complex sample image. If the sample is not thin, this one-to-one mapping relationship in the Fourier plane is invalid, and the panning spectrum constraint cannot be imposed.

Variable-angle illumination is not the only way to capture shifted versions of a sample's spectrum. If the sample is instead illuminated with a single plane wave, a linearly translating imaging aperture (perpendicular to the optical axis) can achieve a similar effect. Such a setup allows us to circumvent the thin-specimen assumption noted above. In this paper, we demonstrate such a detection-path-based imaging scheme, termed aperture-scanning Fourier ptychography. The reported scheme places a scanning aperture at the Fourier plane of an imaging system and acquires multiple intensity images of a sample. The acquired images are then synthesized in the frequency domain to recover, at high resolution, the optical wavefront exiting the sample. Unlike the illumination-based FP approach, the reported scheme's recovered images each depend upon how the complex wavefront exits the sample, not how it enters it. Therefore, the sample thickness becomes irrelevant during reconstruction. After recovery, the exiting complex wavefront can be back-propagated to any plane along the optical axis for 3D holographic refocusing.

Furthermore, the aperture-scanning scheme extends the FP framework to macroscopic imaging settings, where a photographic lens's aperture naturally serves as a support constraint in the Fourier domain. By simply scanning the camera to different positions perpendicular to the optical axis, we can bypass its aperture-defined diffraction limit. The maximum achievable resolution of an FP-reconstructed image in this case will instead be defined by the camera's traveling range. We note that it is not possible to implement the FP concept in a macroscopic imaging setting using the original illumination-based scheme [1, 2].

In the following, we will first demonstrate the aperture-scanning FP scheme for 3D holographic refocusing. We will show that a resolution better than the detector pixel limit, i.e., pixel super-resolution, can be achieved using our prototype setup. Next, we will implement the reported scheme in a macroscopic imaging setting. We will show that our prototype setup is able to bypass the diffraction limit of a conventional photographic lens and recover a super-resolution image of an object placed in the far field. Finally, we will summarize the results and discuss future directions.

2. 3D refocusing via aperture-scanning Fourier ptychography

The aperture-scanning Fourier ptychographic imaging scheme is shown in Fig. 1(a), where a circular aperture mask is placed at the Fourier plane of a 4f system. Denoting the optical field exiting the sample as s(x, y), the field at the Fourier plane is ŝ(kx, ky), the Fourier transform of s, following the well-known property of 4f setups. We used an x-y motion stage to scan the circular aperture a(kx, ky), which selectively transmits different passbands of the field ŝ(kx, ky) to the image plane, as shown in Fig. 1(b). For each position i of the circular mask, we acquire an intensity image of the sample that takes the form I_i = |F{a_i · ŝ}|², where F{·} denotes the Fourier transform operator. We then synthesize these acquired images in the frequency domain to produce a complex sample image.

Fig. 1. Scheme of aperture-scanning FP. (a) Optical setup. A circular aperture is placed at the Fourier plane of a 4f system. (b) An x-y motion stage is used to scan the circular aperture at the pupil plane. (c) The prototype setup of the reported scheme. (d) Outline of the FP recovery algorithm.
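As a numerical illustration of this forward model, the hypothetical numpy sketch below (not the authors' code; the use of the inverse transform as the image-plane transform is an assumed convention) simulates one acquisition: a shifted binary aperture selects a passband of the sample spectrum, and the detector records only the intensity.

```python
import numpy as np

def capture_image(sample, aperture, shift):
    """Simulate one aperture-scanning acquisition: select a passband of
    the sample spectrum with a shifted binary aperture and record the
    image-plane intensity (the detector discards all phase information)."""
    s_hat = np.fft.fftshift(np.fft.fft2(sample))         # field at the Fourier plane
    a_i = np.roll(aperture, shift, axis=(0, 1))          # aperture at position i
    field = np.fft.ifft2(np.fft.ifftshift(a_i * s_hat))  # field at the image plane
    return np.abs(field) ** 2

# A binary circular aperture in the Fourier plane
n = 64
ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
aperture = (kx**2 + ky**2 <= 8**2).astype(float)
```

With the aperture fully open (all ones), the simulated acquisition reduces to the ordinary intensity image of the sample.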

The recovery process of the aperture-scanning Fourier ptychographic imaging scheme is briefly outlined as follows. Figure 1(d) offers an algorithm sketch, while further details are available in [1]. We note that this basic recovery process assumes the illumination field extending across the sample's lateral extent is coherent. A more advanced FP recovery procedure may incorporate partial coherence modeling [3]. The recovery algorithm starts with a high-resolution spectrum estimate of the sample, Û₀(kx, ky); this initial guess can be random. Next, this sample spectrum estimate is sequentially updated with the low-resolution intensity measurements I_mi (subscript m stands for measurement and i for the ith aperture position). For each update step, we select a small sub-region of Û₀(kx, ky), corresponding to one position of the circular aperture, and apply a Fourier transformation to generate a new low-resolution target image √(I_li)·e^(iφ_li) (subscript l stands for low-resolution and i for the ith aperture position). We then replace the target image's amplitude component √(I_li) with the square root of the measurement, √(I_mi), to form an updated, low-resolution target image √(I_mi)·e^(iφ_li). This image is then used to update its corresponding sub-region of Û₀(kx, ky). This replace-and-update sequence is repeated for all intensity measurements, and we iterate through the above process several times until solution convergence, at which point Û₀(kx, ky) is Fourier transformed back to the spatial domain to produce a high-resolution complex sample image U₀(x, y).

We first tested the reported scheme using a US Air Force (USAF) resolution target. Figure 2(a) shows the raw image captured by our prototype setup. The resolution of this raw image is limited by the size of the circular aperture (0.025 NA, 12.4 µm resolution). From Fig. 2(a), we can resolve group 5, element 3 (line width of 12.4 µm) of the USAF target, in good agreement with the diffraction limit of the optical system. In the acquisition process, we used a 0.85 mm step size for aperture scanning. The corresponding synthetic NA of the recovered image is about 0.1 (3.15 µm resolution), 4 times better than that of the original intensity image. Figure 2(b) shows the recovered image of the reported scheme. In this figure, group 7, element 3 (line width of 3.11 µm) of the USAF target can be clearly resolved, again in good agreement with the theoretical prediction. The processing time for Fig. 2(b) is less than one second using an Intel i7 CPU. For comparison, we also show an image captured by our system with the aperture fully open in Fig. 2(c). In this case, the NA of the optical system is about 0.27 (f-number of 1.8), corresponding to 1.2 µm resolution. According to the sampling theorem, a 0.6 µm pixel size is needed to fully characterize the captured image. However, such a small detector pixel size is currently not commercially available. The pixel size of our camera is 5.5 µm, a typical size for CCD image sensors. Therefore, the resolution of Fig. 2(c) remains 11 µm (twice the pixel size), limited by pixel aliasing.

Fig. 2. Pixel super-resolution demonstration using the aperture-scanning Fourier ptychographic imaging scheme. (a) The raw intensity image captured with the circular mask. The resolution is limited by the size of the circular mask. (b) The aperture-scanning FP reconstruction. The resolution is limited by the synthetic mask, which is 4 times larger than the mask used in (a). (c) The captured image with the circular aperture fully open. The resolution is twice the pixel size, limited by pixel aliasing.
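The resolution figures quoted above are consistent with a coherent two-point criterion of roughly λ/(2·NA); the quick numeric check below uses that assumed convention with λ = 0.632 µm.

```python
wavelength_um = 0.632          # LED central wavelength, in µm

def resolution_um(na):
    """Approximate resolvable line width for a coherent system, lambda/(2*NA)."""
    return wavelength_um / (2 * na)

raw = resolution_um(0.025)     # single-aperture NA -> ~12.6 µm (quoted: 12.4 µm)
synth = resolution_um(0.1)     # synthetic NA after scanning -> ~3.2 µm (quoted: 3.15 µm)
```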

Pixel aliasing is a limiting factor for many imaging applications [31, 32]. The experimental results shown in Fig. 2 may provide a Fourier-domain solution for this problem. In the recovery algorithm, we model the aperture using a binary pupil function; no phase factor is introduced. If the system has aberrations, we can also introduce a phase factor with different Zernike modes to compensate for them. We note that, in the reported scheme, the phase factor is different for each position of the employed aperture. We can use the adaptive correction framework [33] to find the globally optimal Zernike coefficients. The reported scheme may provide a unique computational solution for compensating aberrations in large-format, high-NA imaging systems.

[31] G. Zheng, S. A. Lee, S. Yang, and C. Yang, "Sub-pixel resolving optofluidic microscope for on-chip cell imaging," Lab Chip 10(22), 3125–3129 (2010).
[32] W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, "Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution," Opt. Express 18(11), 11181–11191 (2010).
[33] Z. Bian, S. Dong, and G. Zheng, "Adaptive system correction for robust Fourier ptychographic imaging," Opt. Express 21(26), 32400–32410 (2013).
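To illustrate, an aberrated pupil could be modeled as the binary support multiplied by a Zernike phase factor; the sketch below is hypothetical code (low-order defocus and astigmatism terms only), not the parameterization used in the paper.

```python
import numpy as np

def aberrated_pupil(n, radius, defocus=0.0, astig=0.0):
    """Binary circular pupil times a phase factor built from low-order
    Zernike modes: defocus (2r^2 - 1) and vertical astigmatism (x^2 - y^2).
    Coefficients are in radians of wavefront error at the pupil edge."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] / float(radius)
    r2 = x**2 + y**2
    support = (r2 <= 1.0).astype(float)
    phase = defocus * (2 * r2 - 1) + astig * (x**2 - y**2)
    return support * np.exp(1j * phase)
```

In a recovery loop, the binary aperture would be replaced by a shifted copy of such a pupil, and the Zernike coefficients could then be searched per aperture position, as in the adaptive correction framework of [33].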

We also note that, in modern optical system design, there is a gap between the information capacity (i.e., the space-bandwidth product, SBP [34]) of an optical system and that of a digital recording device [35, 36]. For example, a simple closed-circuit television (CCTV) lens can provide a field-of-view of 75 mm² and a diffraction-limited resolution of 0.78 µm (characterized at 632 nm wavelength), leading to an SBP of 0.5 billion [37]. To fully sample the field transferred by this lens to its image plane, we need at least 0.5 billion effective pixels on the image sensor, which is orders of magnitude higher than the pixel count of existing CCD/CMOS image sensors. To this end, the result demonstrated in Fig. 2 may provide a solution to bridge the SBP gap between current optical elements and digital detectors. We can, for example, first match the information capacity of an optical system to that of a digital recording device by inserting a mask (a low-pass filter) at the pupil plane. Subsequently combining multiple acquired low-resolution images can then yield a wide-field, high-resolution image with a final pixel count matching the full information capacity of the optical system.

[34] A. W. Lohmann, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, "Space-bandwidth product of optical signals and systems," J. Opt. Soc. Am. A 13(3), 470–473 (1996).
[35] O. S. Cossairt, D. Miau, and S. K. Nayar, "Gigapixel computational imaging," in Computational Photography (ICCP), 2011 IEEE International Conference on (IEEE, 2011), pp. 1–8.
[36] M. Ben-Ezra, "A digital gigapixel large-format tile-scan camera," IEEE Comput. Graph. Appl. 31(1), 49–61 (2011).
[37] G. Zheng, X. Ou, and C. Yang, "0.5 gigapixel microscopy using a flatbed scanner," Biomed. Opt. Express 5(1), 1–8 (2014).
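The 0.5-billion figure can be reproduced by dividing the field-of-view by the Nyquist pixel area implied by the stated resolution (a back-of-envelope check, assuming two pixels per resolvable line width):

```python
fov_mm2 = 75.0                           # CCTV lens field-of-view
res_um = 0.78                            # diffraction-limited resolution at 632 nm
nyquist_px_um = res_um / 2               # Nyquist sampling: two pixels per line width
sbp = fov_mm2 * 1e6 / nyquist_px_um**2   # effective pixel count, ~0.5 billion
```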

Another advantage of the reported scheme is its ability to record the exiting complex wavefront, i.e., the hologram, of the sample. Unlike conventional holographic imaging techniques, the reported scheme requires no interferometric measurements and thus reduces the coherence requirement of the light source (see Appendix A for details regarding FP's required coherence conditions). We note that phase-contrast images can be derived from different apertures at the Fourier plane or from oblique incident angles [38–40]. Our work, on the other hand, aims to synthesize different apertures and recover the complex wavefront at the same time.

[38] A. B. Parthasarathy, K. K. Chu, T. N. Ford, and J. Mertz, "Quantitative phase imaging using a partitioned detection aperture," Opt. Lett. 37(19), 4062–4064 (2012).
[40] L. Tian, J. Wang, and L. Waller, "3D differential phase-contrast microscopy with computational illumination using an LED array," Opt. Lett. 39(5), 1326–1329 (2014).

In Fig. 3, we demonstrate 3D holographic refocusing with an LED light source. The experimental setup is the same as before, but the imaged sample is an axially tilted microscope slide (corn stem cell, B&H). Figure 3(a) shows a raw intensity image captured by the prototype setup. Figure 3(b) shows the recovered complex wavefront exiting the sample. Similar to other holographic imaging techniques, we can propagate the recorded hologram to any plane along the optical axis. Figures 3(c1)-3(c4) show intensity images of the recovered field after digitally refocusing it to different axial locations. For example, in Fig. 3(c1) we propagated the recovered hologram by −1.3 mm axially. As the sample is tilted with respect to the optical axis, different parts of the sample are brought into focus in different spatial regions, as highlighted by the red arrows in Figs. 3(c1)-3(c4).

Fig. 3. Demonstration of 3D holographic refocusing using the aperture-scanning FP scheme. (a) The raw intensity image of a tilted slide. (b) The aperture-scanning FP recovered intensity and phase images. The recovered sections at (c1) z = −1300 µm, (c2) z = −1000 µm, (c3) z = −700 µm, and (c4) z = −400 µm.
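Digital refocusing of a recovered hologram can be carried out with the standard angular spectrum method. The sketch below is a hypothetical implementation (square grids, evanescent components suppressed), not the authors' code.

```python
import numpy as np

def propagate(field, dz_um, wavelength_um, pixel_um):
    """Propagate a complex wavefront by dz along the optical axis using
    the angular spectrum method; a negative dz back-propagates."""
    n = field.shape[0]
    k = 2 * np.pi / wavelength_um
    f = np.fft.fftfreq(n, d=pixel_um)          # spatial frequencies (1/µm)
    fx2, fy2 = np.meshgrid(f**2, f**2, indexing="ij")
    kz2 = k**2 - (2 * np.pi) ** 2 * (fx2 + fy2)
    kz = np.sqrt(np.maximum(kz2, 0.0))         # suppress evanescent components
    H = np.exp(1j * kz * dz_um)                # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Under these assumed units, a call like `propagate(hologram, -1300, 0.632, 5.5)` would produce a refocused section analogous to Fig. 3(c1).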

We also tested the reported scheme using an extended 3D object (a spider's leg). Figure 4(a) shows the raw intensity image of this sample, and Fig. 4(b) shows its recovered FP hologram. We then back-propagated this recovered hologram to different planes along the optical axis. Figure 4(c) shows the intensity of the recovered hologram propagated to different axial sections of the sample (also refer to Media 1). We note that each recovered section contains information from the entire extended object; out-of-focus information is superimposed on the in-focus information (such a feature can also be found in other holographic imaging techniques [41, 42]). Much like a through-focus image stack from a conventional microscope setup, this data still contains useful information regarding the sample's three-dimensional structure.

Fig. 4. (Media 1) Holographic refocusing of an extended 3D object. (a) The raw intensity image of a 3D object (leg of a spider). (b) The aperture-scanning FP recovered intensity and phase images. The recovered sections after digital propagation to (c1) z = −500 µm, (c2) z = −150 µm, (c3) z = +150 µm, and (c4) z = +500 µm.

[41] M. Daneshpanah and B. Javidi, "Tracking biological microorganisms in sequence of 3D holographic microscopy images," Opt. Express 15(17), 10761–10766 (2007).
[42] M. Daneshpanah, S. Zwick, F. Schaal, M. Warber, B. Javidi, and W. Osten, "3D holographic imaging and trapping for non-invasive cell identification and tracking," J. Disp. Technol. 6, 490–499 (2010).

3. Macroscopic imaging beyond the diffraction limit via camera-scanning FP

The key innovation of the reported scheme is to impose a constraint at a Fourier conjugate plane within an imaging system. This simple concept may be directly implemented outside of a 4f system in a macroscopic imaging platform. Figure 5(a) demonstrates a camera-scanning FP scheme, where the object is placed at the far field and the camera is scanned over different x-y positions to acquire images corresponding to different passbands. We note that far-field propagation is equivalent to performing a Fourier transform of the light field. Therefore, the aperture of the camera lens naturally serves as a support constraint in Fourier space. By scanning the entire camera over different x-y positions, we are able to synthesize a large passband in Fourier space, and thus bypass the resolution limit imposed by the photographic lens.

Fig. 5. Experimental setup of the camera-scanning FP approach. (a) The sample is placed at the far field, and the aperture of the camera lens naturally serves as a support constraint at the sample's Fourier conjugate plane. By scanning the entire camera over different x-y positions, we can synthesize a larger passband in Fourier space, enabling super-resolution imaging of the object. (b) The USAF target is placed at the far field, and a 2D motion stage is used to scan the entire camera assembly through the x-y plane.

Figure 5(b) shows the prototype camera-scanning FP setup. A USAF resolution target is used to characterize its imaging performance, and the entire camera is scanned through the x-y plane. We used a CCD camera with a 5.5 µm pixel size and a 50 mm Nikon photographic lens with a fixed f-number of 16 (we chose this f-number to avoid the pixel aliasing problem of the image sensor; a smaller f-number can be used with a smaller pixel size). The same single LED as in Section 2 is used as the illumination source, with a central wavelength of 632 nm. The step size of the mechanical scan is 1.2 mm in x and y, and we scanned over 7 x 7 locations in the included experiment.
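From these numbers, the expected resolution gain can be estimated: the physical pupil diameter is the focal length divided by the f-number, and the synthesized pupil spans the scan range plus one pupil diameter. This is a rough geometric estimate of our own, not a figure from the paper.

```python
focal_mm = 50.0
f_number = 16.0
pupil_mm = focal_mm / f_number        # physical aperture diameter, ~3.1 mm
step_mm, n_pos = 1.2, 7
travel_mm = step_mm * (n_pos - 1)     # 7.2 mm scan span per axis
synthetic_mm = travel_mm + pupil_mm   # synthesized aperture diameter
gain = synthetic_mm / pupil_mm        # expected resolution improvement, ~3.3x
```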

Figure 6 demonstrates the camera-scanning FP setup's imaging performance. Figure 6(a1) displays a section of one of the camera's raw images, and Fig. 6(a2) shows the magnitude of its corresponding spectrum in Fourier space (on a log scale). Figure 6(b1) displays an example FP reconstruction from all 49 images, while the corresponding magnitude of this reconstructed image's spectrum is shown in Fig. 6(b2). It is clear that the reported scheme is able to recover an image with a resolution better than that of the photographic lens. For example, Fig. 6(b1) contains a resolved digit '4', which is impossible to discern within the raw image. However, a number of artifacts persist in this reconstruction. One source of artifacts may be the inadequate coherence of the illumination provided by the light source, which is detailed in Appendix A. Another source may be incorrect modeling of the aperture shape (we use a circular pupil in our FP reconstruction, while it should be an irregular pentagon corresponding to the iris diaphragm of the Nikon photographic lens). To reconstruct an improved-quality FP image, we can simultaneously recover both the high-resolution image and the irregular pupil function's shape [43].

Fig. 6. Imaging performance of the camera-scanning FP approach. (a1) The raw image of the object directly captured by the camera. (a2) The spectrum of (a1). (b1) The recovered image using the camera-scanning FP scheme. Group 2, element 4 can be clearly resolved. (b2) The recovered spectrum of (b1). The remaining artifacts of the recovered image may be due to incorrect modeling of the aperture shape and partial coherence effects of the light source.

[43] X. Ou, G. Zheng, and C. Yang, "Embedded pupil function recovery for Fourier ptychographic microscopy," Opt. Express 22(5), 4960–4972 (2014).

Synthetic aperture imaging is a super-resolution technique originally developed for radio telescopes [46, 47]. The concept has also been adopted in microscopy imaging systems in recent years [51–57]. The basic idea of this technique is to combine images from a collection of telescopes in the Fourier domain to improve the achievable resolution. This technique's data fusion process requires that each telescope measure the incoming signal's amplitude and phase. While this is simple at radio frequencies, at optical frequencies an accurate interferometry setup is required to recover the incoming light field's phase. This is why synthetic aperture imaging has been used successfully in radio astronomy since the 1950s but in optical astronomy only since the 2000s. Like synthetic aperture setups, the reported camera-scanning FP approach also expands the detected field's Fourier passband to improve the achievable resolution. However, it differs in two key regards: 1) the reported approach records only the intensity of the light field; no phase information is needed; 2) it requires overlap between apertures in successive acquisitions, which provides the data redundancy our recovery algorithm needs to estimate the missing phase and is important to FP's successful convergence, whereas the synthetic aperture technique does not require any redundant data.

[46] M. Ryle and A. Hewish, "The synthesis of large radio telescopes," Mon. Not. R. Astron. Soc. 120, 220 (1960).
[47] A. B. Meinel, "Aperture synthesis using independent telescopes," Appl. Opt. 9(11), 2501 (1970).
[51] V. Mico, Z. Zalevsky, and J. García, "Superresolution optical system by common-path interferometry," Opt. Express 14(12), 5168–5177 (2006).
[57] T. R. Hillman, T. Gutzler, S. A. Alexandrov, and D. D. Sampson, "High-resolution, wide-field object reconstruction with synthetic aperture Fourier holographic optical microscopy," Opt. Express 17(10), 7873–7892 (2009).

Integral imaging [48–50], or light field imaging [58–61], is a multi-perspective imaging technique that uses multiple cameras to record multiple images of a scene from different perspectives. The acquired images are then shifted and added to perform 3D refocusing. Similar to integral imaging, camera-scanning FP also captures multiple perspective images of a sample. However, integral imaging and our approach differ in two key regards. First, integral imaging's resolution is still determined by the aperture size of a single photographic lens, whereas camera-scanning FP bypasses the resolution limit of the photographic lens. Second, integral and light field imaging work with incoherent illumination, while our approach is limited to coherent (or partially coherent) imaging settings [62].

[48] X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, "Advances in three-dimensional integral imaging: sensing, display, and applications [Invited]," Appl. Opt. 52(4), 546–560 (2013).
[50] J.-S. Jang and B. Javidi, "Three-dimensional synthetic aperture integral imaging," Opt. Lett. 27(13), 1144–1146 (2002).
[58] V. Vaish, M. Levoy, R. Szeliski, C. L. Zitnick, and S. B. Kang, "Reconstructing occluded surfaces using synthetic apertures: stereo, focus and robust measures," in Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 2 (IEEE Computer Society, 2006), pp. 2331–2338.
[61] G. Zheng, C. Kolner, and C. Yang, "Microscopy refocusing and dark-field imaging by using a simple LED array," Opt. Lett. 36(20), 3987–3989 (2011).
[62] A. Wax, Coherent Light Microscopy: Imaging and Quantitative Phase Analysis, Vol. 46 (Springer, 2011).

4. Conclusion

In conclusion, we have demonstrated an aperture-scanning Fourier ptychography scheme for 3D holographic refocusing and super-resolution macroscopic imaging. There are several advantages associated with the reported scheme. 1) It does not require any interferometric measurement. We used an LED as the light source for our demonstrations, which helps to suppress speckle noise and other coherent artifacts common to holographic imaging. 2) The reported scheme may provide a Fourier-domain solution to the pixel aliasing problem. 3) Aberrations of large-format, high-NA lenses can be modeled using coherent pupil functions. By introducing these pupil functions into the recovery process, the reported scheme may be able to compensate for these aberrations. Gigapixel microscopes can be built using high-capacity photographic lenses [37]. 4) The reported scheme extends the FP concept to macroscopic imaging settings, where objects are placed in the far field. We show that by simply scanning a camera over a defined region, a recovered image can bypass the resolution limit set by the photographic lens. The final resolution limit is determined by the traveling range of the camera, not the lens's aperture size.

There are two limitations associated with the reported approach. 1) The mechanical scanning used in our prototype setups is a limiting factor for high-throughput applications. However, we note that, for the aperture-scanning scheme, the scanning process can be implemented using a spatial light modulator or an array of MEMS mirrors. For the camera-scanning scheme, a camera array similar to multi-camera integral imaging setups [48–50] can remove this limitation. The ultimate throughput will then be determined by how fast data can be transferred from the image sensor to the computer. 2) The use of coherent (or partially coherent) illumination in the reported approach may impose a limit on potential applications. For example, the reported approach cannot be used for photographic imaging with outdoor ambient light. If we want to image an object far away from the lens, a collimated laser beam is needed for illumination, and a laser-line filter is needed to filter out the ambient light components from the environment. The coherence requirement of the reported scheme is an interesting topic and requires further investigation.

Statement of competing financial interests

Appendix A: Partial coherence of aperture-scanning Fourier ptychography

Both the aperture- and camera-scanning FP setups assume each captured image's optical field is sufficiently coherent (spatially and temporally). Our successful implementation using LED illumination demonstrates that a high-coherence laser source is not required. In the following, we detail exactly how the illumination source's spatial and temporal coherence impacts our measured data. We conclude that both forms of incoherence must be taken into account during FP system design, but do not significantly impact the accuracy of our demonstrations.

Spatial Coherence: To begin, we will assume that our LED is emitting a quasi-monochromatic field from a finite source area A, within which the field is completely incoherent. Considering a 1D system for simplicity, we can express the statistical nature of light at the LED source plane L using the cross-spectral density (CSD) function CL as,
$$C_L(r_1, r_2) = \gamma^2 A(r_1, \omega)\,\delta(r_1 - r_2), \qquad (1)$$
where $(r_1, r_2)$ represent spatial coordinates at the LED source plane, $A$ is the geometric shape of the source intensity for each frequency $\omega = 2\pi c/\lambda$, $\gamma$ is the spatial coherence cross section, and $\delta$ is a Dirac delta function. The LED's light first propagates a distance $z$ to the sample plane S. From the Van Cittert-Zernike theorem, the CSD function $C_S$ directly before the sample is

$$C_S(r_1', r_2') \approx \int A(r)\, e^{\frac{2\pi j}{\lambda z} r (r_1' - r_2')}\, dr = \hat{A}(r_1' - r_2'), \qquad (2)$$

where $(r_1', r_2')$ are spatial coordinates at the sample plane, $\hat{A}$ is the Fourier transform of the source shape $A$ from Eq. (1), and the approximation drops a quadratic phase factor for simplicity. Assuming $A$ is a circular aperture with diameter $w$, Eq. (2) leads to the often-used metric of coherence length, $l_c = 1.22\lambda z/w$, which is the width of $\hat{A}$'s primary lobe and the length over which we may consider the optical field to be a single deterministic function. Within this range, the field obeys propagation's well-known Fourier transforming property. In both the aperture- and camera-scanning FP setups, we use $w$ = 150 µm, $\lambda$ = 632 nm and $z$ = 20 cm to find an approximate $l_c$ = 1 mm. The aperture-scanning data in Figs. 2-4 (FOV less than 1 mm) may thus be considered a coherent field, satisfying our FP reconstruction assumptions. In our camera-scanning setup, the lens is placed in the far field, ~70 cm away from the sample. Assuming that the sample does not significantly modify the coherence property, the transverse coherence length is then ~3.5 mm at the aperture plane of the Nikon lens. The aperture we used is about 3.1 mm across, smaller than the coherent area. Therefore, the light field remains correlated within each acquisition.
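As a quick numerical check of the coherence-length estimates above, a minimal sketch using only the values quoted in the text:

```python
# Transverse coherence length l_c = 1.22 * lambda * z / w, evaluated with the
# parameter values quoted in the text (w = 150 um LED, lambda = 632 nm).

def coherence_length(wavelength, z, w):
    """Primary-lobe width of the source shape's Fourier transform."""
    return 1.22 * wavelength * z / w

wavelength = 632e-9   # illumination wavelength (m)
w = 150e-6            # LED source diameter (m)

# Aperture-scanning setup: source-to-sample distance z = 20 cm
lc_sample = coherence_length(wavelength, 0.20, w)
print(f"l_c at sample plane:   {lc_sample * 1e3:.2f} mm")   # ~1 mm

# Camera-scanning setup: sample-to-lens distance ~70 cm
lc_aperture = coherence_length(wavelength, 0.70, w)
print(f"l_c at aperture plane: {lc_aperture * 1e3:.2f} mm")  # ~3.6 mm > 3.1 mm aperture
```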

If the sample is larger than the coherent region, it is beneficial to split the captured images into smaller coherent tiles (e.g., 1 mm tiles), apply the FP algorithm separately to each tile, and then re-combine the resolution-enhanced tiles into a final reconstruction. This procedure, also employed in [1], lets us assume that each tile contains a fully coherent field and yields an accurate resolution-enhanced image over the full FOV without loss of generality.
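The tile-and-recombine procedure can be sketched as follows; here `fp_reconstruct` is a hypothetical stand-in for the per-tile Fourier ptychographic recovery, not the actual algorithm:

```python
# Split a captured image into coherent tiles, process each independently, and
# reassemble. The per-tile reconstruction function is a placeholder.
import numpy as np

def process_in_tiles(image, tile, fp_reconstruct):
    """Apply fp_reconstruct to each tile x tile patch and reassemble."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            patch = image[r:r + tile, c:c + tile]
            out[r:r + tile, c:c + tile] = fp_reconstruct(patch)
    return out

# Sanity check: with an identity "reconstruction", reassembly returns the input.
img = np.random.rand(8, 8)
assert np.allclose(process_in_tiles(img, 4, lambda p: p), img)
```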

It is straightforward to show that imperfect spatial coherence degrades FP's captured data gradually. Likewise, a digital filter can help remove these negative effects to recover raw intensity images that closely match images from a perfectly spatially coherent source (given accurate knowledge of the source shape). This becomes clear if we propagate our statistical description of the partially coherent field in Eq. (2) through the rest of the optical system. Assuming a thin sample, the light's CSD after the sample is $C_S(r_1, r_2)\, s(r_1)\, s^*(r_2) = \hat{A}(r_1 - r_2)\, s(r_1)\, s^*(r_2)$. Neglecting coordinate scaling factors for simplicity, this CSD is transformed by both the scanning and shifting setups to the image plane via a coherent transfer function [65],
$$C_I(r_1, r_2) = \iint \hat{A}(r_1' - r_2')\, s(r_1')\, s^*(r_2')\, \hat{a}(r_1' - r_1)\, \hat{a}^*(r_2' - r_2)\, dr_1'\, dr_2', \qquad (3)$$
where $\hat{a}$ is the Fourier transform of the finite imaging aperture. Given that FP only detects the intensity at $r = r_1 = r_2$, and spatially shifts the aperture $a$ by a finite off-axis distance $x$ to turn its Fourier transform into $\hat{a}(r)\, e^{jkxr}$, we may rewrite Eq. (3) to express the detected intensity as a function of both these variables:
$$I(r, x) = \iint \hat{A}(r_1' - r_2')\, s(r_1')\, s^*(r_2')\, \hat{a}(r_1' - r)\, \hat{a}^*(r_2' - r)\, e^{jkx(r_1' - r_2')}\, dr_1'\, dr_2'. \qquad (4)$$
Equation (4) is identical in form to the partially coherent description of the FP microscope's captured data set in [53] (i.e., a spatially offset partially coherent source is equivalent to a shifted imaging aperture). Several manipulations verify that the only difference between Eq. (4) and the description of a fully coherent FP setup is its inclusion of a non-negligible $\hat{A}(r_1' - r_2')$. As demonstrated in [53], it is possible to remove the effects of $\hat{A}(r_1' - r_2')$ from Eq. (4) via post-processing. A deconvolution operation can recover the same data that a fully coherent FP setup would capture, assuming the source shape $A$ is known and noise is negligible.
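The post-processing step can be illustrated with a generic frequency-domain (Wiener-style) deconvolution; this is a sketch of removing a known blur kernel, not the specific procedure of [53]:

```python
# Generic Wiener-style deconvolution: given a measurement blurred by a known
# kernel (standing in for the known source-shape factor), recover the
# underlying "coherent" data. Illustrative only; assumes periodic boundaries.
import numpy as np

def wiener_deconvolve(measured, kernel, eps=1e-6):
    """Divide out the kernel's spectrum with Tikhonov regularization eps."""
    H = np.fft.fft(kernel, n=len(measured))
    M = np.fft.fft(measured)
    return np.real(np.fft.ifft(M * np.conj(H) / (np.abs(H) ** 2 + eps)))

# Forward model: circular convolution of a test signal with a short kernel.
signal = np.zeros(64)
signal[20], signal[35] = 1.0, 0.5
kernel = np.array([0.5, 0.3, 0.2])  # stand-in blur, no spectral zeros
measured = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, n=64)))

recovered = wiener_deconvolve(measured, kernel)
print(np.max(np.abs(recovered - signal)))  # small residual
```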

Finally, we note that a mixed-state formulation can also be used to model partially coherent effects in FP platforms [3]. The finite extent of the light source can be modeled as multiple independent point sources, and its finite spectrum can be modeled as multiple sources emitting at different wavelengths.
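A minimal 1D toy sketch of this mixed-state forward model (illustrative only, not the decomposition algorithm of [3]): the captured image is an incoherent sum of coherent intensities, one per independent source point:

```python
# Mixed-state forward model: each independent source point illuminates the
# (1D) sample with a different tilt; the detector records the incoherent sum
# of the resulting coherent intensities. All quantities are illustrative.
import numpy as np

n = 128
x = np.arange(n)
sample = np.exp(1j * 0.3 * np.sin(2 * np.pi * x / n))  # toy complex sample

def coherent_image(field, aperture):
    """Low-pass filter the exit field through the pupil; return intensity."""
    spectrum = np.fft.fftshift(np.fft.fft(field))
    return np.abs(np.fft.ifft(np.fft.ifftshift(spectrum * aperture))) ** 2

aperture = (np.abs(np.arange(n) - n // 2) < 10).astype(float)  # pupil mask

# Three source points -> three tilted plane-wave illuminations (mixed states)
tilts = [-1, 0, 1]
total = sum(coherent_image(sample * np.exp(2j * np.pi * t * x / n), aperture)
            for t in tilts)
print(total.shape, total.min() >= 0)  # incoherent sum is non-negative
```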

Temporal Coherence: The LED source's finite temporal coherence may limit the FP algorithm's maximum achievable resolution and should be taken into account during system design. While minimally important in our setup (as detailed below), a simple analysis helps determine an optimal system focal length and scan distance. The optical field at the 4f setup's Fourier plane spatially scales with wavelength, causing FP's shifting aperture to filter different spatial frequencies as a function of wavelength. This uneven filtering increases with the shifted aperture's distance from the optical axis. Thus, for a maximum allowable fractional disparity in LED wavelength $\lambda_{min}/\lambda_{max}$, focal length $f$, and fixed sub-aperture width $w$, a simple trigonometric relationship establishes a maximum off-axis aperture distance $d_{max}$:

$$\frac{\sin\!\left(\tan^{-1}\!\left((d_{max} - w/2)/f\right)\right)}{\sin\!\left(\tan^{-1}\!\left((d_{max} + w/2)/f\right)\right)} = \frac{\lambda_{min}}{\lambda_{max}}. \qquad (5)$$
This defines the FP setup's maximum achievable numerical aperture and hence its resolution limit. For example, our shifted-aperture setup uses $\lambda_{min}/\lambda_{max}$ = 0.96 (from a 25 nm LED bandwidth at 632 nm) and $w$ = 2.5 mm. A simple calculation establishes $d_{max}$ = 38 mm, much larger than the demonstrated $d$ = 7 mm. Similar analyses may prove important when considering illumination sources with greater bandwidth.
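Eq. (5) can also be solved numerically for $d_{max}$ by bisection. A minimal sketch, assuming an illustrative 4f focal length of f = 50 mm (the focal length is not restated in this appendix); with that assumption the solver happens to land near the quoted $d_{max}$ = 38 mm:

```python
# Numerically solve Eq. (5) for d_max. lambda_min/lambda_max = 0.96 and
# w = 2.5 mm follow the text; f = 50 mm is an assumed illustrative value.
import math

def band_ratio(d, w, f):
    """Left-hand side of Eq. (5) for an aperture centered at distance d."""
    lo = math.sin(math.atan((d - w / 2) / f))
    hi = math.sin(math.atan((d + w / 2) / f))
    return lo / hi

def solve_dmax(ratio_target, w, f, d_hi=1000.0):
    """Bisect on d: the ratio grows monotonically from 0 toward 1 with d."""
    lo, hi = w / 2 + 1e-9, d_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if band_ratio(mid, w, f) < ratio_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

d_max = solve_dmax(0.96, w=2.5, f=50.0)  # all lengths in mm
print(f"d_max = {d_max:.1f} mm, ratio = {band_ratio(d_max, 2.5, 50.0):.4f}")
```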

Acknowledgments

We thank Prof. Changhuei Yang for helpful discussions and for the use of his motion controller. Huolin Xin acknowledges support from the Center for Functional Nanomaterials, Brookhaven National Laboratory, which is supported by the U.S. Department of Energy, Office of Basic Energy Sciences, under Contract No. DE-AC02-98CH10886. For more information on Fourier ptychography, please visit the Smart Imaging Lab at UConn: https://sites.google.com/site/gazheng/.

References and links

1. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013).
2. X. Ou, R. Horstmeyer, C. Yang, and G. Zheng, “Quantitative phase imaging via Fourier ptychographic microscopy,” Opt. Lett. 38(22), 4845–4848 (2013).
3. S. Dong, R. Shiradkar, P. Nanda, and G. Zheng, “Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging,” Biomed. Opt. Express 5(6), 1757–1767 (2014).
4. R. Gerchberg, “A practical algorithm for the determination of phase from image and diffraction plane pictures,” Optik (Stuttg.) 35, 237 (1972).
5. J. R. Fienup, “Reconstruction of an object from the modulus of its Fourier transform,” Opt. Lett. 3(1), 27–29 (1978).
6. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982).
7. R. A. Gonsalves, “Phase retrieval and diversity in adaptive optics,” Opt. Eng. 21, 215829 (1982).
8. R. A. Gonsalves, “Phase retrieval by differential intensity measurements,” J. Opt. Soc. Am. A 4(1), 166–170 (1987).
9. L. Allen and M. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. 199(1-4), 65–75 (2001).
10. V. Elser, “Phase retrieval by iterated projections,” J. Opt. Soc. Am. A 20(1), 40–55 (2003).
11. L. Waller, S. S. Kou, C. J. R. Sheppard, and G. Barbastathis, “Phase from chromatic aberrations,” Opt. Express 18(22), 22817–22825 (2010).
12. C.-H. Lu, C. Barsi, M. O. Williams, J. N. Kutz, and J. W. Fleischer, “Phase retrieval using nonlinear diversity,” Appl. Opt. 52(10), D92–D96 (2013).
13. L. Taylor, “The phase retrieval problem,” IEEE Trans. Antennas Propag. 29(2), 386–391 (1981).
14. B. H. Dean and C. W. Bowers, “Diversity selection for phase-diverse phase retrieval,” J. Opt. Soc. Am. A 20(8), 1490–1504 (2003).
15. H. M. L. Faulkner and J. M. Rodenburg, “Movable Aperture Lensless Transmission Microscopy: A Novel Phase Retrieval Algorithm,” Phys. Rev. Lett. 93(2), 023903 (2004).
16. M. Guizar-Sicairos and J. R. Fienup, “Phase retrieval with transverse translation diversity: a nonlinear optimization approach,” Opt. Express 16(10), 7264–7278 (2008).
17. P. Thibault, M. Dierolf, A. Menzel, O. Bunk, C. David, and F. Pfeiffer, “High-Resolution Scanning X-Ray Diffraction Microscopy,” Science 321(5887), 379–382 (2008).
18. P. Thibault, M. Dierolf, O. Bunk, A. Menzel, and F. Pfeiffer, “Probe retrieval in ptychographic coherent diffractive imaging,” Ultramicroscopy 109(4), 338–343 (2009).
19. M. Dierolf, P. Thibault, A. Menzel, C. M. Kewish, K. Jefimovs, I. Schlichting, K. von König, O. Bunk, and F. Pfeiffer, “Ptychographic coherent diffractive imaging of weakly scattering specimens,” New J. Phys. 12(3), 035017 (2010).
20. A. M. Maiden, J. M. Rodenburg, and M. J. Humphry, “Optical ptychography: a practical implementation with useful resolution,” Opt. Lett. 35(15), 2585–2587 (2010).
21. F. Hüe, J. M. Rodenburg, A. M. Maiden, and P. A. Midgley, “Extended ptychography in the transmission electron microscope: Possibilities and limitations,” Ultramicroscopy 111(8), 1117–1123 (2011).
22. A. Shenfield and J. M. Rodenburg, “Evolutionary determination of experimental parameters for ptychographical imaging,” J. Appl. Phys. 109(12), 124510 (2011).
23. M. J. Humphry, B. Kraus, A. C. Hurst, A. M. Maiden, and J. M. Rodenburg, “Ptychographic electron microscopy using high-angle dark-field scattering for sub-nanometre resolution imaging,” Nat. Commun. 3, 730 (2012).
24. T. B. Edo, D. J. Batey, A. M. Maiden, C. Rau, U. Wagner, Z. D. Pešić, T. A. Waigh, and J. M. Rodenburg, “Sampling in x-ray ptychography,” Phys. Rev. A 87(5), 053850 (2013).
25. S. Marchesini, A. Schirotzek, C. Yang, H.- Wu, and F. Maia, “Augmented projections for ptychographic imaging,” Inverse Probl. 29(11), 115009 (2013).
26. W. Hoppe and G. Strube, “Diffraction in inhomogeneous primary wave fields. 2. Optical experiments for phase determination of lattice interferences,” Acta Crystallogr. A 25, 502–507 (1969).
27. J. M. Rodenburg and R. H. T. Bates, “The Theory of Super-Resolution Electron Microscopy Via Wigner-Distribution Deconvolution,” Philos. Trans. R. Soc. A 339(1655), 521–553 (1992).
28. G. Zheng, “Fourier ptychographic imaging,” IEEE Photonics Journal 6, April Issue (2014).
29. G. Zheng, X. Ou, R. Horstmeyer, J. Chung, and C. Yang, “Fourier Ptychographic Microscopy: A Gigapixel Superscope for Biomedicine,” Opt. Photon. News 25, 26–33 (2014).
30. C.-K. Liang, T.-H. Lin, B.-Y. Wong, C. Liu, and H. H. Chen, “Programmable aperture photography: multiplexed light field acquisition,” ACM Trans. Graph. 27(3), 1–10 (2008).
31. G. Zheng, S. A. Lee, S. Yang, and C. Yang, “Sub-pixel resolving optofluidic microscope for on-chip cell imaging,” Lab Chip 10(22), 3125–3129 (2010).
32. W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18(11), 11181–11191 (2010).
33. Z. Bian, S. Dong, and G. Zheng, “Adaptive system correction for robust Fourier ptychographic imaging,” Opt. Express 21(26), 32400–32410 (2013).
34. A. W. Lohmann, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, “Space-bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A 13(3), 470–473 (1996).
35. O. S. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel computational imaging,” in IEEE International Conference on Computational Photography (ICCP) (IEEE, 2011), pp. 1–8.
36. M. Ben-Ezra, “A digital gigapixel large-format tile-scan camera,” IEEE Comput. Graph. Appl. 31(1), 49–61 (2011).
37. G. Zheng, X. Ou, and C. Yang, “0.5 gigapixel microscopy using a flatbed scanner,” Biomed. Opt. Express 5(1), 1–8 (2014).
38. A. B. Parthasarathy, K. K. Chu, T. N. Ford, and J. Mertz, “Quantitative phase imaging using a partitioned detection aperture,” Opt. Lett. 37(19), 4062–4064 (2012).
39. I. Iglesias, “Pyramid phase microscopy,” Opt. Lett. 36(18), 3636–3638 (2011).
40. L. Tian, J. Wang, and L. Waller, “3D differential phase-contrast microscopy with computational illumination using an LED array,” Opt. Lett. 39(5), 1326–1329 (2014).
41. M. Daneshpanah and B. Javidi, “Tracking biological microorganisms in sequence of 3D holographic microscopy images,” Opt. Express 15(17), 10761–10766 (2007).
42. M. Daneshpanah, S. Zwick, F. Schaal, M. Warber, B. Javidi, and W. Osten, “3D Holographic Imaging and Trapping for Non-Invasive Cell Identification and Tracking,” J. Display Technol. 6, 490–499 (2010).
43. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22(5), 4960–4972 (2014).
44. S. C. Park, M. K. Park, and M. G. Kang, “Super-resolution image reconstruction: a technical overview,” IEEE Signal Processing Mag. 20(3), 21–36 (2003).
45. J. C. Gillette, T. M. Stadtmiller, and R. C. Hardie, “Aliasing reduction in staring infrared imagers utilizing subpixel techniques,” Opt. Eng. 34(11), 3130–3137 (1995).
46. M. Ryle and A. Hewish, “The synthesis of large radio telescopes,” Mon. Not. R. Astron. Soc. 120, 220 (1960).
47. A. B. Meinel, “Aperture synthesis using independent telescopes,” Appl. Opt. 9(11), 2501 (1970).
48. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013).
49. A. Stern and B. Javidi, “3-D computational synthetic aperture integral imaging (COMPSAII),” Opt. Express 11(19), 2446–2451 (2003).
50. J.-S. Jang and B. Javidi, “Three-dimensional synthetic aperture integral imaging,” Opt. Lett. 27(13), 1144–1146 (2002).
51. V. Mico, Z. Zalevsky, and J. García, “Superresolution optical system by common-path interferometry,” Opt. Express 14(12), 5168–5177 (2006).
52. J. García, Z. Zalevsky, and D. Fixler, “Synthetic aperture superresolution by speckle pattern projection,” Opt. Express 13(16), 6073–6078 (2005).
53. V. Mico, Z. Zalevsky, P. García-Martínez, and J. García, “Synthetic aperture superresolution with multiple off-axis holograms,” J. Opt. Soc. Am. A 23(12), 3162–3170 (2006).
54. V. Mico, Z. Zalevsky, P. Garcia-Martinez, and J. Garcia, “Single-step superresolution by interferometric imaging,” Opt. Express 12(12), 2589–2596 (2004).
55. S. A. Alexandrov, T. R. Hillman, T. Gutzler, and D. D. Sampson, “Synthetic aperture Fourier holographic optical microscopy,” Phys. Rev. Lett. 97(16), 168102 (2006).
56. J. Di, J. Zhao, H. Jiang, P. Zhang, Q. Fan, and W. Sun, “High resolution digital holographic microscopy with a wide field of view based on a synthetic aperture technique and use of linear CCD scanning,” Appl. Opt. 47(30), 5654–5659 (2008).
57. T. R. Hillman, T. Gutzler, S. A. Alexandrov, and D. D. Sampson, “High-resolution, wide-field object reconstruction with synthetic aperture Fourier holographic optical microscopy,” Opt. Express 17(10), 7873–7892 (2009).
58. V. Vaish, M. Levoy, R. Szeliski, C. L. Zitnick, and S. B. Kang, “Reconstructing Occluded Surfaces Using Synthetic Apertures: Stereo, Focus and Robust Measures,” in Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE Computer Society, 2006), pp. 2331–2338.
59. B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24(3), 765–776 (2005).
60. B. S. Wilburn, M. Smulski, H.-H. K. Lee, and M. A. Horowitz, “Light field video camera,” in Electronic Imaging 2002 (International Society for Optics and Photonics, 2001), pp. 29–36.
61. G. Zheng, C. Kolner, and C. Yang, “Microscopy refocusing and dark-field imaging by using a simple LED array,” Opt. Lett. 36(20), 3987–3989 (2011).
62. A. Wax, Coherent Light Microscopy: Imaging and Quantitative Phase Analysis (Springer, 2011), Vol. 46.
63. G. Zheng, X. Ou, R. Horstmeyer, and C. Yang, “Characterization of spatially varying aberrations for wide field-of-view microscopy,” Opt. Express 21(13), 15131–15143 (2013).
64. D. B. Williams and C. B. Carter, The Transmission Electron Microscope (Springer, 1996).
65. D. J. Brady, Optical Imaging and Spectroscopy (John Wiley & Sons, 2009).

OCIS Codes
(100.3190) Image processing : Inverse problems
(100.6640) Image processing : Superresolution
(110.0110) Imaging systems : Imaging systems
(170.0180) Medical optics and biotechnology : Microscopy
(090.1995) Holography : Digital holography

ToC Category:
Imaging Systems

History
Original Manuscript: April 3, 2014
Revised Manuscript: May 14, 2014
Manuscript Accepted: May 15, 2014
Published: May 29, 2014

Virtual Issues
Vol. 9, Iss. 8 Virtual Journal for Biomedical Optics

Citation
Siyuan Dong, Roarke Horstmeyer, Radhika Shiradkar, Kaikai Guo, Xiaoze Ou, Zichao Bian, Huolin Xin, and Guoan Zheng, "Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging," Opt. Express 22, 13586-13599 (2014)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-22-11-13586



Supplementary Material


» Media 1: MP4 (10695 KB)     
