Optics Express

  • Editor: C. Martijn de Sterke
  • Vol. 20, Iss. 6 — Mar. 12, 2012
  • pp: 6561–6574

Depth measurements through controlled aberrations of projected patterns

Gabriel C. Birch, J. Scott Tyo, and Jim Schwiegerling


http://dx.doi.org/10.1364/OE.20.006561


Abstract

Three-dimensional displays have become increasingly common in consumer markets. However, the ability to capture three-dimensional images in space-confined environments, without major modifications to current cameras, remains uncommon. Our goal is to create a simple modification to a conventional camera that allows three-dimensional reconstruction. We require that such an imaging system have coincident imaging and illumination paths, and that any three-dimensional modification to a camera still permit full-resolution 2D image capture.

Here we present a method of extracting depth information with a single camera and an aberrated projected pattern. A commercial digital camera is used in conjunction with a projector system having astigmatic focus to capture images of a scene. The astigmatic projection creates two different focus depths, one for horizontal and one for vertical features of the projected pattern, thereby encoding depth. By co-designing the aberrated pattern and the post-processing, we exploit this differential focus: the distance of an object at a particular transverse position from the camera is correlated with ratios of particular wavelet coefficients.

We present details of the construction, calibration, and images produced by this system. The nature of linking projected-pattern design and image-processing algorithms is discussed.
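The depth cue described above can be sketched in code. The idea is that an astigmatic projector focuses horizontal and vertical pattern features at different depths, so the ratio of horizontal-detail to vertical-detail energy in a local image patch varies monotonically with distance (after calibration). The following is a minimal, hypothetical illustration using hand-computed single-level Haar-style detail coefficients; the paper's actual wavelet choice, patch geometry, and calibration procedure are not reproduced here.

```python
# Illustrative sketch (not the authors' implementation): estimate a
# focus ratio from horizontal vs. vertical detail energy in a patch.

def haar_details(patch):
    """Return (h_energy, v_energy) for a 2D list-of-lists `patch`.

    Horizontal detail: differences along rows (responds to vertical edges).
    Vertical detail:   differences along columns (responds to horizontal edges).
    """
    rows, cols = len(patch), len(patch[0])
    h_energy = sum(abs(patch[r][c] - patch[r][c + 1])
                   for r in range(rows) for c in range(0, cols - 1, 2))
    v_energy = sum(abs(patch[r][c] - patch[r + 1][c])
                   for r in range(0, rows - 1, 2) for c in range(cols))
    return h_energy, v_energy

def focus_ratio(patch, eps=1e-9):
    """Ratio of detail energies. In the paper's scheme, such a ratio is
    mapped to depth via a calibration step, which is omitted here."""
    h, v = haar_details(patch)
    return h / (v + eps)

# Toy patch of sharp vertical stripes: horizontal-detail energy dominates,
# so the ratio is well above 1, mimicking one focus condition.
stripes = [[255 if c % 2 else 0 for c in range(8)] for r in range(8)]
print(focus_ratio(stripes) > 1.0)
```

In a full system, this ratio would be evaluated per projected-pattern feature across the image and converted to metric depth through a calibrated lookup, as the abstract describes.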

© 2012 OSA

OCIS Codes
(110.6880) Imaging systems : Three-dimensional image acquisition
(150.6910) Machine vision : Three-dimensional sensing

ToC Category:
Imaging Systems

History
Original Manuscript: December 16, 2011
Revised Manuscript: February 9, 2012
Manuscript Accepted: February 28, 2012
Published: March 6, 2012

Citation
Gabriel C. Birch, J. Scott Tyo, and Jim Schwiegerling, "Depth measurements through controlled aberrations of projected patterns," Opt. Express 20, 6561-6574 (2012)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-20-6-6561


