
Applied Optics

  • Editor: James C. Wyant
  • Vol. 47, Iss. 11 — Apr. 10, 2008
  • pp: 1927–1939

Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision

Min Young Kim, Hyunkee Lee, and Hyungsuck Cho  »View Author Affiliations

Applied Optics, Vol. 47, Issue 11, pp. 1927-1939 (2008)


One major research issue in 3-D perception for robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor: an active trinocular vision and a passive stereo vision. Unlike conventional active vision systems, which require a large number of images with varying projected patterns to acquire a dense range map, and conventional passive vision systems, which work well only in environments with sufficient feature information, a cooperative bidirectional sensor fusion method for this visual sensor system enables us to acquire a reliable dense range map using active and passive information simultaneously. The fusion algorithm consists of two parts: in the first, the passive stereo vision helps the active vision; in the second, the active trinocular vision helps the passive vision. The first part matches the laser patterns in the stereo laser images with the help of the intensity images; the second part applies an information fusion technique based on dynamic programming, in which the image regions between the laser patterns are matched pixel by pixel with the help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms perform in real applications, the sensor system is implemented on a robotic system and the proposed algorithms are applied. A series of experimental tests is performed for a variety of robot and environment configurations, and the performance of the sensor system is discussed in detail.
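The second fusion stage described above matches the image regions between laser stripes pixel by pixel using dynamic programming, in the spirit of classical scanline stereo matching. The paper's actual cost function incorporates the active-vision fusion results; the sketch below is only a generic illustration of dynamic-programming scanline matching under assumed costs (a squared intensity-difference match cost and a fixed occlusion penalty), not the authors' implementation.

```python
import numpy as np

def dp_scanline_match(left, right, occlusion_cost=10.0):
    """Match two 1-D intensity scanlines with dynamic programming.

    Returns a disparity estimate for each left pixel (NaN where the
    pixel was skipped as occluded). Moves: 0 = match, 1 = skip a left
    pixel, 2 = skip a right pixel.
    """
    n, m = len(left), len(right)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, :] = occlusion_cost * np.arange(m + 1)   # skip leading right pixels
    cost[:, 0] = occlusion_cost * np.arange(n + 1)   # skip leading left pixels
    move = np.zeros((n + 1, m + 1), dtype=np.uint8)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = cost[i - 1, j - 1] + (left[i - 1] - right[j - 1]) ** 2
            skip_l = cost[i - 1, j] + occlusion_cost
            skip_r = cost[i, j - 1] + occlusion_cost
            best = min(match, skip_l, skip_r)
            cost[i, j] = best
            move[i, j] = (match, skip_l, skip_r).index(best)
    # Backtrack along the optimal path to recover per-pixel disparities.
    disp = np.full(n, np.nan)
    i, j = n, m
    while i > 0 and j > 0:
        if move[i, j] == 0:
            disp[i - 1] = (i - 1) - (j - 1)
            i, j = i - 1, j - 1
        elif move[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return disp
```

In the paper's setting, the boundary conditions of such a matching would additionally be anchored by the laser-stripe correspondences already established in the first fusion stage, which is what makes the interior pixel-by-pixel matching well constrained.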

© 2008 Optical Society of America

OCIS Codes
(100.6950) Image processing : Tomographic image processing
(150.6910) Machine vision : Three-dimensional sensing

ToC Category:
Machine Vision

Original Manuscript: November 5, 2007
Revised Manuscript: February 13, 2008
Manuscript Accepted: February 14, 2008
Published: April 7, 2008

Min Young Kim, Hyunkee Lee, and Hyungsuck Cho, "Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision," Appl. Opt. 47, 1927-1939 (2008)



