Applied Optics

APPLICATIONS-CENTERED RESEARCH IN OPTICS

  • Editor: Joseph N. Mait
  • Vol. 53, Iss. 3 — Jan. 20, 2014
  • pp: 368–375

Range and egomotion estimation from compound photodetector arrays with parallel optical axis using optical flow techniques

J. S. Chahl


Applied Optics, Vol. 53, Issue 3, pp. 368-375 (2014)
http://dx.doi.org/10.1364/AO.53.000368








Abstract

This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit characteristics complementary to those of conventional perspective-projection or angular-projection imaging devices. Conventional imaging devices measure rotational egomotion directly, as the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices, because the induced image motion depends on the unknown range of the viewed object; conversely, a known translational motion generates image velocities that can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. Their approximately parallel projection can be used to measure translational egomotion directly, as the velocity of the image, while a known rotational motion of the paraxial sensor array generates image velocities that can be used to recover the 3D structure of the environment. Results from tests of an experimental array confirm these properties.

© 2014 Optical Society of America
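The complementary flow relations described in the abstract can be illustrated with a toy model. This is an editorial sketch, not code from the paper: the symbols v (translational speed transverse to the viewing direction, m/s), omega (rotation rate, rad/s), and r (range, m) and the small-angle flow equations are assumptions chosen to mirror the abstract's argument.

```python
# Toy model of the complementary flow geometry (editorial sketch, not the
# paper's implementation). v = transverse translational speed (m/s),
# omega = rotation rate (rad/s), r = range to the viewed object (m).

def perspective_flow(v, omega, r):
    """Angular image velocity (rad/s) for a perspective/angular sensor.

    Rotation maps to flow directly; translation is scaled by 1/range,
    so range must be known before v can be recovered.
    """
    return omega + v / r

def parallel_axis_flow(v, omega, r):
    """Linear image velocity (m/s) across a parallel-axis sensor array.

    Translation maps to flow directly, independent of range; rotation is
    scaled by range, so a known rotation lets range be recovered.
    """
    return v + omega * r

def range_from_rotation(measured_flow, v, omega):
    """Invert the parallel-axis relation given a known rotation rate."""
    return (measured_flow - v) / omega

# Pure rotation of the array at 0.1 rad/s viewing an object at 5 m:
flow = parallel_axis_flow(0.0, 0.1, 5.0)   # 0.5 m/s of image motion
print(range_from_rotation(flow, 0.0, 0.1))  # recovers 5.0 m
```

The two functions make the duality explicit: in `perspective_flow` the range-dependent term is the translational one, while in `parallel_axis_flow` it is the rotational one, which is why a known rotation of the paraxial array yields range.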

OCIS Codes
(150.4620) Machine vision : Optical flow
(150.5670) Machine vision : Range finding
(110.4153) Imaging systems : Motion estimation and optical flow
(330.7324) Vision, color, and visual optics : Visual optics, comparative animal models

ToC Category:
Vision, Color, and Visual Optics

History
Original Manuscript: August 14, 2013
Revised Manuscript: December 5, 2013
Manuscript Accepted: December 6, 2013
Published: January 15, 2014

Citation
J. S. Chahl, "Range and egomotion estimation from compound photodetector arrays with parallel optical axis using optical flow techniques," Appl. Opt. 53, 368-375 (2014)
http://www.opticsinfobase.org/ao/abstract.cfm?URI=ao-53-3-368




