Journal of the Optical Society of America A | Optics, Image Science, and Vision

Vol. 20, Issue 3 (March 1, 2003), pp. 450–469

Catchment areas of panoramic snapshots in outdoor scenes

Jochen Zeil, Martin I. Hofmann, and Javaan S. Chahl


JOSA A, Vol. 20, Issue 3, pp. 450-469 (2003)
http://dx.doi.org/10.1364/JOSAA.20.000450


Abstract

We took panoramic snapshots in outdoor scenes at regular intervals in two- or three-dimensional grids covering 1 m² or 1 m³ and determined how the root mean square pixel differences between each of the images and a reference image acquired at one of the locations in the grid develop over distance from the reference position. We then asked whether the reference position can be pinpointed from a random starting position by moving the panoramic imaging device in such a way that the image differences relative to the reference image are minimized. We find that on time scales of minutes to hours, outdoor locations are accurately defined by a clear, sharp minimum in a smooth three-dimensional (3D) volume of image differences (the 3D difference function). 3D difference functions depend on the spatial-frequency content of natural scenes and on the spatial layout of objects therein. They become steeper in the vicinity of dominant objects. Their shape and smoothness, however, are affected by changes in illumination and shadows. The difference functions generated by rotation are similar in shape to those generated by translation, but their plateau values are higher. Rotational difference functions change little with distance from the reference location. Simple gradient descent methods are surprisingly successful in recovering a goal location, even if faced with transient changes in illumination. Our results show that view-based homing with panoramic images is in principle feasible in natural environments and does not require the identification of individual landmarks. We discuss the relevance of our findings to the study of robot and insect homing.

© 2003 Optical Society of America
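
The procedure summarized in the abstract lends itself to a compact illustration. The sketch below is not the authors' implementation: NumPy, the capture_panorama callback, the step size, and the restriction to a horizontal search plane are assumptions introduced here purely to show how RMS pixel differences define a difference function whose minimum can be approached by simple gradient descent.

```python
# Minimal sketch of view-based homing by panoramic image differences.
# Assumptions (not from the paper): grayscale panoramic images as NumPy
# arrays, a hypothetical capture_panorama(position) callback, and a
# horizontal search plane sampled with a fixed step size.
import numpy as np


def rms_image_difference(image, reference):
    """Root mean square pixel difference between two equally sized images."""
    return np.sqrt(np.mean((image.astype(float) - reference.astype(float)) ** 2))


def difference_function(snapshots, reference):
    """Map grid positions to RMS differences from the reference snapshot.

    `snapshots` is a dict {(x, y, z): image}; the global minimum of the
    returned dict marks the reference (goal) location.
    """
    return {pos: rms_image_difference(img, reference)
            for pos, img in snapshots.items()}


def home_by_gradient_descent(capture_panorama, reference, start,
                             step=0.05, max_steps=200):
    """Greedy descent on the image-difference surface (horizontal plane only).

    From the current position, sample the difference one step along +/-x
    and +/-y and move to the neighbour that lowers the difference most;
    stop at a local minimum or after `max_steps` moves.
    """
    pos = np.asarray(start, dtype=float)
    current = rms_image_difference(capture_panorama(pos), reference)
    moves = [np.array([step, 0.0]), np.array([-step, 0.0]),
             np.array([0.0, step]), np.array([0.0, -step])]
    for _ in range(max_steps):
        candidates = [(rms_image_difference(capture_panorama(pos + m), reference), m)
                      for m in moves]
        best_diff, best_move = min(candidates, key=lambda c: c[0])
        if best_diff >= current:  # reached a (local) minimum of the difference function
            break
        pos, current = pos + best_move, best_diff
    return pos, current
```

In this sketch, the catchment area of a reference snapshot would correspond to the set of starting positions from which the greedy descent converges to the global minimum of the difference function.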

OCIS Codes
(000.4920) General : Other life sciences
(000.4930) General : Other topics of general interest
(100.2960) Image processing : Image analysis
(150.6910) Machine vision : Three-dimensional sensing
(330.5370) Vision, color, and visual optics : Physiological optics
(330.7310) Vision, color, and visual optics : Vision

History
Original Manuscript: July 24, 2002
Revised Manuscript: October 24, 2002
Manuscript Accepted: October 30, 2002
Published: March 1, 2003

Citation
Jochen Zeil, Martin I. Hofmann, and Javaan S. Chahl, "Catchment areas of panoramic snapshots in outdoor scenes," J. Opt. Soc. Am. A 20, 450-469 (2003)
http://www.opticsinfobase.org/josaa/abstract.cfm?URI=josaa-20-3-450

