Journal of the Optical Society of America A

  • Editor: Franco Gori
  • Vol. 30, Iss. 1 — Jan. 1, 2013
  • pp: 102–111

Efficient method for the determination of image correspondence in airborne applications using inertial sensors

Matthew Woods and Aggelos Katsaggelos


This paper presents a computationally efficient method for measuring a dense image correspondence vector field using supplementary data from an inertial navigation sensor (INS). The method is suited to airborne imaging systems, such as unmanned air vehicles, where size, weight, and power restrictions limit the onboard processing available. These limits typically preclude traditional, computationally expensive optical flow and block matching algorithms such as Lucas–Kanade, Horn–Schunck, or the adaptive rood pattern search. In contrast, measurements from an INS on board the platform lead to a closed-form solution for the correspondence field. Airborne platforms are well suited to this approach because they already carry INSs and global positioning systems as part of their avionics package. We derive the closed-form solution for the image correspondence vector field from the INS data, and we then show, through both simulations and real flight data, that the closed-form inertial sensor solution outperforms traditional optical flow and block matching methods.
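The closed-form solution itself is derived in the full text from the INS-reported camera pose. As a rough sketch of the underlying geometry (my own illustration under a planar-scene assumption, not the authors' derivation), a known inter-frame rotation R, translation t, and a locally planar scene with normal n at distance d induce the homography H = K(R − t nᵀ/d)K⁻¹, which maps every pixel of one frame into the next and therefore yields a dense correspondence field with no search:

```python
import numpy as np

def ins_correspondence_field(K, R, t, n, d, height, width):
    """Dense correspondence field implied by known camera motion.

    Illustrative sketch: assumes a locally planar scene (normal n at
    distance d in the first camera frame), so pixels map between frames
    via the planar homography H = K (R - t n^T / d) K^{-1}.  R and t are
    the rotation and relative translation between the two frames, as
    would be reported by an INS/GPS.  Returns per-pixel displacements
    (du, dv) in pixels.
    """
    H = K @ (R - np.outer(t, n) / d) @ np.linalg.inv(K)
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    p = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)  # homogeneous pixels
    q = p @ H.T                    # mapped homogeneous coordinates
    q = q[..., :2] / q[..., 2:3]   # perspective divide
    return q[..., 0] - u, q[..., 1] - v

# Example: identity rotation, relative translation t = (1, 0, 0) against a
# fronto-parallel plane (n = z-axis) at depth d = 100.  Every pixel then
# shifts horizontally by -fx * tx / d = -500 * 1 / 100 = -5 pixels.
K = np.array([[500., 0., 320.],
              [0., 500., 240.],
              [0., 0., 1.]])
du, dv = ins_correspondence_field(K, np.eye(3), np.array([1., 0., 0.]),
                                  np.array([0., 0., 1.]), 100., 480, 640)
```

The intrinsics K and the fronto-parallel plane here are hypothetical values for illustration; the paper's own derivation may fix the scene geometry differently (e.g., using terrain data), but the point stands that with pose known the field is computed directly rather than estimated by search or iteration.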

© 2012 Optical Society of America

OCIS Codes
(100.2980) Image processing : Image enhancement
(150.4620) Machine vision : Optical flow
(110.4153) Imaging systems : Motion estimation and optical flow
(280.4991) Remote sensing and sensors : Passive remote sensing

ToC Category:
Imaging Systems

Original Manuscript: April 16, 2012
Revised Manuscript: October 17, 2012
Manuscript Accepted: November 23, 2012
Published: December 19, 2012

Matthew Woods and Aggelos Katsaggelos, "Efficient method for the determination of image correspondence in airborne applications using inertial sensors," J. Opt. Soc. Am. A 30, 102-111 (2013)

References

  1. R. R. Schultz and R. L. Stevenson, “Extraction of high-resolution frames from video sequences,” IEEE Trans. Image Process. 5, 996–1011 (1996). [CrossRef]
  2. A. K. Katsaggelos, R. Molina, and J. Mateos, Super Resolution of Images and Video (Morgan & Claypool, 2007).
  3. B. D. Lucas and T. Kanade, “An iterative image registration technique with an application to stereo vision,” in Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI'81) (IJCAI, 1981), pp. 674–679.
  4. L. Zhang, Q. Yuan, H. Shen, and P. Li, “Multiframe image super-resolution adapted with local spatial information,” J. Opt. Soc. Am. A 28, 381–390 (2011). [CrossRef]
  5. S. S. Mokri, A. Hussain, N. Ibrahim, and M. M. Mustafa, “Motion detection using Lucas–Kanade algorithm and application enhancement,” in 2009 International Conference on Electrical Engineering and Informatics (ICEEI’09) (ICEEI, 2009), Vol. 2, pp. 537–542.
  6. B. K. P. Horn and B. G. Schunck, “Determining optical flow,” Artif. Intell. 17, 185–203 (1981). [CrossRef]
  7. S. Zhu and K. Ma, “A new diamond search algorithm for fast block matching motion estimation,” IEEE Trans. Image Process. 9, 287–290 (2000). [CrossRef]
  8. Y. Nie and K.-K. Ma, “Adaptive rood pattern search for fast block-matching motion estimation,” IEEE Trans. Image Process. 11, 1442–1449 (2002). [CrossRef]
  9. D. Goshen-Meskin and I. Y. Bar-Itzhack, “Unified approach to inertial navigation system error modeling,” J. Guid. Control Dyn. 15, 648–653 (1992). [CrossRef]
  10. C. Hide, T. Moore, and M. Smith, “Adaptive Kalman filtering algorithms for integrating GPS and low cost INS,” in Proceedings of the IEEE Position, Location, and Navigation Symposium (IEEE, 2004), pp. 227–233.
  11. I. Rhee, M. F. Abdel-Hafez, and J. L. Speyer, “Observability of an integrated GPS/INS during maneuvers,” IEEE Trans. Aerosp. Electron. Syst. 40, 526–535 (2004). [CrossRef]
  12. D. Gebre-Egziabher, R. C. Hayward, and J. D. Powell, “A low-cost GPS/inertial attitude heading reference system (AHRS) for general aviation applications,” in Proceedings of the IEEE Position, Location, and Navigation Symposium, (IEEE, 1998), pp. 518–525.
  13. D. A. Forsyth and J. Ponce, Computer Vision: A Modern Approach (Prentice-Hall, 2003).
  14. Z. Zhang, “A flexible new technique for camera calibration,” Tech. Rep. MSR-TR-98-71 (Microsoft, 2008).
  15. B. J. Lei, E. A. Hendriks, and A. K. Katsaggelos, “Camera calibration for 3D reconstruction and view transformation,” in 3D Modeling and Animation: Synthesis and Analysis Techniques for the Human Body, N. Sarris and M. G. Strintzis, eds. (IRM, 2005), pp. 70–129.
  16. “Performance specification digital terrain elevation data (DTED),” MIL-PRF-89020B (National Information Services, May 2000).
  17. K. Chen, G. Zhao, Z. Meng, J. Yan, and H. Lu, “Equivalent approaches to equations of traditional transfer alignment and rapid transfer alignment,” in Proceedings of the 7th World Congress on Intelligent Control and Automation (2008), pp. 892–895.
  18. L. Joon and L. You-Chol, “Transfer alignment considering measurement time delay and ship body flexure,” J. Mech. Sci. Tech. 23, 195–203 (2009). [CrossRef]
  19. J. Shortelle, R. Graham, and C. Rabourn, “F-16 Flight test of a rapid transfer alignment procedure,” presented at the IEEE Position, Location, and Navigation Symposium, Palm Springs, California, 20–23 April 1998.
  20. M. Veth and J. Raquet, “Alignment and calibration of optical and inertial sensors using stellar observations” (Air Force Institute of Technology, 2007).
  21. G. Fasano, D. Accardo, A. Moccia, and A. Rispoli, “An innovative procedure for calibration of strapdown electro-optical sensors onboard unmanned air vehicles,” Sensors 10, 639–654 (2010). [CrossRef]
  22. J. Shi and C. Tomasi, “Good features to track,” in 1994 IEEE Conference on Computer Vision and Pattern Recognition (CVPR’94) (IEEE Computer Society, 1994), pp. 593–600.
  23. C. Tomasi and T. Kanade, “Shape and motion from image streams: a factorization method: full report on the orthographic case,” CMU Tech. Rep. CMU-CS-92-104, March 1992.
  24. Silver Fox, http://www.satnews.com/cgi-bin/story.cgi?number=1554529420 .
  25. B. L. Stevens and F. L. Lewis, Aircraft Control and Simulation (Wiley, 1992).
  26. J. H. Blakelock, Automatic Control of Aircraft and Missiles (Wiley, 1991).
  27. I. Lizarraga, “Autonomous landing system for a UAV,” Master’s thesis (Naval Post Graduate School, 2004).
  28. Google Earth, http://earth.google.com .
  29. J. M. Herbert, J. Keith, S. Ryan, G. Lachapelle, M. C. Szarmes, S. Jokerst, and M. E. Cannon, “DGPS kinematic carrier phase signal simulation analysis for precise aircraft velocity determination,” Navigation 44, 231–246 (1997).
  30. L. Serrano, D. Kim, and R. B. Langley, “A GPS velocity sensor: how accurate can it be?—A first look,” presented at ION NTM 2004, San Diego, California, 26–28 January 2004.
  31. S. Khan, Optical Flow MATLAB Code, http://www.cs.ucf.edu/vision/public_html/source.html#Optical%20Flow .
  32. MATLAB Central, http://www.mathworks.com/matlabcentral/fileexchange/8761-block-matching-algorithms-for-motion-estimation.
  33. L.-M. Po and W.-C. Ma, “A novel four-step search algorithm for fast block motion estimation,” IEEE Trans. Circuits Syst. Video Technol. 6, 313–317 (1996). [CrossRef]
  34. M. Chi, D. Tran, and R. Etienne-Cummings, “Optical flow approximation of sub-pixel accurate block matching,” in IEEE Conference on Acoustics, Speech and Signal Processing (IEEE, 2007), pp. I-1017–I-1020.
  35. H. Chan, T. Vo, and Q. Nguyen, “Subpixel motion estimation without interpolation,” in IEEE Conference on Acoustics, Speech and Signal Processing (IEEE, 2010), pp. 722–725.
  36. FLIR Systems: http://www.flir.com/US .
