OSA's Digital Library

Applied Optics

  • Editor: Joseph N. Mait
  • Vol. 53, Iss. 13 — May 1, 2014
  • pp: C32–C44

Enhancement of imagery of objects with highly dynamic brightness and large rotational motion

Andrey V. Kanaev, Christopher W. Miller, Collin J. Seanor, and Jeremy Murray-Krezan

Applied Optics, Vol. 53, Issue 13, pp. C32-C44 (2014)

We report on the application of multi-frame super-resolution (SR) to sampling-limited imagery that models space objects (SOs). Multi-frame processing of SO imagery is complicated by abrupt illumination changes and complex in-scene SO motion, conditions that degrade the accuracy of the motion estimation required for resolution enhancement. We analyze the motion estimation errors using an optical flow (OF) interpolation error metric and show how object tracking accuracy depends on brightness changes and on the pixel displacements between subsequent images. Despite these motion estimation inaccuracies, we demonstrate spatial acuity enhancement of the pixel-limited resolution of model SO motion imagery by applying an SR algorithm that accounts for OF errors. Image resolution improvement is assessed both by visual inspection and quantitatively; a 1.8× resolution enhancement is demonstrated.
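The core idea of the abstract — that frame fusion should discount contributions where optical-flow registration is unreliable, e.g. under abrupt illumination change — can be illustrated with a minimal NumPy sketch. This is not the authors' algorithm: the brightness-constancy residual as a confidence measure, the Gaussian weighting, and all function names are illustrative assumptions.

```python
import numpy as np

def of_residual(ref, warped):
    """Brightness-constancy residual between the reference frame and a
    motion-compensated (warped) frame. Large values flag unreliable
    optical flow, e.g. under an abrupt illumination change (illustrative)."""
    return np.abs(ref - warped)

def confidence_weighted_fusion(frames, residuals, sigma=0.1):
    """Fuse registered frames, down-weighting pixels whose flow residual
    is high -- a simplified stand-in for SR that accounts for OF errors."""
    f = np.stack(frames)                       # (N, H, W)
    r = np.stack(residuals)                    # (N, H, W)
    w = np.exp(-(r / sigma) ** 2 / 2)          # Gaussian confidence weights
    w /= w.sum(axis=0, keepdims=True) + 1e-12  # normalize per pixel
    return (f * w).sum(axis=0)

# usage: two well-registered frames plus one corrupted by a brightness jump
rng = np.random.default_rng(0)
ref = rng.random((8, 8))
frames = [ref, ref, ref + 0.5]
residuals = [of_residual(ref, f) for f in frames]
fused = confidence_weighted_fusion(frames, residuals)
```

In this toy case the corrupted frame receives near-zero weight, so the fused result stays close to the reference; a real SR pipeline would fuse onto an upsampled grid after subpixel OF warping.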

© 2014 Optical Society of America

OCIS Codes
(100.0100) Image processing : Image processing
(100.2000) Image processing : Digital image processing
(100.2980) Image processing : Image enhancement
(100.3020) Image processing : Image reconstruction-restoration
(100.6640) Image processing : Superresolution
(150.4620) Machine vision : Optical flow

Original Manuscript: December 5, 2013
Revised Manuscript: March 14, 2014
Manuscript Accepted: March 19, 2014
Published: April 17, 2014

Andrey V. Kanaev, Christopher W. Miller, Collin J. Seanor, and Jeremy Murray-Krezan, "Enhancement of imagery of objects with highly dynamic brightness and large rotational motion," Appl. Opt. 53, C32-C44 (2014)



Supplementary Material

Media 1: AVI (5374 KB)
