Journal of the Optical Society of America A | OPTICS, IMAGE SCIENCE, AND VISION

  • Vol. 18, Iss. 10 — Oct. 1, 2001
  • pp. 2468–2477

Fusion and merging of multispectral images with use of multiscale fundamental forms

Paul Scheunders and Steve De Backer


JOSA A, Vol. 18, Issue 10, pp. 2468-2477 (2001)
http://dx.doi.org/10.1364/JOSAA.18.002468




Abstract

A new wavelet representation for multispectral images is introduced, based on multiscale fundamental forms. The representation describes the gradient information of a multispectral image in a multiresolution framework and is particularly well suited for fusion and merging of multispectral images. A strategy is described for fusion and another for merging. Experiments are performed on multispectral images in which Landsat Thematic Mapper images are fused and merged with SPOT Panchromatic images, and the proposed techniques are compared with wavelet-based techniques described in the literature.
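The abstract does not spell out the construction itself. As background only, here is a minimal sketch of the classical single-scale first fundamental form of a multivalued image, which the multiscale fundamental forms named in the title generalize across scales; this is an assumption about the standard construction, not the authors' exact formulation. For an N-band image I(x, y) = (I_1, ..., I_N), the first fundamental form is the 2x2 matrix built from the per-band spatial gradients:

% First fundamental form (Di Zenzo-type structure tensor) of an N-band image
\[
G(x,y) =
\begin{pmatrix}
\displaystyle\sum_{k=1}^{N} \Bigl(\frac{\partial I_k}{\partial x}\Bigr)^{2} &
\displaystyle\sum_{k=1}^{N} \frac{\partial I_k}{\partial x}\,\frac{\partial I_k}{\partial y} \\[6pt]
\displaystyle\sum_{k=1}^{N} \frac{\partial I_k}{\partial x}\,\frac{\partial I_k}{\partial y} &
\displaystyle\sum_{k=1}^{N} \Bigl(\frac{\partial I_k}{\partial y}\Bigr)^{2}
\end{pmatrix}.
\]

The largest eigenvalue of G combines the edge strength of all bands, and the corresponding eigenvector gives the local edge orientation. A multiscale version of this matrix can presumably be obtained by replacing the partial derivatives with wavelet detail coefficients at each scale, which is the kind of quantity a fusion or merging rule can then act on.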

© 2001 Optical Society of America

OCIS Codes
(100.2000) Image processing : Digital image processing
(100.2960) Image processing : Image analysis
(100.2980) Image processing : Image enhancement
(100.7410) Image processing : Wavelets

Citation
Paul Scheunders and Steve De Backer, "Fusion and merging of multispectral images with use of multiscale fundamental forms," J. Opt. Soc. Am. A 18, 2468-2477 (2001)
http://www.opticsinfobase.org/josaa/abstract.cfm?URI=josaa-18-10-2468



