Multi-sensor image fusion is the technique of combining heterogeneous images of the same scene obtained from different sensors. The objective of image fusion is to produce a single image that retains the best aspects of each input image. Desirable aspects include high spatial and high spectral resolution (multispectral and panchromatic satellite images), extended areas in focus (microscopy images), combined functional and anatomical information (medical images), complementary spectral information (optical and infrared images), and combined color and texture information (multispectral and synthetic aperture radar images). Image fusion can also provide some protection against illegal copying by embedding watermarks. All of the schemes reviewed here assume that the input images have been co-registered and resampled. The aim of this survey is to review publications on multi-sensor image fusion and to paint a comprehensive picture of its methods and applications. The paper serves as an introduction for those new to the field, an overview for those working in it, and a reference for those searching for literature on a specific application. Methods are classified according to the different aspects of multi-sensor image fusion.
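To make the satellite case concrete, the sketch below shows one of the simplest component-substitution fusion schemes, the Brovey transform, which injects the spatial detail of a high-resolution panchromatic image into upsampled multispectral bands. The synthetic arrays, the 4x replication upsampling, and the function name are illustrative assumptions, not part of the surveyed methods; real pipelines would use properly co-registered and resampled imagery.

```python
import numpy as np

def brovey_fuse(ms, pan):
    """Brovey-transform fusion (a simple component-substitution scheme).

    Each multispectral band is rescaled by the ratio of the panchromatic
    image to the multispectral intensity, injecting the pan spatial detail
    while preserving the spectral band ratios.
    ms:  (bands, H, W) multispectral cube, already upsampled to pan size
    pan: (H, W) panchromatic image
    """
    intensity = ms.mean(axis=0)                # intensity component of the MS cube
    return ms * (pan / intensity)[None, :, :]  # per-band detail injection

# Synthetic stand-in data (hypothetical, positive-valued to keep ratios defined)
rng = np.random.default_rng(0)
ms_low = rng.uniform(0.1, 1.0, size=(3, 8, 8))    # low-res 3-band multispectral
pan = rng.uniform(0.1, 1.0, size=(32, 32))        # high-res panchromatic
ms_up = np.kron(ms_low, np.ones((1, 4, 4)))       # naive 4x upsampling, standing in
                                                  # for co-registration/resampling
fused = brovey_fuse(ms_up, pan)
```

A characteristic property of this scheme is that the mean of the fused bands reproduces the panchromatic image exactly, which is also why the Brovey transform is known to distort spectral content when the pan and intensity images differ strongly.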
V. R. S. Mani, "A Survey of Multi Sensor Satellite Image Fusion Techniques," International Journal of Sensors and Sensor Networks, vol. 8, no. 1, 2020, pp. 1-10.