形态学滤波和改进PCNN的NSST域多光谱与全色图像融合
Fusion of multispectral and panchromatic images via morphological filter and improved PCNN in NSST domain
2019年24卷第3期 页码:435-446
收稿:2018-06-28
修回:2018-08-31
纸质出版:2019-03-16
DOI: 10.11834/jig.180399
目的
全色图像的空间细节信息增强和多光谱图像的光谱信息保持通常是相互矛盾的,如何能够在这对矛盾中实现最佳融合效果一直以来都是遥感图像融合领域的研究热点与难点。为了有效结合光谱信息与空间细节信息,进一步改善多光谱与全色图像的融合质量,提出一种形态学滤波和改进脉冲耦合神经网络(PCNN)的非下采样剪切波变换(NSST)域多光谱与全色图像融合方法。
方法
该方法首先分别对多光谱和全色图像进行非下采样剪切波变换;对二者的低频分量采用形态学滤波和高通调制框架(HPM)进行融合,将全色图像低频子带的细节信息注入到多光谱图像低频子带中得到融合后的低频子带;对二者的高频分量则采用改进脉冲耦合神经网络的方法进行融合,进一步增强融合图像中的空间细节信息;最后通过NSST逆变换得到融合图像。
结果
仿真实验表明,本文方法得到的融合图像细节信息清晰且光谱保真度高,视觉效果上优势明显,且各项评价指标与其他方法相比整体上较优。相比于5种方法中3组融合结果各指标平均值中的最优值,清晰度和空间频率分别比NSCT-PCNN方法提高0.5%和1.0%,光谱扭曲度比NSST-PCNN方法降低4.2%,相关系数比NSST-PCNN方法提高1.4%,信息熵仅比NSST-PCNN方法低0.08%。相关系数和光谱扭曲度两项指标的评价结果表明本文方法相比于其他5种方法能够更好地保持光谱信息,清晰度和空间频率两项指标的评价结果则展示了本文方法具有优于其他对比方法的空间细节注入能力,信息熵指标虽不是最优值,但与最优值非常接近。
结论
分析视觉效果及各项客观评价指标可以看出,本文方法在提高融合图像空间分辨率的同时,很好地保持了光谱信息。综合来看,本文方法在主观与客观方面均具有优于亮度色调饱和度(IHS)法、主成分分析(PCA)法、基于非负矩阵分解(CNMF)、基于非下采样轮廓波变换和脉冲耦合神经网络(NSCT-PCNN)以及基于非下采样剪切波变换和脉冲耦合神经网络(NSST-PCNN)5种经典及现有流行方法的融合效果。
Objective
Various remote sensing sensors presently exist, and multisource remote sensing images, such as multispectral (MS) and panchromatic (PAN) images, can be acquired. MS images, which have rich spectral information but low spatial resolution, often cannot meet the demands of remote sensing applications. Correspondingly, PAN images have more spatial detail and higher spatial resolution. The significance of MS and PAN image fusion is that it improves the spatial resolution of MS images while maintaining the original spectral information. It combines the target shape and structural characteristics of PAN images with the spectral information of MS images to provide strong interpretation capability and reliable results, and it enhances the precision of object classification and identification. However, the spatial detail enhancement contributed by PAN images and the spectral information maintenance of MS images are usually contradictory. How to achieve high fusion performance under this contradiction has long been a popular and difficult topic in the field of remote sensing image fusion, with broad prospects in research and application. In this study, a fusion method based on a morphological filter and an improved pulse-coupled neural network (PCNN) in the non-subsampled shearlet transform (NSST) domain is proposed to improve the fusion quality of MS and PAN images by efficiently combining spectral information with spatial details.
Method
The proposed method is conducted on MS and PAN images that have been accurately registered. First, the PAN and MS images are decomposed by NSST to obtain low- and high-frequency sub-band coefficients. Second, because the low-frequency sub-bands are approximations of the original images that inherit their overall characteristics yet still retain some edge and detail information, a fusion rule for the low-frequency coefficients based on morphological filtering and the high-pass modulation (HPM) scheme is proposed. The morphological half-gradient operator is used to extract the details of the low-frequency sub-band of the PAN image, owing to its encouraging preliminary fusion results on remote sensing images. The low-resolution PAN sub-band image is obtained by morphological filtering, and the detail PAN sub-band image is estimated by subtracting the low-resolution PAN sub-band image from the PAN sub-band image that has been histogram-matched to the MS sub-band image. The spatial details are then injected into the low-frequency sub-band of the MS image through the HPM scheme, as sketched below.
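As an illustration of this low-frequency rule, the following is a minimal Python sketch of a morphological low-pass filter followed by HPM injection. The opening-closing filter, the mean/variance histogram matching, and the helper names match_histogram and fuse_lowpass_hpm are illustrative assumptions, not the exact operators used in the paper.

```python
import numpy as np
from scipy.ndimage import grey_closing, grey_opening


def match_histogram(pan, ms):
    # Hypothetical helper: mean/variance matching of the PAN sub-band to the MS sub-band.
    return (pan - pan.mean()) * (ms.std() / (pan.std() + 1e-12)) + ms.mean()


def fuse_lowpass_hpm(ms_low, pan_low, size=3):
    # Match the PAN low-frequency sub-band to the MS low-frequency sub-band.
    pan_eq = match_histogram(pan_low, ms_low)
    # Morphological low-pass: average of grey-scale opening and closing
    # (one plausible stand-in for the half-gradient based filter of the paper).
    footprint = np.ones((size, size))
    pan_lp = 0.5 * (grey_opening(pan_eq, footprint=footprint)
                    + grey_closing(pan_eq, footprint=footprint))
    # HPM injection: modulate the MS sub-band by the PAN detail ratio,
    # i.e. ms_low + (ms_low / pan_lp) * (pan_eq - pan_lp).
    return ms_low * pan_eq / (pan_lp + 1e-12)
```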
For the fusion of the high-frequency sub-bands, an improved PCNN is adopted to further enhance the spatial detail information. Existing PCNN models usually adopt a hard-limiting step function as the output, so the firing output is either 0 or 1, which cannot efficiently reflect the amplitude differences of the synchronous pulse excitation. Therefore, a soft-limiting sigmoid function is adopted to calculate the firing output amplitude during the iterations, and the decision matrix for high-frequency coefficient selection is obtained by summing the firing output amplitudes over the iterative process. Finally, the fused low- and high-frequency coefficients are reconstructed with the inverse NSST to obtain the final fused image.
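A minimal sketch of this improved PCNN rule follows. The simplified linking field, the 3 x 3 weight kernel, the normalization of the feeding input, and the parameter values (alpha_l, alpha_e, beta, v_l, v_e, iterations) are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np
from scipy.ndimage import convolve


def pcnn_fire_map(coeff, iterations=100, alpha_l=0.1, alpha_e=0.2,
                  beta=0.2, v_l=1.0, v_e=20.0):
    # Accumulated soft firing amplitude of a simplified PCNN driven by the
    # absolute value of one high-frequency sub-band (parameters are illustrative).
    f = np.abs(coeff)
    f = f / (f.max() + 1e-12)      # normalize the feeding input to [0, 1]
    l = np.zeros_like(f)           # linking input
    e = np.ones_like(f)            # dynamic threshold
    y = np.zeros_like(f)           # firing output of the previous iteration
    total = np.zeros_like(f)       # accumulated firing amplitude (decision basis)
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])
    for _ in range(iterations):
        l = np.exp(-alpha_l) * l + v_l * convolve(y, kernel, mode='nearest')
        u = f * (1.0 + beta * l)                  # internal activity
        y = 1.0 / (1.0 + np.exp(-(u - e)))        # soft-limiting sigmoid output
        e = np.exp(-alpha_e) * e + v_e * y        # threshold decays, then rises after firing
        total += y
    return total


def fuse_highpass(ms_high, pan_high):
    # Keep each coefficient from the source whose neuron fired more strongly overall.
    take_pan = pcnn_fire_map(pan_high) >= pcnn_fire_map(ms_high)
    return np.where(take_pan, pan_high, ms_high)
```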
Result
A series of simulation experiments is conducted to verify the superiority and validity of the proposed fusion method. Three groups of QuickBird remote sensing images are used for testing. The performance of the fusion methods is evaluated by both subjective visual inspection and objective indicators; visual analysis is the most immediate form of assessment. Five objective evaluation indicators, namely, image clarity, information entropy, correlation coefficient, spatial frequency, and spectral distortion, are selected to evaluate the fusion results quantitatively and objectively. The experimental results show that the proposed method has obvious advantages in fusion effect. Its subjective visual quality is clearly better than those of the other five methods: details such as image textures and edges are clear, and the spectral information is maintained efficiently. The proposed method also performs well on the objective evaluation indicators. The average values of the five indicators over the three bands are calculated, and four of them are the best among the compared methods. The average values over the three groups of images are calculated as well. Compared with the best values achieved by the other five methods, the image clarity and spatial frequency of our method are improved by 0.5% and 1.0%, respectively, over the NSCT-PCNN method; the spectral distortion is 4.2% lower than that of the NSST-PCNN method; the correlation coefficient is 1.4% higher than that of NSST-PCNN; and the information entropy is only 0.08% lower than the best value, obtained by NSST-PCNN. The results for the correlation coefficient and spectral distortion demonstrate that the proposed method maintains spectral information better than the other five methods. The results for image clarity and spatial frequency show that the proposed method has an excellent capability of detail injection; only the image clarity of the B band in group 2 is relatively poor. The information entropy is close to the best result.
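For reference, minimal Python sketches of the five indicators under their common definitions are given below; the exact formulas and normalizations used in the experiments may differ.

```python
import numpy as np


def spatial_frequency(img):
    # Spatial frequency: RMS of row-wise and column-wise first differences.
    img = np.asarray(img, dtype=float)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)


def image_clarity(img):
    # Image clarity (average gradient) over the interior pixels.
    img = np.asarray(img, dtype=float)
    dx = np.diff(img, axis=1)[:-1, :]
    dy = np.diff(img, axis=0)[:, :-1]
    return np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0))


def correlation_coefficient(fused, reference):
    # Pearson correlation between the fused band and the reference MS band.
    return np.corrcoef(np.ravel(fused), np.ravel(reference))[0, 1]


def spectral_distortion(fused, reference):
    # Mean absolute deviation from the reference MS band (one common definition).
    return np.mean(np.abs(np.asarray(fused, float) - np.asarray(reference, float)))


def information_entropy(img, bins=256):
    # Shannon entropy of the grey-level histogram, assuming 8-bit grey levels.
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))
```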
Conclusion
A fusion method for MS and PAN images based on morphological operators and an improved PCNN in the NSST domain is proposed. Fusion rules are presented for the different frequency bands obtained from the NSST decomposition of the original MS and PAN images: a low-frequency coefficient fusion rule based on morphological half-gradient filtering and the HPM scheme, and a high-frequency coefficient fusion rule based on the improved PCNN. A real satellite dataset is employed to evaluate the performance of the proposed method. The analysis indicates that our method improves the spatial resolution of the fusion results while maintaining their spectral information. In general, the proposed method is superior to the traditional methods and to some currently popular fusion methods in terms of both visual effect and objective indicators.
Carper W J, Lillesand T M, Kiefer R W. The use of intensity-hue-saturation transformations for merging SPOT panchromatic and multispectral image data[J]. Photogrammetric Engineering and Remote Sensing, 1990, 56(4):459-467.
Pohl C, Van Genderen J L. Review article: multisensor image fusion in remote sensing: concepts, methods and applications[J]. International Journal of Remote Sensing, 1998, 19(5):823-854.[DOI:10.1080/014311698215748]
Laben C A, Brower B V. Process for enhancing the spatial resolution of multispectral imagery using pan-sharpening: US, 6011875[P]. 2000-01-04.
Zhou J, Civco D L, Silander J A. A wavelet transform method to merge Landsat TM and SPOT panchromatic data[J]. International Journal of Remote Sensing, 1998, 19(4):743-757.[DOI:10.1080/014311698215973]
Wu Y Q, Wang Z L. Multispectral and panchromatic image fusion using chaotic Bee Colony optimization in NSST domain[J]. Journal of Remote Sensing, 2017, 21(4):549-557.[DOI:10.11834/jrs.20176273]
Overturf L A, Comer M L, Delp E J. Color image coding using morphological pyramid decomposition[J]. IEEE Transactions on Image Processing, 1995, 4(2):177-185.[DOI:10.1109/83.342191]
Mukhopadhyay S, Chanda B. Fusion of 2D grayscale images using multiscale morphology[J]. Pattern Recognition, 2001, 34(10):1939-1949.[DOI:10.1016/S0031-3203(00)00123-0]
Laporterie F, Amram O, Flouzat G E, et al. Data fusion thanks to an improved morphological pyramid approach: comparison loop on simulated images and application to SPOT 4 data[C]//IEEE 2000 International Geoscience and Remote Sensing Symposium. Taking the Pulse of the Planet: The Role of Remote Sensing in Managing the Environment. Honolulu, HI, USA: IEEE, 2000: 2117-2119.[DOI:10.1109/IGARSS.2000.858314]
Bejinariu S I, Rotaru F, Niţă C D, et al. Morphological wavelets for panchromatic and multispectral image fusion[C]//Proceedings of the 5th International Workshop Soft Computing Applications. Berlin, Heidelberg: Springer, 2013: 573-583.[DOI:10.1007/978-3-642-33941-7_50]
Restaino R, Vivone G, Mura M D, et al. Fusion of multispectral and panchromatic images based on morphological operators[J]. IEEE Transactions on Image Processing, 2016, 25(6):2882-2895.[DOI:10.1109/TIP.2016.2556944]
Guo K H, Labate D. Optimally sparse multidimensional representation using shearlets[J]. SIAM Journal on Mathematical Analysis, 2007, 39(1):298-318.[DOI:10.1137/060649781]
Kutyniok G, Labate D. Shearlets:Multiscale Analysis for Multivariate Data[M]. Birkhäuser Basel:Springer, 2012.[DOI:10.1007/978-0-8176-8316-0]
Easley G, Labate D, Lim W Q. Sparse directional image representations using the discrete shearlet transform[J]. Applied and Computational Harmonic Analysis, 2008, 25(1):25-46.[DOI:10.1016/j.acha.2007.09.003]
Blasch E P. Biological information fusion using a PCNN and belief filtering[C]//Proceedings of the International Joint Conference on Neural Networks. Washington, DC, USA: IEEE, 1999, 4: 2792-2795.[DOI:10.1109/IJCNN.1999.833523]
Soille P. Morphological Image Analysis: Principles and Applications[M]. Berlin, Germany: Springer, 2003:84-87.[DOI:10.1007/978-3-662-05088-0]
Vivone G, Restaino R, Mura M D, et al. Contrast and error-based fusion schemes for multispectral image pansharpening[J]. IEEE Geoscience and Remote Sensing Letters, 2014, 11(5):930-934.[DOI:10.1109/LGRS.2013.2281996]
Peli E. Contrast in complex images[J]. Journal of the Optical Society of America A, 1990, 7(10):2032-2040.[DOI:10.1364/JOSAA.7.002032]
Liao Y, Huang W L, Shang L, et al. Image fusion based on shearlet and improved PCNN[J]. Computer Engineering and Applications, 2014, 50(2):142-146.[DOI:10.3778/j.issn.1002-8331.1207-0258]
Wang Z N, Yu X C, Zhang L B. A remote sensing image fusion algorithm based on non-negative matrix factorization[J]. Journal of Beijing Normal University (Natural Science), 2008, 44(4):387-390.[DOI:10.3321/j.issn:0476-0301.2008.04.012]
Qu X B, Yan J W, Xiao H Z, et al. Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled Contourlet transform domain[J]. Acta Automatica Sinica, 2008, 34(12):1508-1514.[DOI:10.3724/SP.J.1004.2008.01508]
Jiang P, Zhang Q, Li J, et al. Fusion algorithm for infrared and visible image based on NSST and adaptive PCNN[J]. Laser & Infrared, 2014, 44(1):108-113.[DOI:10.3969/j.issn.1001-5078.2014.01.024]
Stathaki T. Image Fusion: Algorithms and Applications[M]. Amsterdam: Academic Press, 2008:367-392.[DOI:10.1108/sr.2009.08729cae.001]