《Electronics Optics & Control》 2018-06

Fusion of Infrared and Visible Images Based on Sparse Representation and NSCT-PCNN

XIA Jing-ming, CHEN Yi-ming, CHEN Yi-cai, HE Kai (School of Electronics and Information Engineering, Nanjing University of Information Science and Technology; School of Mechanical Engineering, North China Electric Power University)
In view of the loss of detailed information caused by the wavelet transform, the non-sparsity of the low-frequency subband coefficients produced by the Non-Subsampled Contourlet Transform (NSCT), and the poor overall performance of infrared and visible image fusion, a fusion algorithm for infrared and visible images is proposed based on sparse representation, NSCT, and the Pulse Coupled Neural Network (PCNN). Firstly, the source image is decomposed by NSCT into low-frequency and high-frequency subbands. Secondly, the K-SVD (K-Singular Value Decomposition) algorithm is used to train a dictionary on the low-frequency subband, realizing its sparse representation and the fusion of the low-frequency sparse coefficients. Then, the spatial frequency of the high-frequency subband is used to stimulate the PCNN, and the coefficients with more firing times are selected as the fused high-frequency coefficients. Finally, the inverse NSCT is applied to the fused low- and high-frequency coefficients to obtain the fused image. Experimental results show that the proposed algorithm has a clear advantage in both subjective visual quality and objective index evaluation, and that its overall performance is superior to that of existing algorithms.
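The high-frequency fusion rule above is driven by the spatial frequency of each subband, a standard measure defined as SF = sqrt(RF^2 + CF^2), where RF and CF are the root-mean-square first differences along rows and columns. As a minimal sketch (the paper's NSCT and PCNN stages are not reproduced here; the function name and NumPy-based formulation are assumptions for illustration), the spatial frequency can be computed as:

```python
import numpy as np

def spatial_frequency(band: np.ndarray) -> float:
    """Spatial frequency SF = sqrt(RF^2 + CF^2) of a subband,
    where RF/CF are RMS first differences along rows/columns.
    A hypothetical helper; not the paper's reference code."""
    band = band.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(band, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(band, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

In the proposed scheme, this value serves as the external stimulus fed to each PCNN neuron; subbands with richer detail yield larger SF, fire more often, and are therefore selected as the fused high-frequency coefficients.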
【Fund】: National Natural Science Foundation of China (41505017)
【Category Index】: TP391.41