
Multi-Source Remote Sensing Image Data Fusion

Sun Jiabing, Liu Jilin, Li Jun (School of Informatics, Wuhan Technical University of Surveying and Mapping, 430079)
Remote sensing images of the same area acquired from different data sources can be fused to enrich the information about the areas of interest. Fusing images from widely separated bands of the electromagnetic spectrum (such as optical and radar data) provides information beyond what each single sensor offers separately, so more accurate classification can be achieved. Fusing high spatial resolution data (such as a panchromatic air photo) with data of lower spatial but higher spectral resolution (such as LANDSAT TM) can improve image sharpness, enhance feature extraction and visual interpretation, and support object change detection. In remote sensing, image fusion is currently performed at three basic levels: 1. pixel-level fusion, 2. feature-level fusion, 3. decision-level fusion. In this paper three methods of multi-source image fusion are discussed: pixel-based weighted fusion, feature fusion based on the wavelet transform, and class-separated fusion based on the Bayes rule.
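The abstract does not give the paper's actual weighting scheme, so the following is only a minimal sketch of the pixel-level approach: a per-pixel weighted sum of two co-registered bands (for example, a panchromatic air photo and a resampled LANDSAT TM band). The weights and array shapes are illustrative assumptions, not values from the paper.

```python
import numpy as np

def pixel_weighted_fusion(pan, ms_band, w_pan=0.6, w_ms=0.4):
    """Fuse a high-resolution panchromatic band with a co-registered,
    resampled multispectral band by a per-pixel weighted sum.

    Both inputs are assumed to be 2-D arrays of the same shape;
    the weights are illustrative, not taken from the paper.
    """
    pan = pan.astype(np.float64)
    ms = ms_band.astype(np.float64)
    fused = w_pan * pan + w_ms * ms
    # Rescale the result back to the 8-bit range commonly used for display.
    lo, hi = fused.min(), fused.max()
    fused = 255.0 * (fused - lo) / (hi - lo + 1e-12)
    return fused.astype(np.uint8)

if __name__ == "__main__":
    # Synthetic stand-ins for a panchromatic photo and a resampled TM band.
    rng = np.random.default_rng(0)
    pan = rng.integers(0, 256, size=(512, 512))
    tm_band = rng.integers(0, 256, size=(512, 512))
    print(pixel_weighted_fusion(pan, tm_band).shape)
```

Feature-level (e.g. wavelet-based) and decision-level (Bayes rule) fusion operate on extracted coefficients and per-class posterior probabilities respectively, rather than directly on pixel values as above.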