Unsupervised transfer learning for target detection from hyperspectral images
Abstract
Target detection has long been of great interest in hyperspectral image analysis, and feature extraction from target samples and their counterpart backgrounds is the key to the problem. Traditional target detection methods rely on a comparatively fixed feature representation for all the pixels under observation; RX, for example, applies the same distance measurement to every pixel. However, the best separation usually comes from certain targets and backgrounds: theoretically, the purest target and background pixels, or the endmembers that construct the subspace model. Training a concentrated subspace on the features of these most representative pixels is therefore expected to enhance the separability between targets and backgrounds. At the same time, transferring the discriminative information from such training data to the much larger body of test data, which lies in a different feature space and follows a different distribution, is a challenge. Here, the idea of transfer learning, borrowed from interactive video annotation, is employed. Within this transfer learning framework, several issues are addressed, and the proposed method is named unsupervised transfer learning based target detection (UTLD). First, extreme target and background pixels are generated by robust outlier detection, providing the target and background samples for transfer learning. Second, pixels are computed from the root points of a segmentation method so as to preserve the distribution of the backgrounds after dimensionality reduction. Third, a sparsity constraint is imposed on the transfer learning procedure; with this constraint, a simpler and more concentrated subspace with a clear physical meaning can be constructed. Extensive experiments show that the performance is comparable to state-of-the-art target detection methods.
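
The RX detector referenced above scores every pixel with one global Mahalanobis distance to the background statistics, which is exactly the fixed distance measurement the abstract contrasts against. As a point of reference for that baseline, here is a minimal NumPy sketch (the function name and the (rows, cols, bands) cube layout are illustrative assumptions, not taken from the paper):

import numpy as np

def rx_scores(cube):
    # Global RX: Mahalanobis distance of each pixel from the
    # background mean/covariance estimated over all pixels.
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    cov = centered.T @ centered / (pixels.shape[0] - 1)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse guards against a singular covariance
    # Row-wise quadratic form: centered[i] @ cov_inv @ centered[i]
    scores = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
    return scores.reshape(rows, cols)

Because the same mu and cov are applied to every pixel, the detector cannot adapt to locally representative targets and backgrounds, which motivates the subspace training described in the abstract.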
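
The abstract does not spell out the sparse formulation, but a sparsity constraint on a learned projection is commonly handled with proximal gradient steps, whose proximal operator is elementwise soft-thresholding. A generic sketch of that operator, under the assumption that the constraint is of L1 type (not the paper's exact algorithm):

import numpy as np

def soft_threshold(W, lam):
    # Proximal operator of lam * ||W||_1: shrink each entry toward
    # zero by lam and zero out entries whose magnitude is below lam.
    return np.sign(W) * np.maximum(np.abs(W) - lam, 0.0)

Applied after each gradient step on the transfer learning objective, this drives most entries of the projection matrix to zero, which is one way to obtain the simpler, more concentrated subspace the abstract describes.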
