Research on Hyperspectral Remote Sensing Image Classification Based on Support Vector Machines
Abstract
With the rapid development of hyperspectral remote sensing data acquisition technology, research on methods for processing and analyzing hyperspectral data has become the most important driver of its applications. As the most effective method derived from statistical learning theory, the Support Vector Machine (SVM) handles small training samples, high-dimensional spaces, and nonlinear problems well; it shows clear advantages in high-dimensional classification, small-sample learning, and noise robustness, and has therefore become a research focus in hyperspectral remote sensing image classification. Starting from statistical learning theory, and building on a detailed study of SVM theory, this dissertation improves SVM in terms of multi-class classifier design, kernel function construction, and composite kernels, and applies these improvements to the classification of hyperspectral remote sensing data. The main conclusions are as follows:
     1) On the basis of an SVM-based technical framework for hyperspectral image classification, the four commonly used SVM kernel functions were analyzed and compared, and SVM was compared with several representative classification methods. The results show that SVM effectively overcomes the Hughes phenomenon caused by limited samples and high feature dimensionality; its training and classification speed are both better than those of the radial basis function neural network, and its accuracy exceeds that of conventional classifiers such as the minimum distance classifier, the spectral angle mapper, the maximum likelihood classifier, and the radial basis function neural network. Among the four common kernels, the RBF kernel achieves the highest classification accuracy.
     2) Dimensionality reduction and feature extraction were performed with principal component analysis (PCA), independent component analysis, maximum noise fraction, grouped feature extraction, and derivative spectra, and the resulting features were used as SVM input. The results show that SVM accuracy fluctuates noticeably with feature dimensionality, and that features extracted by PCA usually yield the highest accuracy. Based on extensive experiments, PCA is recommended as the first choice for classification-oriented dimensionality reduction of hyperspectral data.
     3) After an in-depth study of multi-class SVM construction methods, a binary-tree multi-class SVM classifier based on a separability measure was proposed. Experiments show that multi-class SVMs outperform conventional classification algorithms, and that among existing multi-class SVM methods the proposed separability-based binary-tree SVM achieves the highest accuracy.
     4) Given the importance of kernel construction in SVM classification, a wavelet kernel in a reproducing kernel Hilbert space was proposed and a wavelet SVM was constructed. The wavelet SVM achieves its highest accuracy with the Coiflet wavelet kernel, outperforming the spectral angle mapper, the minimum distance classifier, and the RBF-kernel SVM.
     5) Conventional remote sensing classifiers are often limited when combining spectral and spatial features. A multiple kernel classifier, which applies separate kernels to the spectral and spatial features and combines them, can exploit both to improve classification. Texture features were extracted with the wavelet transform, independent kernels were applied to the texture and spectral features, and a multiple kernel SVM classifier was constructed. Experiments show that the multiple kernel SVM combining spectral and wavelet texture information is more accurate than the single-kernel SVM and the cross-information kernel SVM. Combining the multiple kernel SVM with feature extraction, the highest accuracy was obtained when the first four principal components and the wavelet texture features were used together as input, making this a fast and effective classification method.
     6) Building on the multiple kernel SVM, a multiple kernel SVM classification model for hyperspectral data with high spatial resolution was proposed. The method extracts spatial structure information with mathematical morphology and combines it with the spectral information for classification. The results show that the multiple kernel SVM fusing spectral information with morphological profiles outperforms the other methods; in the experiments, combining the first seven principal components with the morphological profile achieved a classification accuracy of 91.0%.
With the rapid development of hyperspectral remote sensing data acquisition technology, high-performance processing and analysis of hyperspectral data has become the most important research direction. As the most effective algorithm of Statistical Learning Theory (SLT), the Support Vector Machine (SVM) performs well with small training samples, high-dimensional spaces, and nonlinear problems, and because of these advantages it has become a hot topic in hyperspectral remote sensing image classification. In this dissertation, SVM theory and its improvements were studied in detail on the basis of statistical learning theory, and several improved SVM algorithms (multi-class SVM design, wavelet kernel SVM, and multiple kernel functions) were successfully applied to hyperspectral remote sensing data classification. The main results are as follows:
     Four common kernel functions of SVM were analyzed and compared, and SVM was compared with typical classifiers such as the Minimum Distance Classifier (MDC), the Radial Basis Function Neural Network (RBFNN) classifier, the Spectral Angle Mapper (SAM), and the Maximum Likelihood Classifier (MLC). SVM effectively overcomes the Hughes phenomenon under inadequate samples. Overall, the training and classification speed of SVM is faster than that of the RBFNN, and the accuracy of the SVM classifiers is higher than that of the MDC, RBFNN, SAM, and MLC classifiers.
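The kernel comparison above can be sketched with scikit-learn; this is a minimal illustration on synthetic data, not the dissertation's own implementation or hyperspectral scenes:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for pixel spectra: 60 "bands", a modest number of
# samples, mimicking the small-sample / high-dimension hyperspectral setting.
X, y = make_classification(n_samples=300, n_features=60, n_informative=20,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# The four common SVM kernels compared in the experiments.
scores = {}
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, C=10, gamma="scale").fit(X_tr, y_tr)
    scores[kernel] = clf.score(X_te, y_te)
print(scores)
```

On real hyperspectral benchmarks the dissertation reports the RBF kernel as the most accurate of the four.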
     Experiments were carried out with feature extraction algorithms including Principal Component Analysis (PCA), Maximum Noise Fraction (MNF), Independent Component Analysis (ICA), feature extraction after correlation-coefficient grouping, and derivative spectral analysis. The results indicate that SVM accuracy fluctuates with feature dimensionality, and that PCA achieves the best accuracy for SVM classification. According to the experiments, PCA is an effective choice for hyperspectral feature extraction prior to classification.
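The PCA-then-SVM workflow, with accuracy tracked against the number of retained components, can be sketched as follows (synthetic data; the feature counts are illustrative only):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic 100-band "spectra"; PCA reduces the bands before the SVM,
# as in the dimensionality-reduction experiments.
X, y = make_classification(n_samples=400, n_features=100, n_informative=15,
                           n_classes=3, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=1)

# Accuracy as a function of the number of retained principal components.
acc = {}
for n in (2, 5, 10, 20):
    model = make_pipeline(PCA(n_components=n), SVC(kernel="rbf", gamma="scale"))
    acc[n] = model.fit(X_tr, y_tr).score(X_te, y_te)
print(acc)
```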
     Based on SVM theory and the separability measure of hyperspectral data, a novel binary-tree multi-class SVM classifier based on between-class separability was put forward. The results indicate that the novel binary-tree classifier is more accurate than the other multi-class SVM classifiers and some traditional classifiers (SAM and MDC).
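The abstract does not specify which separability measure is used, but the Jeffries-Matusita (JM) distance between Gaussian class models is a common choice in hyperspectral work; a sketch of how such a measure could rank class pairs when building the binary tree:

```python
import numpy as np

def jeffries_matusita(X1, X2):
    """JM separability between two classes, assuming Gaussian distributions.

    Ranges from 0 (inseparable) to 2 (fully separable); classes with the
    highest separability would be split off first in a binary-tree SVM.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    S = (S1 + S2) / 2.0
    d = m1 - m2
    # Bhattacharyya distance between the two Gaussian class models.
    b = (d @ np.linalg.solve(S, d)) / 8.0 + 0.5 * np.log(
        np.linalg.det(S) / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return 2.0 * (1.0 - np.exp(-b))

rng = np.random.default_rng(0)
close = rng.normal(0.0, 1.0, (200, 5))
near  = rng.normal(0.5, 1.0, (200, 5))
far   = rng.normal(5.0, 1.0, (200, 5))

# The well-separated pair scores near 2, the overlapping pair much lower.
print(jeffries_matusita(close, near), jeffries_matusita(close, far))
```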
     Based on a study of SVM theory in reproducing kernel Hilbert space (RKHS) and wavelet analysis, a wavelet SVM (WSVM) classifier built on wavelet kernel functions was constructed. In the experiments, the WSVM classifier gave its most accurate results with the Coiflet wavelet kernel function. Compared with the traditional classifiers (SAM and MDC) and the classic radial basis function kernel SVM, the wavelet kernel SVM classifier is the most accurate.
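A wavelet kernel is typically built as a tensor product of a mother wavelet evaluated on coordinate differences. The Coiflet wavelet used in the dissertation has no simple closed form, so this sketch uses the Morlet wavelet kernel (a standard admissible example) to illustrate the construction, plugged into an SVM as a custom kernel:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def morlet_wavelet_kernel(a=1.0):
    """Translation-invariant wavelet kernel K(x, z) = prod_i h((x_i - z_i)/a),
    with the Morlet mother wavelet h(u) = cos(1.75 u) exp(-u^2 / 2)."""
    def kernel(X, Z):
        u = (X[:, None, :] - Z[None, :, :]) / a  # pairwise per-dimension gaps
        return np.prod(np.cos(1.75 * u) * np.exp(-u ** 2 / 2.0), axis=2)
    return kernel

X, y = make_classification(n_samples=200, n_features=10, n_informative=6,
                           random_state=2)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=2)

# scikit-learn accepts a callable kernel that returns the Gram matrix.
wsvm = SVC(kernel=morlet_wavelet_kernel(a=4.0)).fit(X_tr, y_tr)
print(wsvm.score(X_te, y_te))
```

The dilation parameter `a` plays the role that gamma plays for the RBF kernel and is chosen here arbitrarily.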
     Remote sensing image classifiers are usually limited in their ability to combine spectral features with spatial features. Multiple kernel classifiers, however, can integrate spectral features with spatial or structural features by applying separate kernels to each and summing them for the final output. The results show that more accurate classifications are obtained by integrating the spectral and wavelet texture information with multiple kernel SVM classifiers. Moreover, when the multiple kernel SVM classifier is used, the highest accuracy is provided by combining the first four principal components from PCA with the textural features.
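The weighted-sum composite kernel described above can be sketched with a precomputed Gram matrix; the "spectral" and "spatial" feature blocks here are synthetic placeholders for band values and wavelet-texture features:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n = 200
y = rng.integers(0, 2, n)
# Hypothetical split of each pixel's features: spectral bands and spatial
# (e.g. wavelet-texture) features, generated synthetically for illustration.
spectral = rng.normal(y[:, None] * 1.0, 1.0, (n, 30))
spatial  = rng.normal(y[:, None] * 1.0, 1.0, (n, 8))

def composite_gram(spec_a, spat_a, spec_b, spat_b, mu=0.5):
    """Weighted-sum composite kernel: mu*K_spectral + (1-mu)*K_spatial."""
    return (mu * rbf_kernel(spec_a, spec_b)
            + (1.0 - mu) * rbf_kernel(spat_a, spat_b))

tr, te = slice(0, 120), slice(120, n)
K_tr = composite_gram(spectral[tr], spatial[tr], spectral[tr], spatial[tr])
K_te = composite_gram(spectral[te], spatial[te], spectral[tr], spatial[tr])

clf = SVC(kernel="precomputed").fit(K_tr, y[tr])
print(clf.score(K_te, y[te]))
```

Because a convex combination of positive semi-definite kernels is itself a valid kernel, the weight `mu` can be tuned to balance spectral against spatial evidence.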
     Furthermore, a method is proposed for classifying hyperspectral data with high spatial resolution using SVMs with multiple kernels. In this approach, the morphological profile (MP) supplies spatial structure information that is combined with the spectral features for classification. The results show that by integrating the spectral and MP features, the multiple kernel SVM classifiers obtain more accurate results than the single-kernel SVM classifier. Moreover, when the multiple kernel SVM classifier is used, combining the first seven principal components derived from Principal Component Analysis (PCA) with the MP provides the highest accuracy (91.0%).
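A morphological profile stacks openings and closings of increasing structuring-element size around a base band (typically a principal component image); a minimal sketch with SciPy's grey-scale morphology, on a random image standing in for a real scene:

```python
import numpy as np
from scipy import ndimage

def morphological_profile(image, sizes=(3, 5, 7)):
    """Stack openings and closings of increasing structuring-element size
    around the original band: the morphological profile."""
    layers = [ndimage.grey_opening(image, size=s) for s in reversed(sizes)]
    layers.append(image)
    layers += [ndimage.grey_closing(image, size=s) for s in sizes]
    return np.stack(layers, axis=0)

rng = np.random.default_rng(4)
# Stand-in for a principal-component image of a hyperspectral scene.
pc1 = rng.random((64, 64))
mp = morphological_profile(pc1)
print(mp.shape)  # 3 openings + original + 3 closings = 7 layers
```

In the dissertation's model the resulting MP layers feed the spatial kernel of the multiple kernel SVM, while the spectral bands (or their principal components) feed the spectral kernel.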
