Research on Feature Extraction and Classification Techniques of Hyperspectral Imagery for Target Detection
Abstract
Hyperspectral remote sensing organically combines spectra, which convey the radiometric properties of targets, with imagery, which conveys their spatial and geometric relationships. In theory, such data are well suited to probing the physical and chemical properties of targets and to resolving subtle differences between them. Conventional processing methods developed for panchromatic and multispectral imagery, however, cannot meet the information-extraction demands of hyperspectral imagery; faced with such massive volumes of spectral image data, how to extract the information of interest is a problem that must be solved.
     Carried out under a project of the National 863-708 Program, this dissertation investigates feature extraction and classification of hyperspectral imagery for target detection. Taking hyperspectral target detection as its main thread, it focuses on noise filtering, small-target detection, small-sample classification, and nonlinear feature extraction. The main original contributions are as follows:
     1. The structural characteristics of hyperspectral imagery and their influence on target detection are systematically summarized and analyzed, and the factors and issues that must be considered when studying hyperspectral feature extraction and target detection are discussed;
     2. Noise filtering of hyperspectral imagery is studied in depth. Adaptive filtering methods based on cubic smoothing splines and on wavelets are implemented, and a multiresolution wavelet de-noising method with an improved threshold is proposed. Filtering, signal-to-noise-ratio estimation, and classification experiments on PHI data show that the method effectively removes the wavelength-dependent noise in hyperspectral imagery while improving both the signal-to-noise ratio and the classification performance of the data;
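     The following is a minimal sketch of per-pixel wavelet de-noising along the spectral dimension, assuming the PyWavelets package and using Donoho's universal soft threshold as a stand-in for the improved threshold proposed in the dissertation:

    import numpy as np
    import pywt

    def denoise_spectrum(spectrum, wavelet="db4", level=3):
        # Multiresolution decomposition of one pixel spectrum.
        coeffs = pywt.wavedec(spectrum, wavelet, level=level)
        # Noise level estimated from the finest-scale detail coefficients (MAD).
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2.0 * np.log(len(spectrum)))  # universal threshold
        # Soft-threshold every detail sub-band; keep the approximation untouched.
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

    # Apply to every pixel spectrum of a (rows, cols, bands) cube.
    cube = np.random.rand(10, 10, 80)  # placeholder data
    filtered = np.apply_along_axis(denoise_spectrum, -1, cube)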
     3. Feature extraction for small-target detection in hyperspectral imagery is studied in depth. Methods based on fast independent component analysis (FICA), real-coded genetic algorithm projection pursuit (RCGAPP), and one-dimensional multiresolution wavelet analysis are proposed:
     ● FICA is introduced into hyperspectral image analysis, resolving the difficulties of applying conventional ICA to such data, and a FICA-based small-target feature extraction method is proposed on this basis. Small-target detection experiments on AVIRIS and OMIS data show that the FICA-based method effectively separates the non-Gaussian structures in hyperspectral imagery and detects small targets scattered over a homogeneous background accurately and efficiently (first sketch after this list);
     ● A real-coded genetic algorithm is used to optimize the projection index of projection pursuit, and an RCGAPP-based small-target feature extraction method is proposed. Small-target detection experiments on AVIRIS and OMIS data show that RCGAPP also extracts small-target information effectively and, compared with ICA, is more flexible, since different features of interest can be obtained by choosing different projection indices (second sketch after this list);
     ● The feature extraction method based on one-dimensional multiresolution wavelet analysis is oriented toward the spectral characteristics of targets. Starting from the information structure of ground-object spectra, it extracts absorption features at different scales and reduces the redundancy of the spectral data. Simulated sub-pixel target detection experiments on AVIRIS data show that multiscale spectral absorption features represent the spectral radiometric characteristics of targets well and effectively improve the efficiency of target detection (third sketch after this list);
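     First sketch: FICA-based feature extraction, assuming scikit-learn's FastICA as the fixed-point ICA implementation; band images are treated as mixed observations, and each independent component becomes a candidate feature image in which rare, non-Gaussian targets stand out:

    import numpy as np
    from sklearn.decomposition import FastICA

    def fica_features(cube, n_components=8):
        rows, cols, bands = cube.shape
        X = cube.reshape(-1, bands)                 # pixels x bands
        ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
        S = ica.fit_transform(X)                    # pixels x components
        return S.reshape(rows, cols, n_components)  # component "images"

    components = fica_features(np.random.rand(64, 64, 80))  # placeholder cube
    # Small targets tend to appear as a few extreme-valued pixels in one
    # component; ranking components by kurtosis is one way to pick candidates.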
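     Second sketch: projection pursuit driven by a real-coded genetic algorithm. Kurtosis is used here only as an illustrative projection index (the dissertation's own indices are not reproduced); each chromosome is a unit projection vector over the spectral bands:

    import numpy as np
    from scipy.stats import kurtosis

    def rcga_projection(X, pop=40, gens=100, sigma=0.1,
                        rng=np.random.default_rng(0)):
        n, bands = X.shape
        P = rng.normal(size=(pop, bands))
        P /= np.linalg.norm(P, axis=1, keepdims=True)        # unit vectors
        for _ in range(gens):
            score = np.array([kurtosis(X @ w) for w in P])   # projection index
            elite = P[np.argsort(score)[-pop // 2:]]         # keep best half
            # Arithmetic (real-coded) crossover plus Gaussian mutation.
            parents = elite[rng.integers(len(elite), size=(pop - len(elite), 2))]
            alpha = rng.random((pop - len(elite), 1))
            children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]
            children += sigma * rng.normal(size=children.shape)
            P = np.vstack([elite, children])
            P /= np.linalg.norm(P, axis=1, keepdims=True)
        best = P[np.argmax([kurtosis(X @ w) for w in P])]
        return best, X @ best                                # direction, feature

    direction, feature = rcga_projection(np.random.rand(4096, 80))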
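     Third sketch: one-dimensional multiresolution wavelet features of a pixel spectrum, assuming PyWavelets; detail coefficients at several scales serve as compact descriptors of spectral absorption features:

    import numpy as np
    import pywt

    def wavelet_spectral_features(spectrum, wavelet="db4", level=4):
        coeffs = pywt.wavedec(spectrum, wavelet, level=level)
        # coeffs = [approximation, detail_level, ..., detail_1]; absorption dips
        # show up as large detail coefficients at the matching scale.
        return np.concatenate(coeffs[1:])        # multiscale feature vector

    features = wavelet_spectral_features(np.random.rand(224))  # e.g. AVIRIS bands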
     4. Feature extraction and classification based on learning from small training samples are studied in depth. A projection index suited to small-sample learning and a feature extraction and classification method based on support vector machine (SVM) projection pursuit are proposed and extended to multi-class problems with error-correcting output codes. Target classification experiments on simulated and AVIRIS data show that the method classifies well with small training samples and extracts discriminative features accurately from little prior information;
     5. Kernel-based techniques for extracting nonlinear features from hyperspectral data are studied. A feature extraction method based on kernel Bhattacharyya projection pursuit is investigated and likewise extended to multi-class problems with error-correcting output codes. Target classification experiments on simulated and AVIRIS data show that kernel-based methods extract nonlinear features from hyperspectral data with better performance and simpler computation.
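     Contributions 4 and 5 both rely on an error-correcting output code (ECOC) decomposition to turn binary learners into multi-class classifiers. Below is a minimal sketch of that step only, assuming scikit-learn; the SVM projection pursuit and kernel Bhattacharyya indices themselves are not reproduced here:

    import numpy as np
    from sklearn.multiclass import OutputCodeClassifier
    from sklearn.svm import SVC

    X = np.random.rand(200, 30)                # placeholder feature vectors
    y = np.random.randint(0, 5, size=200)      # five land-cover classes

    # Each class gets a binary codeword; one SVC is trained per code bit,
    # and prediction picks the class whose codeword is nearest.
    ecoc = OutputCodeClassifier(SVC(kernel="rbf"), code_size=2.0, random_state=0)
    ecoc.fit(X, y)
    print(ecoc.predict(X[:5]))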