Multispectral Data Classification Based on Support Vector Machines
Abstract
This thesis studies theories and methods for the classification of high-dimensional multispectral data based on support vector machines, as part of projects supported by the National Natural Science Foundation of China and the Natural Science Foundation of Hebei Province. The ability of existing methods to extract information from multispectral remote sensing images still lags far behind the development of remote sensing technology, so it is significant to study new theories and methods that improve this ability. In multispectral data classification the training samples are very limited while the data dimension is high, which easily produces a severe Hughes phenomenon, and the performance of traditional pattern classification algorithms is therefore often unsatisfactory. Statistical Learning Theory (SLT) is the first theory to systematically study machine learning from small samples; it proposes a new inductive principle, the structural risk minimization (SRM) principle, which selects a classification model according to the number of available samples so as to obtain high generalization ability. The support vector machine (SVM) is a general learning method developed within this framework. Based on SLT and SVMs, the thesis addresses the following issues:
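For reference, the standard soft-margin SVM primal problem makes the SRM trade-off concrete: minimizing the norm of w controls the capacity of the classifier, while the slack term bounds the empirical risk. This is the textbook formulation rather than a result specific to the thesis:

```latex
\min_{w,\,b,\,\xi}\ \ \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
\qquad \text{s.t.}\quad y_{i}\bigl(w^{\top}\phi(x_{i})+b\bigr)\ \ge\ 1-\xi_{i},
\quad \xi_{i}\ \ge\ 0,\ \ i=1,\dots,n.
```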
First, the characteristics of high-dimensional multispectral data are analyzed in depth, together with the difficulties that traditional pattern classification methods face on such data. Applying statistical learning theory and support vector machines to multispectral data classification effectively mitigates the Hughes phenomenon and achieves higher classification accuracy than conventional methods.
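A minimal sketch of this comparison on synthetic data, assuming scikit-learn (the thesis's multispectral datasets and parameter settings are not reproduced here): a kernel SVM typically degrades far less than a covariance-based classifier when the training set is small relative to the dimension.

```python
# Few training samples (60) versus many features (100): the regime in
# which the Hughes phenomenon appears. Synthetic data for illustration.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=100, n_informative=20,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=60,
                                          stratify=y, random_state=0)

svm = SVC(kernel='rbf', C=10.0, gamma='scale').fit(X_tr, y_tr)
# LDA must estimate a 100x100 covariance from 60 samples and tends to
# suffer here, while the margin-based SVM is far less sensitive.
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)

print('SVM accuracy:', svm.score(X_te, y_te))
print('LDA accuracy:', lda.score(X_te, y_te))
```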
Secondly, five representative multicategory support vector machine methods are summarized and analyzed: one-against-all, one-against-one, directed acyclic graph SVMs (DAG-SVMs), decision-tree-based multiclass SVMs, and the all-together global optimization formulation (MSVM). Two types of fuzzy support vector machines are also reviewed. Building on the MSVM formulation, two improved fuzzy multicategory SVMs are then proposed that introduce a fuzzy membership function for the training samples; applied to multispectral data classification, they improve classification accuracy and show strong generalization ability.
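A sketch of the first two decomposition strategies named above, using scikit-learn's generic one-vs-one and one-vs-rest wrappers around a binary SVC (an assumption for illustration; the thesis implements the schemes directly):

```python
# one-against-one trains k(k-1)/2 pairwise classifiers and votes;
# one-against-all trains k classifiers, each separating one class
# from the rest, and picks the largest decision value.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
strategies = [
    ('one-against-one', OneVsOneClassifier(SVC(kernel='rbf', gamma='scale'))),
    ('one-against-all', OneVsRestClassifier(SVC(kernel='rbf', gamma='scale'))),
]
for name, clf in strategies:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f'{name}: mean accuracy = {scores.mean():.3f}')
```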
Thirdly, to address the sensitivity of traditional SVMs to noise and outliers, two fuzzy multicategory support vector machine (FMSVM) methods based on support vector data description (SVDD) are presented. The two methods differ mainly in how the membership degree is chosen: a sample's membership reflects not only its relation to the class center but also its relations to the other samples of its class. One method derives the fuzzy membership from a tight description of the data built on the affinity among samples; the other derives it from an improved SVDD that estimates each point's local density with a nearest-neighbor method. Numerical experiments show that, compared with several existing SVM methods, both SVDD-based fuzzy multicategory SVMs are more robust to noise and classify more accurately.
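A minimal sketch of the simplest ingredient, a distance-to-class-center membership, with a hypothetical helper `class_center_membership` (the thesis's SVDD- and density-based memberships are more elaborate). In scikit-learn, per-sample weights scale each sample's penalty C, which is exactly how the membership enters the fuzzy SVM objective:

```python
import numpy as np
from sklearn.svm import SVC

def class_center_membership(X, y, delta=1e-3):
    """s_i = 1 - ||x_i - c|| / (d_max + delta): points far from their
    class center (likely noise or outliers) get low membership."""
    s = np.empty(len(y))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        center = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - center, axis=1)
        s[idx] = 1.0 - d / (d.max() + delta)
    return s

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)
s = class_center_membership(X, y)
# sample_weight scales each sample's slack penalty C * s_i, so noisy
# samples contribute less to the decision boundary.
clf = SVC(kernel='rbf', gamma='scale').fit(X, y, sample_weight=s)
```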
Fourthly, to reduce computational complexity, a clustering-based method for solving the SVM inverse problem is proposed. Experiments show that solving the inverse problem on clusters markedly reduces the complexity of the algorithm and improves computational efficiency while enlarging the obtained margin. The quantitative relation between the maximum margin and the distance of the two closest points of two clusters is also studied. For the linearly separable case, the dual of the linear hard-margin classifier is shown to be equivalent to the convex hull problem (the bisection-of-closest-points method), and the maximum margin equals the distance between the two closest points of the convex hulls; for the linearly inseparable case, the dual of the linear soft-margin classifier is equivalent to the reduced convex hull problem (the generalized bisection-of-closest-points method), and the maximum margin equals the distance between the two closest points of the reduced convex hulls.
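In standard geometric form (a textbook identity consistent with the statement above, not a derivation specific to the thesis), the dual is the nearest-point problem between the two hulls:

```latex
\min_{\alpha,\,\beta}\ \Bigl\lVert \sum_{i\in I_{+}}\alpha_{i}x_{i}\;-\;\sum_{j\in I_{-}}\beta_{j}x_{j}\Bigr\rVert
\qquad \text{s.t.}\quad \sum_{i\in I_{+}}\alpha_{i}=\sum_{j\in I_{-}}\beta_{j}=1,
\quad 0\le\alpha_{i},\,\beta_{j}\le\mu .
```

With the bound μ = 1 the feasible sets are the full convex hulls (hard margin); with an appropriate μ < 1 each hull shrinks toward its centroid, giving the reduced convex hulls of the soft-margin case. In the hard-margin case the optimal objective value, the distance between the two closest points, equals the maximum margin 2/||w*||.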
Finally, the training algorithms suited to large-scale problems are summarized and analyzed: chunking, decomposition, and sequential minimal optimization (SMO), all fast algorithms designed specifically for SVMs. An improved SMO algorithm is then used to train the fuzzy multicategory support vector machine; the experimental results show that the computational load is greatly reduced and that the method is feasible and effective.
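A compact sketch of the SMO idea, following the widely taught simplified variant with random second-index selection (the thesis's improved working-set selection heuristics are not reproduced; linear kernel and illustrative defaults assumed):

```python
# SMO optimizes the SVM dual two Lagrange multipliers at a time, so each
# subproblem has a closed-form solution and no QP solver is needed.
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_passes=10, seed=0):
    """y in {-1, +1}. Returns (alpha, b) for a linear-kernel SVM."""
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                                  # Gram matrix
    alpha, b, passes = np.zeros(n), 0.0, 0
    f = lambda i: (alpha * y) @ K[:, i] + b      # decision value at x_i
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = f(i) - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or \
               (y[i] * Ei > tol and alpha[i] > 0):     # KKT violated at i
                j = int(rng.integers(n - 1)); j += (j >= i)   # pick j != i
                Ej = f(j) - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                if y[i] != y[j]:                 # box constraints for alpha_j
                    L, H = max(0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Recompute the threshold b from the KKT conditions.
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                b = b1 if 0 < alpha[i] < C else \
                    (b2 if 0 < alpha[j] < C else (b1 + b2) / 2)
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    # For a linear kernel: w = X.T @ (alpha * y); predict sign(x @ w + b).
    return alpha, b
```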