Research on Support Vector Machine Learning Algorithms Based on Optimization Theory
Abstract
Support vector machine (SVM) is a new tool for solving machine learning problems by means of optimization methods. In recent years SVM has attracted wide attention, and substantial progress has been made in both its theory and its algorithmic implementation, making it a frontier topic in machine learning.
     SVM casts a machine learning problem as an optimization problem and applies optimization theory to construct training algorithms. Optimization theory is thus one of the main theoretical foundations of SVM, and this dissertation studies SVM from the perspective of optimization theory and methods. The main results are as follows.
     1. Least squares support vector machines (LSSVM). A preconditioned conjugate gradient method for training LSSVM is proposed. When the number of training samples is large, LSSVM must solve a high-order linear system of equations; a block-matrix technique is used to reduce the order of the coefficient matrix, and the reduced system is then solved with a preconditioned conjugate gradient method to improve the convergence rate and overcome numerical instability. The method greatly improves the training speed of LSSVM.
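To make the block reduction concrete, here is a minimal sketch (not the thesis' exact algorithm) that solves the LSSVM KKT system by eliminating the bias term and applying SciPy's conjugate gradient with a Jacobi (diagonal) preconditioner; the RBF kernel and the choice of preconditioner are assumptions of the example.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf_kernel(X, sigma=1.0):
    # Gram matrix of the Gaussian kernel k(x, z) = exp(-||x - z||^2 / (2*sigma^2))
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVM KKT system
        [ 0      1^T     ] [b]     [0]
        [ 1  K + I/gamma ] [a]  =  [y]
    by block elimination: two SPD solves with A = K + I/gamma,
    each performed by a Jacobi-preconditioned conjugate gradient."""
    n = X.shape[0]
    A = rbf_kernel(X, sigma) + np.eye(n) / gamma
    d = np.diag(A).copy()
    M = LinearOperator((n, n), matvec=lambda v: v / d)  # Jacobi preconditioner
    ones = np.ones(n)
    eta, _ = cg(A, ones, M=M)        # A @ eta = 1
    nu, _ = cg(A, y, M=M)            # A @ nu  = y
    b = (ones @ nu) / (ones @ eta)   # bias recovered from the eliminated row
    alpha = nu - b * eta             # back-substitution for the multipliers
    return alpha, b
```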
     2. Smooth SVM. The objective function of the unconstrained SVM model is non-smooth and non-differentiable, so many optimization algorithms cannot be applied to it directly. To overcome this difficulty, the Chen-Harker-Kanzow-Smale (CHKS) function is adopted as the smoothing function, yielding the smooth CHKS-SVM model, which is trained with the Newton-Armijo algorithm. The separating hyperplane is trained in batches, saving both training time and memory, so the method can efficiently handle high-dimensional, large-scale classification problems.
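The CHKS function phi_mu(x) = (x + sqrt(x^2 + 4*mu^2))/2 tends to the plus function (x)_+ as mu -> 0 and is twice continuously differentiable, which is what makes a Newton-type method applicable. A minimal sketch of the smoothed unconstrained objective follows; the linear kernel and the squared-hinge loss form are assumptions made for brevity.

```python
import numpy as np

def chks(x, mu=0.1):
    """CHKS smoothing of the plus function:
    phi_mu(x) = (x + sqrt(x^2 + 4*mu^2)) / 2  ->  max(x, 0) as mu -> 0."""
    return 0.5 * (x + np.sqrt(x ** 2 + 4.0 * mu ** 2))

def smoothed_svm_objective(wb, X, y, C=1.0, mu=0.1):
    """Smoothed unconstrained SVM: the non-differentiable term
    (1 - y*(X @ w + b))_+ is replaced by its CHKS approximation."""
    w, b = wb[:-1], wb[-1]
    margins = 1.0 - y * (X @ w + b)
    return 0.5 * (w @ w) + C * np.sum(chks(margins, mu) ** 2)
```

Because the smoothed objective is twice differentiable, a Newton step with an Armijo line search can be applied to it directly.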
     3. Based on the KKT complementarity conditions of optimization theory, unconstrained non-differentiable optimization models are established for the support vector classification machine and the support vector regression machine (SVR), together with effective smoothing approximation methods. An adjustable entropy function method is proposed for training the classifier: it approximates the optimal solution without requiring a large smoothing parameter, thereby avoiding the numerical overflow that the classical entropy function method suffers when its parameter must be taken very large to reach high accuracy. The adjustable entropy function method applies analogously to the non-differentiable SVR model, again avoiding overflow. The two algorithms provide a new approach to solving SVM and SVR.
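For context, the classical maximum-entropy function smooths the plus function as (1/p)*ln(1 + e^(p*x)), which approaches (x)_+ as p -> infinity but overflows under naive floating-point evaluation once p*x is large. The sketch below is not the thesis' adjustable variant; it only illustrates the smoothing and a numerically stable way to evaluate it.

```python
import numpy as np

def entropy_plus(x, p=10.0):
    """Maximum-entropy smoothing of the plus function:
    (1/p) * ln(1 + exp(p*x)) -> max(x, 0) as p -> infinity.
    np.logaddexp(0, p*x) computes ln(1 + exp(p*x)) without overflow,
    the numerical hazard that the adjustable entropy method is
    designed to control by keeping p moderate."""
    return np.logaddexp(0.0, p * x) / p
```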
     4. Fuzzy SVM. To address the sensitivity of the support vector classification machine to noises and outliers in the training samples, a fuzzy SVM based on border vector extraction is proposed. Border vectors, the samples most likely to become support vectors, are selected as the new training set, which reduces the number of training samples and improves training speed. The fuzzy membership is defined separately for border vectors and for noise points according to their distance from the center of the enclosing hypersphere, weakening the influence of noise and strengthening the role of the support vectors in classification. To overcome the excessive sensitivity of LSSVM to outliers, the notion of fuzzy membership is introduced into LSSVM, and a fuzzy least squares support vector regression machine (FLSSVM) based on support vector domain description is proposed; the new membership definition weakens the influence of noise points. The constrained convex quadratic programming problem is transformed into a positive definite linear system, which is solved by fast Cholesky decomposition. Without sacrificing training speed, FLSSVM achieves higher prediction accuracy than SVM and LSSVM.
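A minimal sketch of the two ingredients follows: a fuzzy membership that down-weights samples far from a class centre (distance to the class mean is used here as a stand-in for the SVDD sphere centre the thesis employs), and a weighted LSSVM system solved by Cholesky factorization; the 1/(gamma*s) weighting is the standard weighted-LSSVM form, assumed for the example.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def fuzzy_membership(X, delta=1e-3):
    """Toy membership: samples far from the class mean (a stand-in for
    the support vector domain description centre) get smaller weights,
    damping the influence of noises and outliers."""
    d = np.linalg.norm(X - X.mean(axis=0), axis=1)
    return 1.0 - d / (d.max() + delta)

def flssvm_train(K, y, s, gamma=10.0):
    """Weighted LSSVM: the membership s_i rescales each error penalty,
    giving the SPD system (K + diag(1/(gamma*s))) alpha = y - b*1.
    One Cholesky factorization is reused for both right-hand sides
    of the block elimination."""
    n = len(y)
    A = K + np.diag(1.0 / (gamma * s))
    c_low = cho_factor(A)                 # fast Cholesky of the SPD matrix
    eta = cho_solve(c_low, np.ones(n))    # A @ eta = 1
    nu = cho_solve(c_low, y)              # A @ nu  = y
    b = (np.ones(n) @ nu) / (np.ones(n) @ eta)
    return nu - b * eta, b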
     5. Semi-supervised SVM (S3VM). To improve the classification performance of ∇TSVM, a smooth piecewise function is introduced, yielding the smooth piecewise semi-supervised support vector machine (SPS3VM). The approximation performance of the smooth piecewise function is better than that of the Gaussian approximation function. Because the SPS3VM objective is non-convex, a linear particle swarm optimization algorithm with guaranteed convergence is used, for the first time, to train the S3VM. Experimental results show that SPS3VM outperforms ∇TSVM in classification accuracy.
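Since the smoothed S3VM objective is non-convex, a population-based search reduces the dependence on the starting point. The following is a minimal canonical (linear) particle swarm sketch rather than the thesis' exact guaranteed-convergence variant; the inertia and acceleration constants are the standard Clerc-Kennedy values.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.7298, c1=1.49618, c2=1.49618, seed=0):
    """Canonical linear PSO: each particle's velocity is a linear
    combination of its previous velocity and attractions toward its
    personal best and the swarm's global best."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)              # personal best values
    gbest = pbest[pval.argmin()].copy()              # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()
```

Here f would be the SPS3VM objective evaluated at a stacked parameter vector (w, b).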
