Research on Several Problems of Support Vector Machines in the Primal Space
Abstract
Support vector machines (SVMs) have been a dominant machine learning technique for more than a decade; however, most of their training algorithms were formulated for the dual problem in the dual space. Recent research indicates that solving the primal problem directly is also an effective approach to training SVMs. As the study of SVMs in the primal space has deepened, various problems encountered in applications, such as semi-supervised learning, have also begun to be solved in the primal space. On the whole, however, research on SVMs in the primal space remains limited and incomplete. This dissertation therefore focuses on the following four problems of SVM classification algorithms in the primal space.
     1. To address the low approximation accuracy of the smoothing functions currently available for smooth support vector machines, the plus function is transformed and expanded into an infinite polynomial series, yielding a family of polynomial smoothing functions. Their properties are analyzed: both the order of smoothness and the approximation accuracy can be made as high as required. Finally, the polynomial smoothing functions are applied to solving the generalized support vector machine.
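The idea of replacing the non-smooth plus function x₊ = max(x, 0) with a smooth polynomial surrogate can be sketched as follows. The piecewise-quadratic function below is a standard illustrative choice whose worst-case error is 1/(4k); it stands in for the dissertation's polynomial family, which is not reproduced here.

```python
import numpy as np

def plus(x):
    """The plus function x_+ = max(x, 0), non-smooth at x = 0."""
    return np.maximum(x, 0.0)

def quad_smooth_plus(x, k=10.0):
    """A C^1 piecewise-quadratic polynomial smoothing of x_+.
    Illustrative stand-in for the dissertation's polynomial family:
    exact outside [-1/k, 1/k], quadratic inside, with worst-case
    approximation error 1/(4k), so accuracy improves as k grows."""
    x = np.asarray(x, dtype=float)
    out = np.where(x > 1.0 / k, x, 0.0)          # exact linear part
    mid = np.abs(x) <= 1.0 / k                   # smoothing interval
    out = np.where(mid, (k / 4.0) * (x + 1.0 / k) ** 2, out)
    return out

xs = np.linspace(-1.0, 1.0, 201)
err = np.max(np.abs(quad_smooth_plus(xs, k=10.0) - plus(xs)))
# the largest gap occurs at x = 0 and equals 1/(4k)
```

Substituting such a surrogate into the SVM objective makes it differentiable, so smooth unconstrained solvers (e.g. Newton-type methods) become applicable; higher-degree members of a polynomial family raise the order of smoothness further.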
     2. Semi-supervised SVMs exploit a large collection of unlabeled data together with a few labeled examples to improve generalization performance, which leads to a non-convex optimization problem. Two strategies are adopted to minimize it: combinatorial optimization and continuous optimization. For combinatorial optimization, a self-training semi-supervised SVM classification algorithm is presented, whose subroutine uses the polynomial smoothing functions obtained above to solve the standard SVM in the primal space. For continuous optimization, a polynomially smoothed semi-supervised SVM classification algorithm is presented; the polynomial functions introduced have a rigorous theoretical basis and achieve high approximation accuracy in high-density regions of the samples, with lower accuracy occurring only in low-density regions.
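The combinatorial (self-training) strategy can be sketched as below. The regularized least-squares classifier, the confidence threshold of 0.5, and the synthetic two-blob data are all illustrative assumptions standing in for the polynomial-smoothed primal SVM subroutine and real data.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear(X, y, lam=1e-2):
    """Regularized least-squares classifier -- a simple stand-in for
    the primal SVM subroutine described in the text."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # absorb the bias term
    return np.linalg.solve(Xb.T @ Xb + lam * np.eye(d + 1), Xb.T @ y)

def decide(X, w):
    return np.hstack([X, np.ones((len(X), 1))]) @ w

# Two Gaussian blobs; only four points keep their labels.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1.0] * 50 + [1.0] * 50)
labeled = np.array([0, 1, 50, 51])
unlabeled = np.setdiff1d(np.arange(100), labeled)

# Self-training loop: fit on the labeled set, adopt confident
# predictions on unlabeled points as pseudo-labels, refit, repeat.
Xl, yl = X[labeled], y[labeled]
for _ in range(5):
    w = fit_linear(Xl, yl)
    scores = decide(X[unlabeled], w)
    confident = np.abs(scores) > 0.5               # assumed threshold
    Xl = np.vstack([X[labeled], X[unlabeled][confident]])
    yl = np.concatenate([y[labeled], np.sign(scores[confident])])

acc = np.mean(np.sign(decide(X, w)) == y)
```

On this well-separated toy problem the pseudo-labels are almost all correct, so the final classifier recovers the true separation from only four labels.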
     3. Direct search methods are a class of simple and practical unconstrained optimization techniques. Unlike the cyclic algorithms previously used for SVMs, which update all components of w at once, a direct search method updates one component of w at a time by solving a one-variable subproblem. Three such algorithms, the Hooke and Jeeves pattern search method, the Rosenbrock rotating-axis method, and Powell's direction acceleration method, are used to solve the linear SVM; the detailed algorithms are given and their complexity is analyzed.
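As one concrete instance of these derivative-free techniques, a minimal Hooke-and-Jeeves pattern search can be written as follows; the strictly convex quadratic test objective stands in for the primal SVM objective and is purely illustrative.

```python
import numpy as np

def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimal Hooke-and-Jeeves pattern search (derivative-free).
    Exploratory moves probe each coordinate with +/- step; after a
    successful round, a pattern move extrapolates along the improving
    direction; a failed round shrinks the step size."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        base, fbase = x.copy(), fx
        for i in range(len(x)):                    # exploratory moves
            for d in (step, -step):
                trial = x.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        if fx < fbase:
            pattern = x + (x - base)               # pattern move
            fp = f(pattern)
            if fp < fx:
                x, fx = pattern, fp
        else:
            step *= shrink                         # refine the mesh
            if step < tol:
                break
    return x, fx

# Illustrative objective (NOT the SVM loss): minimum at w = (3, -1).
f = lambda w: (w[0] - 3.0) ** 2 + 2.0 * (w[1] + 1.0) ** 2
w_opt, f_opt = hooke_jeeves(f, np.zeros(2))
```

Because each probe changes a single component of w, each function evaluation in the SVM setting reduces to a cheap one-variable subproblem, which is the property the dissertation exploits.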
     4. The root cause of SVMs' sensitivity to noise is that the linear hinge loss function they adopt places no bound on the loss produced by noisy samples. Since a suitably designed loss function can effectively limit the loss caused by noise, a novel hyperbolic tangent loss function is constructed, and on this basis the corresponding robust SVM, the hyperbolic tangent SVM, is proposed.
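The contrast between the unbounded hinge loss and a bounded tanh-type loss can be illustrated numerically. The exact form tanh(max(0, 1 − m)) used below is an assumption for illustration, not necessarily the dissertation's definition.

```python
import numpy as np

def hinge(margin):
    """Standard hinge loss max(0, 1 - m): grows without bound as the
    margin m = y*f(x) falls, so one noisy point can dominate."""
    return np.maximum(0.0, 1.0 - margin)

def tanh_hinge(margin, c=1.0):
    """A bounded, tanh-saturated variant of the hinge loss (assumed
    form: tanh applied to the hinge).  Badly misclassified points
    contribute at most 1, which caps the influence of noise."""
    return np.tanh(c * hinge(margin))

margins = np.array([-10.0, -1.0, 0.0, 1.0, 2.0])
h = hinge(margins)       # unbounded on the left: h[0] = 11
t = tanh_hinge(margins)  # saturates near 1 for outliers, 0 beyond the margin
```

Both losses agree that correctly classified points beyond the margin (m ≥ 1) cost nothing; they differ only in how hard a gross outlier can pull on the decision boundary.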
     Experiments on the above four problems show that the proposed methods all achieve satisfactory learning performance in SVM classification.
