Research on Several Variant Algorithms of Support Vector Machines
Abstract
Statistical learning theory is a machine learning theory for small samples; after decades of development it has grown into a fairly complete theoretical system. The support vector machine (SVM) is a new machine learning method built on this theory. Following the structural risk minimization principle, it constructs an optimal linear decision function in a high-dimensional feature space via a kernel function, thereby avoiding the curse of dimensionality; it attains a globally optimal solution and has high generalization ability. Owing to this good performance, SVMs have been widely applied to pattern classification, function approximation, density estimation, and other problems; they attract growing attention from researchers, with major progress in theory, algorithms, and applications, and have become a hot topic in machine learning.
     To reduce the computational complexity of the standard SVM and improve its training speed and generalization ability, this dissertation studies the theory and applications of several SVM variants. The main contents are as follows:
     1. A survey of the current state of SVM research, followed by a brief introduction to the fundamentals of SVMs.
     2. A study of SVM variants based on the kernel minimum squared error criterion. First, a geometric interpretation of the least squares SVM (LSSVM) classifier is given. Next, the proximal SVM (PSVM), originally a classifier, is extended to regression, yielding the proximal support vector regression machine together with a fast algorithm based on Cholesky decomposition; the equivalence of the new model for classification and regression is also proved. Then, combining the strengths of LSSVM and PSVM, a direct support vector machine (DSVM) is proposed that handles both classification and regression, is simpler to solve, trains faster, and does not sacrifice generalization ability. Compared with LSSVM, the model strengthens the convexity of the problem and guarantees a globally optimal solution; compared with PSVM, it removes the inconsistency between the linear and nonlinear models and is faster in testing. Numerical experiments verify the feasibility and effectiveness of these results.
     3. First, the equivalence between the fuzzy SVM (FSVM) model and the SVM with multiple penalty factors is proved theoretically, providing the basis for treating the fuzzy membership parameters as model selection parameters to be determined adaptively. Then, for the design of FSVM memberships, two new and more reasonable membership schemes are proposed, one based on the geometric relation between the SVM separating surface and the samples, and one based on the nature of SVM classification; numerical experiments verify their effectiveness and compare them with existing methods.
     4. A study of the generalization ability of the FSVM with a quadratic loss function. The mathematical model of the quadratic loss FSVM is given first and shown to be equivalent to a hard margin SVM with a new kernel function, which turns the fuzzy membership parameters into kernel parameters. Four methods for estimating the generalization ability of the hard margin SVM are then extended to the quadratic loss FSVM. Finally, theoretical analysis and numerical experiments systematically compare their estimation performance, yielding the best generalization bound for the quadratic loss FSVM and laying a foundation for subsequent model selection.
     5. A study of model selection for the quadratic loss SVM with two penalty factors, and its application. The quadratic loss SVM is applied to lesion detection in mammographic X-ray images; because the two classes in this problem are imbalanced, different penalty factors are assigned to them. Based on the preceding results, a method is given that determines the model parameters automatically by minimizing an upper bound on the generalization error. Experiments on mass detection and calcification cluster detection in X-ray images show that the method is not only effective but also achieves better generalization accuracy.
Statistical learning theory is a theoretical framework of machine learning for small samples. Over the past decades it has developed into a relatively complete theoretical system. The support vector machine (SVM) is a new machine learning algorithm based on this theory. Following the structural risk minimization (SRM) principle, it obtains a globally optimal linear decision function in a high-dimensional feature space via a kernel function; it thus avoids the curse of dimensionality and has good generalization ability. Because of its good performance in pattern recognition, function approximation, and density estimation, it has attracted great attention from researchers, developed rapidly in theory, computation, and applications, and become a hot topic in machine learning.
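The kernel mechanism described above can be illustrated with a minimal numpy sketch (illustrative only, not taken from the dissertation; the RBF kernel and its width are assumed): a valid kernel corresponds to an inner product in some feature space, so by Mercer's condition its Gram matrix must be positive semidefinite.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = rbf_kernel(X, X)
# Mercer condition: a valid kernel's Gram matrix is positive semidefinite,
# so all eigenvalues are nonnegative (up to numerical round-off)
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-10)  # True
```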
     To reduce the computational complexity of the standard SVM and to improve its training speed and generalization ability, this dissertation focuses on the theory and applications of several SVM variants. The contents are as follows.
     1. A review of the current state of SVM research is given, followed by a brief introduction to the fundamentals of SVMs.
     2. A study of several SVM variants based on the kernel minimum squared error (MSE) criterion. First, a geometric interpretation of the least squares SVM (LSSVM) classifier is given. Second, the proximal SVM (PSVM) model for classification is extended to regression, yielding the proximal support vector regression machine together with a fast computing method based on Cholesky decomposition; the equivalence of the classification and regression models is also proved. Combining the advantages of LSSVM and PSVM, a new model called the direct support vector machine (DSVM) is proposed. The new model applies to both classification and regression, yet is simpler, trains faster, and does not lose generalization ability. Compared with LSSVM, it strengthens the convexity of the problem and guarantees a globally optimal solution; compared with PSVM, it removes the inconsistency between the linear and nonlinear cases and is faster in testing. Finally, comprehensive numerical experiments show the feasibility and effectiveness of these results.
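The least squares family discussed here can be sketched in numpy (a sketch of the standard LSSVM only, not of the proposed DSVM; the kernel, toy data, and parameter values are assumptions for illustration): replacing the SVM's inequality constraints by equality constraints turns the quadratic program into a single linear system.

```python
import numpy as np

def rbf(X, Z, gamma=0.5):
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def lssvm_fit(X, y, gamma_reg=10.0):
    # LSSVM replaces the QP of the standard SVM with one linear system:
    #   [ 0      1^T        ] [b]     [0]
    #   [ 1   K + I/gamma   ] [alpha] [y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X) + np.eye(n) / gamma_reg
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, coefficient vector alpha

def lssvm_predict(X_train, b, alpha, X_new):
    return np.sign(rbf(X_new, X_train) @ alpha + b)

# toy two-class problem: two well-separated Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (30, 2)), rng.normal(2, 0.5, (30, 2))])
y = np.concatenate([-np.ones(30), np.ones(30)])
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, X)
print((pred == y).mean())  # 1.0 on this separable toy set
```

Every training point receives a nonzero coefficient here (sparsity is lost), which is one of the trade-offs the text's comparison between the variants addresses.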
     3. A study of the fuzzy SVM (FSVM). First, the equivalence of the FSVM and the SVM with multiple penalty factors is proved theoretically, which provides the theoretical foundation for setting the fuzzy memberships adaptively as model hyperparameters. Second, for the strategy of presetting the fuzzy memberships, two design methods are given, based respectively on the geometric distribution of the separating hyperplane and the data samples, and on the nature of SVM classification. Numerical experiments then show the effectiveness of these two methods and compare them with other available approaches.
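For concreteness, a commonly used membership design can be sketched as follows (this is the classical centroid-distance scheme in the style of Lin and Wang, not one of the two new schemes proposed in the dissertation; `delta` is an assumed small constant keeping memberships positive):

```python
import numpy as np

def fuzzy_membership(X, y, delta=1e-3):
    # Samples far from their class centroid (likely outliers or noise)
    # receive a small weight s_i in (0, 1], softening their penalty.
    s = np.empty(len(y))
    for label in np.unique(y):
        idx = (y == label)
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        s[idx] = 1.0 - d / (d.max() + delta)
    return s

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
y = np.concatenate([np.zeros(20), np.ones(20)])
s = fuzzy_membership(X, y)
print(s.min() > 0 and s.max() <= 1)  # True: memberships lie in (0, 1]
```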
     4. Evaluating the generalization performance of the fuzzy SVM with an L2 loss function (L2-FSVM). First, the L2-FSVM model is given and then transformed equivalently into a hard margin SVM with a new kernel function, in which the fuzzy memberships become kernel parameters. Second, four methods for estimating the generalization error of the hard margin SVM are extended to the L2-FSVM. Finally, via comparative analysis and systematic numerical experiments, the best estimate of the generalization error of the L2-FSVM is identified, which can serve as a criterion for model selection.
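The kernel transformation mentioned here can be sketched in numpy (the value of C and the memberships are illustrative assumptions): absorbing the quadratic slack penalty into the Gram matrix adds 1/(C·s_i) to the i-th diagonal entry, leaving a hard margin problem with the modified kernel.

```python
import numpy as np

def modified_kernel(K, s, C=10.0):
    # L2-loss FSVM as a hard-margin SVM: K~ = K + diag(1 / (C * s)),
    # so the fuzzy memberships s become parameters of the new kernel.
    return K + np.diag(1.0 / (C * np.asarray(s)))

rng = np.random.default_rng(3)
X = rng.normal(size=(10, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)          # base RBF Gram matrix
s = rng.uniform(0.2, 1.0, 10)  # fuzzy memberships in (0, 1]
K_tilde = modified_kernel(K, s)
# off-diagonal entries are unchanged; only the diagonal is lifted
print(np.allclose(K_tilde - K, np.diag(1.0 / (10.0 * s))))  # True
```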
     5. Research on model selection for the dual-penalty-factor SVM with an L2 loss function and its application. Because the two classes in digital mammography data are imbalanced, an L2 loss SVM with different penalty factors for the two classes is used. Then, based on the preceding results, a method is presented that determines these hyperparameters automatically by minimizing a generalization error bound. Experiments on the detection of masses and microcalcifications in digital mammograms demonstrate the effectiveness of the proposed method and show that it outperforms other hyperparameter settings in generalization ability.
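The per-class penalty mechanism can be sketched as follows. The inverse-frequency heuristic shown is a standard illustration of why two penalty factors help under class imbalance; the dissertation itself selects the penalties by minimizing a generalization error bound rather than by this rule.

```python
import numpy as np

def class_penalties(y, C=1.0):
    # Assign each sample a penalty inversely proportional to its class
    # frequency, so errors on the minority class cost more: C_k = C*n/(K*n_k).
    classes, counts = np.unique(y, return_counts=True)
    weights = {c: C * len(y) / (len(classes) * n)
               for c, n in zip(classes, counts)}
    return np.array([weights[v] for v in y])

y = np.array([1, 1, 1, 1, 1, 1, 1, 1, -1, -1])  # 8 positives, 2 negatives
Cs = class_penalties(y)
print(Cs[y == -1][0] / Cs[y == 1][0])  # 4.0: minority class penalized 4x harder
```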
引文
[1] Rich E.. Artificial Intelligence. New York: McGraw Hill Inc.. 1983.
    [2] Mitchell T.. Machine Learning. New York: McGraw Hill Inc.. 1997.
    [3]边肇祺.模式识别.北京:清华大学出版社.2004.
    [4] Vapnik V.. The Nature of Statistical Learning Theory. Springer Verlag. 1995.
    [5] Vapnik V., Lerner A.. Pattern recognition using generalized portrait method, Automation and Remote Control. 1963, 24.
    [6] Vapnik V. N.. Statistical Learning Theory. New York: Wiley. 1998.
    [7] Vapnik V., Chervonenkis A.. On the uniform convergence of relative frequencies of events to their probabilities, Theory of probability and its applications.1971, 16(2):264 280.
    [8] Cortes C., Vapnik V.. Support vector networks, Machine Learning. 1995, 20:273 297.
    [9] Cristianini N., Schawe Taylor J.. An Introduction to Support Vector Machines. Cambridge: Cambridge University Press, 2000.
    [10] Burges C.. A Tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery. 1998, 2(2):121 167.
    [11] Scholkopf B., Smola A. J.. Learning with Kernel Support Vector Machines, Regularization, Optimization and Beyond. Cambridge: MIT Press. 2002.
    [12] Chapelle O.. Support vector machine: Introduction principles, adaptive tuning and prior knowledge. Ph.D. Dissertation, University Paris 6. 2002.
    [13] Shawe Taylor J., Cristianini N.. Kernel Method for Pattern Analysis. New York: Cambridge University Press, 2004.
    [14]邓乃扬,田英杰.数据挖掘中的新方法—支持向量机.北京:科学出版社,2004.
    [15] Weston J., Gammerman A., Stitson M.O., et al.. Support Vector Density Estimation, Advances in Kernel Methods. Cambridge: MIT Press. 1999.
    [16] Vapnik V., Mukherjee S.. Support Vector Method for Multivariate Density Estimation, Advances in Neural Information Processing Systems.Cambridge: MIT Press. 2000, 659 665.
    [17]张莉.支撑矢量机与核方法研究.西安电子科技大学博士学位论文.2002.
    [18]周伟达.核机器学习方法研究.西安电子科技大学博士学位论文.2003.
    [19]董春曦.支持向量机及其在入侵检测中的应用研究.西安电子科技大学博士学位论文.2004.
    [20]郑春红.支撑矢量机应用的关键技术研究.西安电子科技大学博士学位论文.2005.
    [21]周水生.竞争学习向量量化和支持向量机的关键技术研究.西安电子科技大学博士学位论文.2005.
    [22]吴青.基于最优化理论的支持向量机学习算法研究.西安电子科技大学博士学位论文.2009.
    [23] Domeniconi, C., Gunopulos D., Jing Peng. Large margin nearest neighbor classifiers, IEEE Transactions on Neural Networks. 2005, 16(4):899 909.
    [24] Jiang Hui, Li Xin wei, Liu Chao jun. Large margin hidden Markov models for speech recognition, IEEE Transactions on Audio, Speech and Language Processing. 2006, 14(5): 1584 1595.
    [25] Huang K., Yang H., King I. et al.. Maxi–Min margin machine: learning large margin classifiers locally and globally, IEEE Transactions on Neural Networks. 2008, 19(2):260 272.
    [26] Zhang K., Tsang, I.W., Kwok, J. T.. Maximum margin clustering made practical. IEEE Transactions on Neural Networks, 2009, 20(4):583 596.
    [27] Fine, S., Scheinberg, K.. INCAS: An incremental active set method for SVM. Technical Report. IBM Research Labs, Haifa. 2002.
    [28] Goldfarb D., Scheinberg K.. Solving structured convex quadratic programs by interior point methods with application to support vector machines and portfolio optimization. Technical Report, IBM Research Labs, Haifa. 2005.
    [29] Woodsend K., Gondzio J.. Exploiting separability in large scale linear support vector machine training, Computational Optimization and Applications. 2009. DOI:10.1007 /s10589 009 9296 8.
    [30] Zhang Dao qiang, Chen Song can. Clustering incomplete data using kernel based fuzzy C means algorithm, Neural Processing Letters. 2004, 18(3):155 162.
    [31] Wu W., Massarat D.L., Jong S. de.. The kernel PCA algorithms for wide data. Part I: theory and algorithms. Chemometrics and Intelligent Laboratory Systems. 1997, 36: 165 172.
    [32] Hao S., Jegelka, S., Gretton A.. Fast kernel based independent component analysis, IEEE Transactions on Signal Processing. 2009, 57(9):3498 3511.
    [33] Li Shen, Tan Eng chong. Dimension Reduction based penalized logistic regression for cancer classification using microarray data, IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2005, 2(2):166 175.
    [34] Mika S., Ratsch G., Weston J., et al.. Fisher discriminant analysis with kernels, In IEEE Neural Networks for Signal Processing Workshop. 1999, 41 48.
    [35] Saunders C., Gammerman A., Vovk V.. Ridge regression learning algorithm in dual variables, ICML'98: Proceedings of the Fifteenth International Conference on Machine Learning. 1998, 515 521.
    [36] Cawley G. C., Nicola L. C.. Reduced rank kernel ridge regression, Neural Processing Letters. 2004,16(3): 293 302.
    [37] R?nnar S., Lindgren F., Geladi P. et al.. A PLS kernel algorithm for data sets with many variables and fewer objects, Part 1: Theory and Algorithm. Chemometrics and Intelligent Laboratory Systems. 1994, 8(4):111 125.
    [38] Rosipal R. and Trejo L. J.. Kernel partial least squares regression in Reproducing kernel Hilbert space, Journal of Machine Learning Research. 2002, 2(2):97 123.
    [39] Tipping M. E.. Sparse Bayesian Learning and the relevance vector machine, Journal of Machine Learning Research. 2001, 1(3):211 244.
    [40] Donoho, D. L.. Compress sensing, IEEE Transactions on Information Theory. 2006, 52(4): 1289 1306.
    [41] Li Y. Q., Cichocki A., Amari S.. Analysis of sparse representation and blind source separation, Neural Computation. 2004, 16(6):1193 1234.
    [42] Federico G.. An equivalence between sparse approximation and support vector machines, Neural Computation. 1998, 10(6):1455 1480.
    [43] Ben Hur A., Horn D., Siegelmann H.T. et al.. Support vector clustering, Journal of Machine Learning Research. 2001, 2(12):125 137.
    [44] Yang J. H., Estivill Castro V., Stephan K.. Support vector clustering through proximity graph modeling, ICONIP’02: Proceedings of 9th International Conference on Neural Information Processing. 2002, 898 903.
    [45] Mamoun Awad, Latifur Khan and Farokh Bastani. An effective support vector machines (SVM) performance using hierarchical clustering, ICTAI'04: Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence, IEEE Computer Society. Washington D.C. USA. 2004, 663 667.
    [46] Lee J., Lee D.. An improved cluster labeling method for support vector clustering, IEEE Transactions on Pattern Analysis and Machine Intelligence. 2005, 27(3): 461 464.
    [47] Finley T., Joachims T.. Supervised clustering with support vector machines, Proceedings of the 22nd international conference on Machine learning, Bonn Germany, New York: ACM International Conference Proceeding Series. 2005, 119:217 224.
    [48] Winters Hilt S., Yelundur A., McChesney C.. Support vector machine implementations for classiication and clustering, BMC Bioinformatics. 2006, 7.
    [49] Nath, J., Saketha, Shevade, S. K. An efficient clustering scheme using support vector methods, Pattern Recogniton. 2006, 39(8):1473 1480.
    [50] Manuele B., Figueiredo, Mario A. T.. Soft clustering using weighted one class support vector machines, Pattern Recogniton. 2009, 42(1):27 32.
    [51] Bennett K., Demiriz A.. Semi supervised support vector machines, Advances in Neural Informaton Processing Systems II. Cambridge MA: MIT Press. 1999, 368 374.
    [52] Fung G., Mangasarian O. L.. Semi supervised support vector machines for unlabeled data classification, Optimization Methods and Software. 2001, 15(1):29 44.
    [53] Collobert R., Sinz F., Weston J.. Large scale transductive SVMs, Journal of Machine Learning Research. 2006, 7(6):1687 1712.
    [54] Chapelle O., Sindhwani V., Keerthi S.. Optimization techniques for Semi supervised support vector machines, Journal of Machine Learning Research. 2008, 9(2):203 233.
    [55] Zhang Rui, Wang Wen jian, Ma Yi chen. Least square transduction support vector machine, Neural Processing Letters. 2009, 29(2):133 142.
    [56] Silva, C., Santos, J.S., Wanner, E.F. et al.. Semi supervised training of least squares support vector machine using a multi objective evolutionary algorithm, Evolutionary Computation. 2009, (5):2996 3002.
    [57] Chakraborty S.. Bayesian semi supervised learning with support vector machine, Statistical Methodology. (in press). DOI:10.1016/j.stamet.2009.09.002.
    [58] Boser B., Guyon I., Vapnik V. N.. A training algorithm for optimal margin classifiers, Proceedings of 5th Annual Workshop Computation on Learning Theory. Pittsburgh PA: ACM, 1992.
    [59] Osuna E., Freund R., Girosi F.. An improved training algorithm for support vector machines, Proceedings of the 1997 IEEE Workshop on Neural Networks for Signal Processing. NewYork: IEEE Press. 1997, 276 285.
    [60] Joachims T.. Making large scale SVM learning practical, In: Burges C. and Scholkopf B.. Advances in Kernel Methods: Support Vector Learning. Cambridge, MA: MIT press. 1998.
    [61] Joachims T.. SVMlight, http://svmlight.joachims.org/. 1998.
    [62] Lin C. J.. On the convergence of the decomposition method for support vector machines, IEEE Transactions on Neural Networks. 2001, 12(6):1288 1298.
    [63] Hsu C. W., Lin C. J.. A simple decomposition method for support vector machines, Machine Learning. 2002, 46(1): 291 314.
    [64] Platt J. C.. Fast training of support vector machines using sequential minimal optimization. In:Sch?lkopf B., Burges C. J. and Smola A. J.(ed.). Advance in Kernel Methods Support Vector Learning, Cambridge, MA: MIT Press. 1999, 185 208.
    [65] Platt J. C.. Using analytic QP and sparseness to speed training of support vector machines. In M. Kearns, S. Solla and D.Cohn, Advances in Neural Information Processing Systems II. Cambridge MA: MIT Press. 1999, 557 563.
    [66] Keerthi S. S., Shevade S. K., Bhattacharyya C., et al.. Improvements to Platt's SMOalgorithm for SVM classier design, Technical report. National University of Singapore. 1999.
    [67] Cao L. J., Keerthi S.S., Ong C.J. et al.. Developing parallel sequential minimal optimization for fast training support vector machine, Neurocomputing. 2006, 70(1/3): 93 104.
    [68] Lin C. J.. Asymptotic convergence of an SMO algorithm without any assumptions, IEEE Transactions on Neural Networks. 2002, 13(1): 248 250.
    [69] Keerthi S. S., Gilbert E. G.. Convergence of a generalized SMO algorithm for SVM classifier design, Machine Learning. 2002, 46(1/3): 351 360.
    [70] Takahashi N., Nishi T.. Rigorous proof of termination of SMO algorithm for support vector machines, IEEE Transactions on Neural Network. 2005, 16(3):774 776.
    [71] Chapelle O.. Training a support vector machine in the primal, Neural Computation. 2007, 19(5):1155 1178.
    [72] Keerthi S. S., Shevade S. K., Bhattacharyya C. et al. A fast iterative nearest point algorithm for support vector machine classifier design, IEEE Trans. Neural Networks. 2000, 11(1):124 136.
    [73] Laskov P.. An improved decomposition algorithm for regression support vector machines. Advances in Neural Information Processing System II. Cambridge MA: MIT Press. 2000, 484 490.
    [74] Hevade S. K., Keerthi S. S., Bhattacharyya C., et al. Improvements to the SMO algorithm for SVM regression. IEEE Transactions on Neural Networks, 2000, 11(5): 1188 1193.
    [75] Sch?lkopf B., Smola A. J., Williamson R. C. et al.. New Support Vector Algorithms, Neural Computation. 2000, 12(5):207 1245.
    [76] Liao S. P., Lin H. T., Lin C. J.. A note on the decomposition methods for support vector regression. Technical Report, National Taiwan University. 2001.
    [77] Flake G.W., Lawrence S.. Efficient SVM regression training with SMO, Machine Learning. 2002, 46(1/3):271 290.
    [78] Quan Y., Yang J., Yao L. X.. An improved way to make large scale SVR learning practical. EURASIP Journal on Applied Signal Processing. 2004, 8(1):135 1141.
    [79] Chang C.C., Lin C.J.. LIBSVM A Library for Support Vector Machines[CP], 2007. http://www.csie.ntu.edu.tw/~cjlin/libsvm/index.htm.
    [80]孙剑,郑南宁,张志华.一种训练支持向量机的改进序贯最小化算法,软件学报.2002,13(10):2007 2013.
    [81]张浩然,韩正之.回归支持向量机的改进序列最小优化学习算法,软件学报.2003,14(12):2006 2013.
    [82] Sch?lkopf B., Smola A. J., Williamson R. C. et al.. New support vector algorithms, Neural Computation. 2000, 12(5):1207–1245.
    [83] Ikeda, K., Murata, N.. Geometrical properties of nu support vector machines with different norms, Neural Computing. 2005, 17(11):2508 2529.
    [84] Ikeda K. Effects of kernel function on nu support vector machines in extreme cases, IEEE Transactions on Neural Networks. 2006, 17(1):166 172.
    [85] Steinwart I.. On the Optimal parameter scoice forνsupport vector machines, IEEE Transactions on Pattern Analysis and Machine Intelligence. 2003, 25(10):1274 1284.
    [86] Wu Qi. Regression application based on Fuzzyνsupport vector machine in symmetric rriangular fuzzy space, Expert Systems with Applications, (In Press). 2009.
    [87] Chang C. C., Lin C. J.. Trainingνsupport ector classifiers: theory and algorithms, Neural Computation. 2001, 13(9):2119 2147.
    [88] Zhou Wei da, Zhang Li, Jiao Li cheng. Linear programming support vector machines, Pattern Recognition. 2002, 35(12):2927 2936.
    [89] Torii Y., Abe S.. Fast Training of linear programming support vector machines using decomposition techniques, LNCS, Springer Berlin Heidelberg. 2006. DOI: 10.1007/11829898_15.
    [90] Fletcher R.. Practical Methods of Optimization. New York: Wiley. 1987.
    [91]朱永生,王成栋,张优云.二次损失函数支持向量机性能的研究,计算机学报.2003, 26(8): 982 989.
    [92] Franc V., Hlavá? V.. An Iterative Algorithm learning the maximal margin classifier, Pattern Recognition. 2003, 36(9):1985 1996.
    [93] Ferris M. C., Munson T. S.. Semi smooth support vector machines, Mathematic Programming Ser. B.. 2004, 67(2):185–204.
    [94] Mangasarian O. L., Musicant D. R.. Successive over relaxation for support vector machines, IEEE Transactions on Neural Network. 1999, 10(5):1032 1037.
    [95] Mangasarian O. L.. Generalized support vector machine. Advances in Large Margin Classifiers, Smola A. J., Bartlett P., Schokopf B. and Schuurmans D. editors, Cambridge: MIT Press. 2000, 135 146.
    [96] Mangasarian O. L., Musicant D. R.. Data discrimination via nonlinear generalized support vector machines. In Ferris M. C., Mangasarian O. L. and Pang J. S. editors, Complementarity: Applications, Algorithms and Extensions, Kluwer Academic Publishers. 2001, 233 251.
    [97] Lee Y. J., Mangasarian O. L.. RSVM: Reduced support vector machines, In Proceedings of the SIAM International Conference on Data Mining. Chicago, Philadelphia:SIAM. 2001.
    [98] Mangasarian O. L., Musicant D. R.. Active set support vector machine classification, Advances in Neural Information Processing Systems, Cambridge: MIT Press. 2000, 77 583.
    [99] Mangasarian O. L., Musicant D. R.. Lagrangian support vector machines, Journal ofMachine Learning Research. 2001, 1:161 177.
    [100] Fung G., Mangasarian O. L.. Finite Newton method for Lagrangian support vector machine classification, Neuro computing. 2003, 55(1 2):39 55.
    [101] Duan Hua, Shao Xiao jian, Hou Wei zhen et al.. An incremental learning algorithm for Lagrangian support vector machines, Pattern Recognition Letters. 2009, 30(15):384 1391.
    [102] Lee Y. J., Mangasarian O. L.. SSVM: A smooth support vector machine, Computational Optimization and Applications. 2001, 20(1):5 22.
    [103]袁玉波,严杰,徐成贤.多项式光滑的支撑向量机,计算机学报.2005,28(1): 9 17.
    [104]熊金志,胡金莲,袁华强等.一类光滑支持向量机新函数的研究,电子学报. 2007, 35(2):366 370.
    [105] Wu Q., Liu S. Y., Zhang L. Y.. Adjustable entropy function method for support vector machines, Journal of Systems Engineering and Electronics. 2008, 19(5): 1029 1034.
    [106] Lee Y. J., Hsieh W. F., Huang C. F..εSSVR: A smooth support vector machine forεinsensitive regression, IEEE Transactions on Knowledge and data Engineering. 2005, 17(5): 678 685.
    [107] Suykens J. A. K.,Vandewalle J.. Least square support vector machine classifiers, Neural Processing Letters. 1999, 9(3):293 300.
    [108] Gestel T. V., Suykens J. A. K., Lanckriet G. et al. Bayesian framework for least squares support vector machine classifiers, Gaussian processes and kernel Fisher discriminate analysis, Neural Computation. 2002, 14(5):1115 1147.
    [109] Suykens J. A. K., Brabanter J. De, Lukas L. et al.. Weighted least squares support vector machines: robustness and sparse approximation, Neurocomputing. 2002, 48(1 4):85 105.
    [110] Chua K. S.. Efficient computations for large least square support vector machine classifiers, Pattern Recognition Letters. 2003, 24(1/3):75 80.
    [111] Keerthi S. S., Shevade S. K.. SMO algorithm for least squares SVM formulations, Neural Computation. 2003, 15(2):487 507.
    [112] Tsujinishi D., Abe S.. Fuzzy least squares support vector machines for multi class problems, Neural Networks. 2003, 16(6):785 792.
    [113] Tony V., Suykens J. A. K.. Benchmarking Least Squares Support Vector Machine Classifiers, Machine Learning. 2004, 54(1):5–32.
    [114] Cawley G. C., Talbot N. L. C.. Fast exact leave one out cross validation of sparse least squares support vector machines, Neural Networks. 2004, 17(10):1467 1475.
    [115] Zhou Li gang, Lai Kin keung, Yu Lean. Least squares support vector machines ensemble models for credit scoring, Expert Systems with Applications. 2010, 37(1):127 133.
    [116] Mitra V., Wang C. J., Banerjee S.. Text classification:A least square support vector machineMachine Learning Research. 2001, 1:161 177.
    [100] Fung G., Mangasarian O. L.. Finite Newton method for Lagrangian support vector machine classification, Neuro computing. 2003, 55(1 2):39 55.
    [101] Duan Hua, Shao Xiao jian, Hou Wei zhen et al.. An incremental learning algorithm for Lagrangian support vector machines, Pattern Recognition Letters. 2009, 30(15):384 1391.
    [102] Lee Y. J., Mangasarian O. L.. SSVM: A smooth support vector machine, Computational Optimization and Applications. 2001, 20(1):5 22.
    [103]袁玉波,严杰,徐成贤.多项式光滑的支撑向量机,计算机学报.2005,28(1): 9 17.
    [104]熊金志,胡金莲,袁华强等.一类光滑支持向量机新函数的研究,电子学报. 2007, 35(2):366 370.
    [105] Wu Q., Liu S. Y., Zhang L. Y.. Adjustable entropy function method for support vector machines, Journal of Systems Engineering and Electronics. 2008, 19(5): 1029 1034.
    [106] Lee Y. J., Hsieh W. F., Huang C. F..εSSVR: A smooth support vector machine forεinsensitive regression, IEEE Transactions on Knowledge and data Engineering. 2005, 17(5): 678 685.
    [107] Suykens J. A. K.,Vandewalle J.. Least square support vector machine classifiers, Neural Processing Letters. 1999, 9(3):293 300.
    [108] Gestel T. V., Suykens J. A. K., Lanckriet G. et al. Bayesian framework for least squares support vector machine classifiers, Gaussian processes and kernel Fisher discriminate analysis, Neural Computation. 2002, 14(5):1115 1147.
    [109] Suykens J. A. K., Brabanter J. De, Lukas L. et al.. Weighted least squares support vector machines: robustness and sparse approximation, Neurocomputing. 2002, 48(1 4):85 105.
    [110] Chua K. S.. Efficient computations for large least square support vector machine classifiers, Pattern Recognition Letters. 2003, 24(1/3):75 80.
    [111] Keerthi S. S., Shevade S. K.. SMO algorithm for least squares SVM formulations, Neural Computation. 2003, 15(2):487 507.
    [112] Tsujinishi D., Abe S.. Fuzzy least squares support vector machines for multi class problems, Neural Networks. 2003, 16(6):785 792.
    [113] Tony V., Suykens J. A. K.. Benchmarking Least Squares Support Vector Machine Classifiers, Machine Learning. 2004, 54(1):5–32.
    [114] Cawley G. C., Talbot N. L. C.. Fast exact leave one out cross validation of sparse least squares support vector machines, Neural Networks. 2004, 17(10):1467 1475.
    [115] Zhou Li gang, Lai Kin keung, Yu Lean. Least squares support vector machines ensemble models for credit scoring, Expert Systems with Applications. 2010, 37(1):127 133.
    [116] Mitra V., Wang C. J., Banerjee S.. Text classification:A least square support vector machineand KRR. In Proceedings of IJCNN, Washington, DC, July 2001, 1486–1491.
    [134] Magill D.. Adaptive Minimum MSE Estimation, IEEE Transactions on Information Theory. 1963, 9(4):289 289.
    [135] Fung G., Mangasarian O. L.. Proximal support vector machine classifiers. Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA. August 2001, 77 82.
    [136] Fung G.. (Matlab) software for PSVM. 2008, http://www.cs.wisc.edu/dmi/svm/psvm/.
    [137] Reshma K., Jayadeva S. C.. Knowledge based proximal support vector machines, European Journal of Operational Research. 2009, 195(3):914 923.
    [138] Ye Qiao lin, Ye Ning. Improved proximal support vector machine via generalized eigenvalues, International Joint Conference on Computational Sciences and Optimization. 2009, 1:705 709.
    [139] Yang Xu bing, Chen Song can, Chen Bin et al.. Proximal support vector machine using local information, Neurocomputing. (In Press), 2009.
    [140] Agarwal D. K.. Shrinkage estimator generalization of proximal support vector machine. Proceeding of the 8th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Edmonton, Canada. 2002, 173 182.
    [141] Fung G., Mangasarian O. L.. Multicategory proximal support vector classifiers, Machine Learning. 2005, 59(1 2):77 97.
    [142] Li K., Huang H.. Incremental learning proximal support vector machine classifiers, Proceedings of the 1st International Conference on Machine Learning and Cybernetics, Beijing. 2002, 3:1635 1637.
    [143] Reshma J., Chandra S.. Fast and robust learning through fuzzy linear proximal support vector machines, Neurocomputing. 2004, 61(10):401 411.
    [144] Jayadeva K. R., Chandra S.. Fuzzy linear proximal support vector machines for multi category data classification, Neurocomputing. 2005, 67(8):426 435.
    [145] Saravanan N., Kumar V. N. S., Ramachandran K. I.. Fault diagnosis of spur bevel gear box using artificial neural network (ANN), and proximal support vector machine (PSVM), Applied Soft Computing. 2010, 10(1):344 360.
    [146] Manevitz L. M., Malik Y.. One class svms for document classification, The Journal of Machine Learning Research. 2002, 2(2):139 154.
    [147] Tran Quang anh, Li Xing, Duan Hai xin. Efficient performance estimate for one class support vector machine, Pattern Recognition Letters. 2005, 26(8):1174 1182.
    [148] Shin H. J., Eom Dong Hwan, Kim Sung Shick. One class support vector machines—an application in machine fault detection and classification, Computers & IndustrialEngineering. 2005, 48(2):395 408.
    [149] Hao Pei yi, Fuzzy one class support vector machines, Fuzzy Sets and Systems. 2008, 159(18):2317 2336.
    [150] Choi Young sik. Least squares one class support vector machine, Pattern Recognition Letters. 2009, 30(13):1236 1240.
    [151] Wu Roung shiunn, Chung Wen hsin, Ensemble one class support vector machines for content based image retrieval, Expert Systems with Applications. 2009, 36(3):4451 4459.
    [152] Inoue T., Abe, S.. Fuzzy support vector machines for pattern classification, (IJCNN'01) Proceedings of International Joint Conference on Neural Networks. 2001, 2:1449 1454.
    [153] Lin C. F., Wang S. D.. Fuzzy Support Vector Machines. IEEE Trans, on Neural Networks. 2002, 13(2):464 471.
    [154] Lin C. F., Wang S. D.. Training algorithms for fuzzy support vector machines with noisy data, Pattern Recognition Letters. 2004, 25(14):1647 1656.
    [155] Jiang X. F., Yi Z., Lv J. C.. Fuzzy SVM with a new fuzzy membership function, Neural Computation and Application. 2006, 3(15):268 276.
    [156]张翔,肖小玲,徐光祐.基于样本之间紧密度的模糊支持向量机方法,软件学报.2006, 17(5):951 958.
    [157]安金龙,王正欧,马振平.基于密度法的模糊支持向量机,天津大学学报.2004,37(6): 544 548.
    [158] Tao Qing, Wang Jue. A new fuzzy support vector machine based on the weighted margin, Neural Processing Letters. 2004, 20(3):139 150.
    [159] Tao Qing. Posterior probability support vector machines for unbalanced data, IEEE Transactions on Neural Networks. 2005, 16(6):1561 1573.
    [160] Lee K. Y., Kim D. W., Lee K. H. et al. Possibilistic support vector machines, Pattern Recognition. 2005, 38(8):1325 1327.
    [161] Lauer F., Bloch G.. Incorporating prior knowledge in support vector machines for classification: A review, Neuro computing. 2008, 71(7 9):1578 1594.
    [162] Bach F. R., Lanckriet G. R. G., Jordan M. I., Multiple kernel learning, conic duality and the SMO algorithm, Proceedings of the twenty first international conference on Machine learning, Banff, Alberta, Canada. 2004, 2 8. DOI:10.1145/1015330.1015424.
    [163] Lanckriet G. R. G., Cristianini N., Bartlett P. et al.. Learning the kernel matrix with semidefinite programming, Journal of Machine Learning Research. 2004, 5:27–72.
    [164] Qiu S., Lane T.. Multiple kernel learning for support vector regression, 2005. http://www.cs.unm.edu/~treport/tr/05-12/QiuLane.pdf.
    [165] Sonnenburg S., Rätsch G., Schäfer C. et al.. Large scale multiple kernel learning, Journal of Machine Learning Research. 2006, 7:1531–1565.
    [166] Zien A., Ong C. S.. Multiclass multiple kernel learning, Proceedings of the 24th International Conference on Machine Learning, Corvallis, Oregon. 2007, 1191–1198.
    [167] Huyen D., Alexandros K., Adam W. et al.. Margin and radius based multiple kernel learning, (ECML PKDD '09) Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, Springer-Verlag, Berlin, Heidelberg. 2009, 330–343.
    [168] Mercer J.. Functions of positive and negative type and their connection with the theory of integral equations, Phil. Trans. R. Soc. Lond. A. 1909, 209:415–446.
    [169] Aizerman M., Braverman E., Rozonoer L.. Theoretical foundations of the potential function method in pattern recognition learning, Automation and Remote Control. 1964, 25:821–837.
    [170] Zhang L., Zhou W., Jiao L.. Wavelet support vector machine, IEEE Transactions on Systems, Man and Cybernetics, Part B. 2004, 34(1):34–39.
    [171] Yang Shu-yuan, Wang Min, Jiao Licheng. Ridgelet kernel regression, Neurocomputing. 2007, 70(16-18):3046–3055.
    [172] Haussler D.. Convolution kernels on discrete structures, Technical Report UCSC-CRL-99-10, University of California, Santa Cruz, Computer Science Department. July 1999.
    [173] Watkins C.. Dynamic alignment kernels, Technical Report CSD-TR-98-11, Royal Holloway, University of London, Computer Science Department. January 1999.
    [174] Leslie C., Eskin E., Noble W. S.. The spectrum kernel: a string kernel for SVM protein classification. In R. B. Altman, A. K. Dunker, L. Hunter, K. Lauderdale, and T. E. Klein, editors, Proceedings of the Pacific Symposium on Biocomputing, Kaua'i, Hawaii. 2002, 564–575.
    [175] Lafferty J., Lebanon G.. Diffusion kernels on statistical manifolds, Journal of Machine Learning Research. 2005, 6:129–163.
    [176] Heisele B., Serre T., Prentice S. et al.. Hierarchical classification and feature reduction for fast face detection with SVM, Pattern Recognition. 2003, 36(9):2007–2017.
    [177] Chuang C. F., Shih F. Y.. Recognizing facial action units using independent component analysis and support vector machine, Pattern Recognition. 2006, 39(9):1795–1798.
    [178] Neumann J., Schnorr C., Steidl G.. Combined SVM-based feature selection and classification, Machine Learning. 2005, 61:129–150.
    [179] Zhang X., Lu X., Shi Q. et al.. Recursive SVM feature selection and sample classification for mass-spectrometry and microarray data, BMC Bioinformatics. 2006, 7(4):197–219.
    [180] Chang Qun, Wang Xiaolong, Lin Yimeng et al. Support vector classification and multi-width Gaussian kernels (in Chinese), Acta Electronica Sinica. 2007, 35(3):484–487.
    [181] Zhang L., Zhou W., Jiao L.. Decision tree support vector machine, International Journal on Artificial Intelligence Tools. 2007, 16(1):1–16.
    [182] Comak E., Arslan A.. A new training method for support vector machines: clustering k-NN support vector machines, Expert Systems with Applications. 2008, 35(3):564–568.
    [183] Li R., Cui Y. M., He H. et al.. Application of support vector machine combined with k-nearest neighbors in solar flare and solar proton events forecasting, Advances in Space Research. 2008, 42(9):1469–1474.
    [184] Pavlov D., Mao J., Dom B.. Scaling-up support vector machines using boosting algorithm, In Proceedings of the 2000 International Conference on Pattern Recognition. 2000.
    [185] Kim H. C., Pang S., Je H. M. et al.. Pattern classification using support vector machine ensemble, In Proceedings of the International Conference on Pattern Recognition. 2002, 2:160–163.
    [186] Matic N., Guyon I., Denker J. et al.. Writer adaptation for on-line handwritten character recognition, In: 2nd International Conference on Pattern Recognition and Document Analysis. 1993, 187–191.
    [187] Shilton A., Palaniswami M., Ralph D. et al.. Incremental training of support vector machines, IEEE Transactions on Neural Networks. 2005, 16(1):114–131.
    [188] Shinya K., Shigeo A.. Incremental training of support vector machines using hyperspheres, Pattern Recognition Letters. 2006, 27(13):1495–1507.
    [189] Kazushi I., Takemasa Y.. Incremental support vector machines and their geometrical analyses, Neurocomputing. 2007, 70(13-15):2528–2533.
    [190] Liang Zhi-zheng, Li You-fu. Incremental support vector machine learning in the primal and applications, Neurocomputing. 2009, 72(10-12):2249–2258.
    [191] Burges C. J.. A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery. 1998, 2(2):121–167.
    [192] Byun H., Lee S. W.. Applications of support vector machines for pattern recognition: a survey, Proceedings of the First International Workshop on Pattern Recognition with Support Vector Machines. 2002, 8(1):213–236.
    [193] Osuna E., Freund R., Girosi F.. Training support vector machines: an application to face detection, In: Proceedings of CVPR'97, Puerto Rico. 1997.
    [194] Pontil M., Verri A.. Support vector machines for 3D object recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence. 1998, 20(6):637–645.
    [195] Joachims T.. Text categorization with support vector machines: learning with many relevant features, European Conference on Machine Learning (ECML). 1998.
    [196] Bahlmann C., Haasdonk B., Burkhardt H.. On-line handwriting recognition with support vector machines – a kernel approach, Proceedings of the Eighth International Workshop on Frontiers in Handwriting Recognition. 2002.
    [197] Gao Xue, Jin Lianwen, Yin Junxun et al. A handwritten Chinese character recognition method based on support vector machines (in Chinese), Acta Electronica Sinica. 2002, 30(5):651–654.
    [198] Yao Y., Frasconi P., Pontil M.. Fingerprint classification with combinations of support vector machines, Proceedings of the Third International Conference on Audio- and Video-Based Biometric Person Authentication. 2001, 253–258.
    [199] Sun A., Lim E. E. P., Ng W. K.. Web classification using support vector machine, Proceedings of the 4th International Workshop on Web Information and Data Management, McLean, Virginia, USA. 2002.
    [200] Rao Xian, Dong Chunxi, Yang Shaoquan. An intrusion detection system based on support vector machines (in Chinese), Journal of Software. 2003, 14(4):798–803.
    [201] Perez-Cruz F., Bousquet O.. Kernel methods and their potential use in signal processing, IEEE Signal Processing Magazine. 2004, 21(3):57–65.
    [202] Wan V., Campbell W. M.. Support vector machines for speaker verification and identification, Proceedings of the IEEE Workshop on Neural Networks for Signal Processing. 2000, 775–784.
    [203] Zhou Weida, Zhang Li, Jiao Licheng. Adaptive support vector machine multiuser detection (in Chinese), Acta Electronica Sinica. 2003, 31(1):92–97.
    [204] Zhang Chuncheng, Zhou Zheng'ou. Target classification and recognition for shallow-layer ground-penetrating radar based on support vector machines (in Chinese), Acta Electronica Sinica. 2005, 33(6):1091–1094.
    [205] Chapelle O., Haffner P., Vapnik V. N.. Support vector machines for histogram-based image classification, IEEE Transactions on Neural Networks. 1999, 10(5):1055–1064.
    [206] Hong P., Tian Q., Huang T. S.. Incorporate support vector machines to content-based image retrieval with relevance feedback, IEEE International Conference on Image Processing. 2000.
    [207] Keren D., Osadchy M., Gotsman C.. Antifaces: a novel fast method for image detection, IEEE Transactions on Pattern Analysis and Machine Intelligence. 2001, 23(7):747–761.
    [208] Reyna R. A., Hernandez N. et al.. Segmenting images with support vector machines, In Proceedings of the International Conference on Image Processing, Vancouver, Canada. 2000, 1:820–823.
    [209] Tsai H. H., Sun D. W.. Color image watermark extraction based on support vector machines, Information Sciences. 2007, 177(2):550–569.
    [210] Li S., Kwok J. T., Zhu H. et al.. Texture classification using support vector machines, Pattern Recognition. 2003, 36(12):2883–2893.
    [211] Melgani F., Bruzzone L.. Classification of hyperspectral remote sensing images with support vector machines, IEEE Transactions on Geoscience and Remote Sensing. 2004, 42(8):1778–1790.
    [212] Chi M. M., Bruzzone L.. Semi-supervised classification of hyperspectral images by SVMs optimized in the primal, IEEE Transactions on Geoscience and Remote Sensing. 2007, 45(6):1870–1880.
    [213] Bhanu P. K. N., Ramakrishnan A. G., Suresh S. et al.. Fetal lung maturity analysis using ultrasound image features, IEEE Transactions on Information Technology in Biomedicine. 2002, 6(1):38–45.
    [214] Pan Chen, Yan Xiangguo, Zheng Chongxun et al. Blood cell image segmentation using one-class support vector machines (in Chinese), Journal of Xi'an Jiaotong University. 2005, 39(2):150–153.
    [215] Zhang Zhao, Zhang Su, Zhang Chen-xi. SVM for density estimation and application to medical image segmentation, Journal of Zhejiang University Science B. 2006, 7(5):365–372.
    [216] Brown M. P. S., Grundy W. N., Lin D. et al.. Knowledge-based analysis of microarray gene expression data using support vector machines, Proceedings of the National Academy of Sciences of the United States of America. 2000, 97(1):262–267.
    [217] Hua S., Sun Z.. Support vector machine approach for protein subcellular localization prediction, Bioinformatics. 2001, 17(8):721–728.
    [218] Comak E., Polat K., Gunes S.. A new medical decision making system: least squares support vector machine (LSSVM) with fuzzy weighting pre-processing, Expert Systems with Applications. 2007, 32:409–414.
    [219] Rychetsky M., Ortmann S., Glesner M.. Support vector approaches for engine knock detection, Proceedings of the International Joint Conference on Neural Networks. 1999, 969–974.
    [220] Ge M., Du R., Zhang G. et al.. Fault diagnosis using support vector machine with an application in sheet metal stamping operations, Mechanical Systems and Signal Processing. 2004, 18(1):143–159.
    [221] Xu Y., Wang L.. Fault diagnosis system based on rough set theory and support vector machine, Lecture Notes in Artificial Intelligence. 2005, 980–988.
    [222] Drezet P. M. L., Harrison R. F.. Support vector machines for system identification, UKACC International Conference on Control. 1998.
    [223] De Kruif B., De Vries T.. Support-vector-based least squares for learning non-linear dynamics, Proceedings of the 41st IEEE Conference on Decision and Control. 2002, 2:1343–1348.
    [224] Müller K. R., Smola A., Rätsch G.. Predicting time series with support vector machines, Proceedings of ICANN'97, Springer Lecture Notes in Computer Science. 1997, 999.
    [225] Francis E. H., Cao L. J.. Application of support vector machines in financial time series forecasting, Omega. 2001, 29(4):309–317.
    [226] Kim K. J.. Financial time series forecasting using support vector machines, Neurocomputing. 2003, 55(1/2):307–319.
    [227] Cui Wanzhao, Zhu Changchun, Bao Wenxing et al. Chaotic time series prediction based on fuzzy-model support vector machines (in Chinese), Acta Physica Sinica. 2005, 54(7):3009–3018.
    [228] Asuncion A., Newman D. J.. UCI Machine Learning Repository [http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA: University of California, Department of Information and Computer Science. 1998.
    [229] Golub G. H., Van Loan C. F.. Matrix Computations, 3rd edition, Baltimore and London: The Johns Hopkins University Press. 1996.
    [230] Zhang X. G.. Using class center vectors to build support vector machines, Neural Networks for Signal Processing IX, Proceedings of the 1999 IEEE Workshop, Madison, WI, USA. 1999, 33–37.
    [231] Tao Qing, Cao Jinde, Sun Demin. A regression method based on support vector machine classification (in Chinese), Journal of Software. 2002, 13(5):1024–1028.
    [232] Krishnapuram B., Carin L., Figueiredo M. A. et al.. Sparse multinomial logistic regression: fast algorithms and generalization bounds, IEEE Transactions on Pattern Analysis and Machine Intelligence. 2005, 27(6):957–968.
    [233] Cawley G. C., Talbot N. L.. Efficient approximate leave-one-out cross-validation for kernel logistic regression, Machine Learning. 2008, 71(2):243–264.
    [234] Adankon M. M., Cheriet M.. Model selection for the LS-SVM: application to handwriting recognition, Pattern Recognition. 2009, 42(12):3264–3270.
    [235] Cawley G. C., Talbot N. L.. Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers, Pattern Recognition. 2003, 36(11):2585–2592.
    [236] Kaariainen M.. Semi-supervised model selection based on cross-validation, International Joint Conference on Neural Networks, Vancouver, BC, Canada. 2006, 16–21.
    [237] Gold C., Sollich P.. Model selection for support vector machine classification, Neurocomputing. 2003, 55(1-2):221–249.
    [238] Huang C. M., Lee Y. J., Lin D. K. J. et al.. Model selection for support vector machines via uniform design, Computational Statistics and Data Analysis. 2007, 52(1):335–346.
    [239] Peng Xin-jun, Wang Yi-fei. A geometric method for model selection in support vector machine, Expert Systems with Applications, Part 1. 2009, 36(3):5745–5749.
    [240] Adankon M. M., Cheriet M.. Optimizing resources in model selection for support vector machine, Pattern Recognition. 2007, 40(3):953–963.
    [241] Lin S. W., Lee Z. J., Chen S. C. et al.. Parameter determination of support vector machine and feature selection using simulated annealing approach, Applied Soft Computing. 2008, 8(4):1505–1512.
    [242] Yuan Xiao-fang, Wang Yao-nan. Parameter selection of support vector machine for function approximation based on chaos optimization, Journal of Systems Engineering and Electronics. 2008, 19(1):191–197.
    [243] Lebrun G., Charrier C., Lezoray O. et al.. Tabu search model selection for SVM, International Journal of Neural Systems. 2008.
    [244] Escalante H. J., Montes M., Sucar L. E.. Particle swarm model selection, Journal of Machine Learning Research. 2009, 10:405–440.
    [245] Huang C. L., Wang C. J.. A GA-based feature selection and parameters optimization for support vector machines, Expert Systems with Applications. 2006, 31(2):231–240.
    [246] Gold C., Holub A., Sollich P.. Bayesian approach to feature selection and parameter tuning for support vector machine classifiers, Neural Networks. 2005, 18(5-6):693–701.
    [247] Gunter L., Zhu J.. Efficient computation and model selection for the support vector regression, Neural Computation. 2007, 19(6):1633–1655.
    [248] Chang M. W., Lin C. J.. Leave-one-out bounds for support vector regression model selection, Neural Computation. 2005, 17(5):1188–1222.
    [249] Gao J. B., Gunn S. R., Harris C. J.. A probabilistic framework for SVM regression and error bar estimation, Machine Learning. 2002, 46(1/3):71–89.
    [250] Momma M., Bennett K. P.. A pattern search method for model selection of support vector regression, In Proceedings of the SIAM Conference on Data Mining. 2002.
    [251] Duan K., Keerthi S. S., Poo A. N.. Evaluation of simple performance measures for tuning SVM hyperparameters, Neurocomputing. 2003, 51(4):41–59.
    [252] Chung K. M., Kao W. C., Sun C. L. et al.. Radius margin bounds for support vector machines with the RBF kernel, Neural Computation. 2003, 15(11):2643–2681.
    [253] Luntz A., Brailovsky V.. On estimation of characters obtained in statistical procedure of recognition (in Russian), Technicheskaya Kibernetica. 1969.
    [254] Kubat M., Matwin S.. Addressing the curse of imbalanced training sets: one-sided selection, In Proceedings of the 14th International Conference on Machine Learning, Morgan Kaufmann. 1997, 179–186.
    [255] Ling C. X., Li C.. Data mining for direct marketing: problems and solutions, Proceedings of the Fourth International Conference on Knowledge Discovery and Data Mining, New York. 1998.
    [256] Chawla N. V., Bowyer K. W., Hall L. O. et al.. SMOTE: synthetic minority over-sampling technique, Journal of Artificial Intelligence Research. 2002, 16:321–357.
    [257] Domingos P.. MetaCost: a general method for making classifiers cost-sensitive, KDD '99: Proceedings of the Fifth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 1999, 155–164.
    [258] Chew H. G., Bogner R. E., Lim C. C.. Target detection in radar imagery using support vector machines with training size biasing, In Proceedings of the 6th International Conference on Control, Automation, Robotics, and Vision, Singapore. 2000.
    [259] Chew H. G., Bogner R. E., Lim C. C.. Dual nu support vector machine with error rate and training size biasing, In Proceedings of the 26th IEEE ICASSP 2001, Salt Lake City, USA. 2001, 1269–1272.
    [260] Gonzalez R. C.. Digital Image Processing (Chinese edition). Beijing: Publishing House of Electronics Industry. 2005.
    [261] Lee S. K., Lo C. S., Wang C. M.. A computer-aided design mammography screening system for detection and classification of microcalcifications, International Journal of Medical Informatics. 2000, 60(1):29–57.
    [262] Gulsrud T. O., Husoy J. H.. Optimal filter for detection of clustered microcalcifications, Proceedings of the 15th International Conference on Pattern Recognition. 2000, 1:508–511.
    [263] Li Huai, Wang Yue, Ray K. J. et al.. Computerized radiographic mass detection – part II: decision support by featured database visualization and modular neural networks, IEEE Transactions on Medical Imaging. 2001, 20(4):302–313.
    [264] Bazzani A., Bevilacqua A., Bollini D. et al.. An SVM classifier to separate false signals from microcalcifications in digital mammograms, Physics in Medicine and Biology. 2001, 46(6):1651–1663.
    [265] Papadopoulos A., Fotiadis D. I., Likas A.. An automatic microcalcification detection system based on a hybrid neural network classifier, Artificial Intelligence in Medicine. 2002, 25(2):149–167.
    [266] Tourassi G. D., Vargas-Voracek R., Catarious D. M. et al.. Computer-assisted detection of mammographic masses: a template matching scheme based on mutual information, Medical Physics. 2003, 30(8):2123–2130.
    [267] Thangavel K., Karnan M., Sivakumar R. et al.. Automatic detection of microcalcification in mammograms – a review, International Journal on Graphics, Vision and Image Processing. 2005, 5(5):31–61.
    [268] Wei Li-yang, Yang Yong-yi et al.. A study on several machine learning methods for classification of malignant and benign clustered microcalcifications, IEEE Transactions on Medical Imaging. 2005, 24(3):371–380.
    [269] El-Naqa I., Yang Yong-yi, Wernick M. N. et al.. A support vector machine approach for detection of microcalcifications, IEEE Transactions on Medical Imaging. 2002, 21(12):1552–1563.
    [270] Li Ying, Jiang Jian-min. Combination of SVM knowledge for microcalcification detection in digital mammograms, IDEAL 2004, LNCS 3177. 2004, 359–365.
    [271] Seong H. P., Jin M. G., Chan H. J.. Receiver operating characteristic (ROC) curve: practical review for radiologists, Korean Journal of Radiology. 2004, 5(1):11–18.
    [272] University of South Florida. Digital Database for Screening Mammography (DDSM), http://marathon.csee.usf.edu/Mammography/Database.html, 2008.
    [273] Bornefalk H., Hermansson A. B.. On the comparison of FROC curves in mammography CAD systems, Medical Physics. 2005, 32(2):412–417.
    [274] Wang Y., Adali T.. Automatic threshold selection using histogram quantization, SPIE Journal of Biomedical Optics. 1997, 2(2):211–217.
    [275] Vapnik V., Chapelle O.. Bounds on error expectation for support vector machines, Neural Computation. 2000, 12(9):2013–2036.
    [276] Wahba G., Lin Y., Zhang H.. Generalized approximate cross-validation for support vector machines: another way to look at margin-like quantities. In A. Smola, P. Bartlett, B. Schölkopf and D. Schuurmans (Eds.), Advances in Large Margin Classifiers, Cambridge, MA: MIT Press. 2000, 297–309.
