Research on Several Issues of Support Vector Machines and Their Applications
Abstract
As a concrete implementation of the structural risk minimization principle, the support vector machine (SVM) offers global optimality, a simple structure, and strong generalization performance. It has become a focus of research in the machine learning community and has been applied successfully in many fields.
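For reference, the soft-margin SVM that realizes this principle can be written in its standard primal form (the textbook formulation, not anything specific to this thesis):

$$
\min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{\ell}\xi_{i}
\quad\text{s.t.}\quad y_{i}\left(w^{\top}\phi(x_{i})+b\right)\ \ge\ 1-\xi_{i},\qquad \xi_{i}\ \ge\ 0,\ \ i=1,\dots,\ell,
$$

where $\phi$ is the feature map induced by the kernel and $C>0$ balances the empirical risk term $\sum_i \xi_i$ against the capacity term $\lVert w\rVert^{2}$.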
This dissertation studies the following aspects of support vector machines:
(1) It is pointed out that the greedy sparsification algorithms for the least squares SVM (LS-SVM) tend to fall into local minima, so the resulting hyperplane is not sparse enough. An Invfitting rule is proposed that examines all support vectors during the iteration and deletes the one with the least influence on the decision function. Combining the Invfitting rule with the Backfitting rule, which adds support vectors incrementally, yields the HBILS-SVM algorithm, which is closer to the global optimum, reduces the number of support vectors, and produces a sparser hyperplane.
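A minimal sketch of the backward-pruning idea behind (1), assuming an RBF kernel and the standard LS-SVM linear system; the classical smallest-|α| criterion stands in here for the influence measure, while the actual Invfitting rule and its combination with Backfitting (HBILS-SVM) are as developed in the thesis:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM linear system [[0, 1^T], [1, K + I/gamma]]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                     # bias b, coefficients alpha

def backward_prune(X, y, keep=20):
    """Delete the support vector with the least influence, retrain, repeat."""
    idx = np.arange(len(y))
    while len(idx) > keep:
        _, alpha = train_ls_svm(X[idx], y[idx])
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
    return idx                                 # indices of surviving SVs
```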
(2) A shortcoming of the reduced convex hull (RCH) used in existing geometric SVM algorithms is analyzed: the RCH changes the shape of the convex hull of the training samples and provides only a necessary, not a sufficient, condition for the representation of its extreme points. The compressed convex hull (CCH) is introduced, which preserves the shape of the set and whose extreme points are easy to determine, and a CCH-based geometric SVM algorithm is discussed on this basis. In addition, exploiting the properties of the extreme points of the CCH, a probabilistically accelerated geometric algorithm is proposed to reduce the computation per iteration.
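The nearest-point view underlying (2) can be illustrated with a generic Gilbert-style iteration on the hard-margin problem; the CCH construction and the probabilistic acceleration are the thesis's own refinements and are not reproduced here:

```python
import numpy as np

def gilbert_nearest_point(P, iters=500, tol=1e-10):
    """Gilbert's algorithm: point of conv(P) closest to the origin."""
    w = P[0].astype(float).copy()
    for _ in range(iters):
        v = P[np.argmin(P @ w)]        # extreme point in direction -w
        d = w - v
        if d @ d < tol:
            break
        t = np.clip((w @ d) / (d @ d), 0.0, 1.0)
        w = (1.0 - t) * w + t * v      # line search toward the origin
    return w

# The maximum-margin direction is the nearest point to the origin of the
# Minkowski difference of the two class hulls (hypothetical X_pos, X_neg):
# D = (X_pos[:, None, :] - X_neg[None, :, :]).reshape(-1, X_pos.shape[1])
# w = gilbert_nearest_point(D)
```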
(3) TM-ν-SVM is proposed to remedy the difficulty of choosing the regularization parameter in TM-SVM. Bounds on the fractions of margin errors and of sub-support vectors of TM-ν-SVM are established, and the analysis shows that TM-ν-SVM can achieve better results than ν-SVM. The geometric meaning of TM-ν-SVM is also analyzed in detail: its optimization process is equivalent to finding the closest pair of points between two soft compressed convex hulls (SCCH) in feature space. The geometric properties of the SCCH are further discussed, and a corresponding geometric algorithm is given.
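TM-ν-SVM itself is the thesis's contribution; the role that ν plays there is inherited from the standard ν-SVM, which can be tried directly with scikit-learn's NuSVC (an off-the-shelf stand-in, not the thesis's algorithm):

```python
from sklearn.datasets import make_classification
from sklearn.svm import NuSVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
clf = NuSVC(nu=0.2, kernel="rbf", gamma="scale").fit(X, y)

# In nu-SVM, nu upper-bounds the fraction of margin errors and
# lower-bounds the fraction of support vectors.
print(len(clf.support_) / len(X))
```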
(4) The sample shift (SS) method for transforming support vector regression (SVR) into classification is discussed, and an empirical gradient-based sample shift (GSS) algorithm is given. Further, to reduce the influence of noise on the empirical gradient, an online feature-space gradient-based sample shift (OFGSS) algorithm is proposed by combining it with the geometric SVM algorithm. The method reduces the sensitivity of the regression function to the shift size, suppresses the effect of noise, and generalizes well.
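A linear-kernel toy sketch of the SS reduction in (4): every regression sample spawns one positive copy shifted up by ε and one negative copy shifted down by ε in the augmented (x, y) space, and the separating hyperplane is read back as a regression function (the GSS/OFGSS refinements, which choose the shift using gradient information, are as in the thesis):

```python
import numpy as np
from sklearn.svm import SVC

def ss_regression(X, y, eps=0.3, C=10.0):
    """Sample shift: reduce regression to binary classification."""
    Z = np.vstack([np.column_stack([X, y + eps]),    # label +1
                   np.column_stack([X, y - eps])])   # label -1
    t = np.hstack([np.ones(len(X)), -np.ones(len(X))])
    clf = SVC(kernel="linear", C=C).fit(Z, t)
    w, b = clf.coef_[0], clf.intercept_[0]
    # separating surface w_x . x + w_y * y + b = 0, solved for y(x)
    return lambda Xq: -(Xq @ w[:-1] + b) / w[-1]

X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
f = ss_regression(X, 2.0 * X.ravel() + 0.5)          # recovers ~ 2x + 0.5
```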
(5) The merits and drawbacks of the incremental SVM and of the geometric SVM algorithms are compared, and a method for determining SVM kernel parameters based on the geometric algorithm is explored. The method combines the advantages of the geometric algorithm with an approximate gradient computation with respect to the parameters, so that the optimal kernel parameters are obtained faster. This provides an effective route to model selection for SVM.
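To make the gradient-based model selection in (5) concrete, a small sketch that ascends a finite-difference gradient of kernel-target alignment with respect to the RBF width; alignment is used here only as a cheap smooth criterion, an assumption of this sketch rather than the thesis's exact objective:

```python
import numpy as np

def alignment(X, y, gamma):
    """Kernel-target alignment <K, yy^T> / (||K||_F * n), y in {-1, +1}."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    return (K * np.outer(y, y)).sum() / (np.linalg.norm(K) * len(y))

def tune_gamma(X, y, lg=0.0, lr=1.0, h=1e-3, steps=50):
    """Gradient ascent in log(gamma) with a central finite difference."""
    for _ in range(steps):
        g = (alignment(X, y, np.exp(lg + h)) -
             alignment(X, y, np.exp(lg - h))) / (2 * h)
        lg += lr * g
    return np.exp(lg)
```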
(6) After a review of the various learning algorithms for the transductive SVM (TSVM), an improved algorithm, SMTSVM, is proposed. SMTSVM introduces the sequential minimal optimization idea to estimate the Lagrange coefficients after the temporary label of a test sample is adjusted, yielding the new decision function and the adjusted estimate of the empirical error. This resolves the loss of classification accuracy caused by overly simple estimates of the empirical error.
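The adjustment loop in (6) can be caricatured by a Joachims-style self-labeling TSVM (below), which retrains after assigning temporary labels to the test set; SMTSVM instead updates the Lagrange coefficients via sequential minimal optimization rather than retraining from scratch:

```python
import numpy as np
from sklearn.svm import SVC

def self_labeling_tsvm(Xl, yl, Xu, C=1.0, Cu=0.1, iters=10):
    """Crude transductive loop: label the test points with the current
    classifier, then retrain with a reduced weight Cu on them."""
    clf = SVC(kernel="linear", C=C).fit(Xl, yl)
    for _ in range(iters):
        yu = clf.predict(Xu)                         # temporary labels
        X = np.vstack([Xl, Xu])
        y = np.hstack([yl, yu])
        w = np.hstack([np.ones(len(yl)),             # full weight: labeled
                       np.full(len(yu), Cu / C)])    # down-weighted: test
        clf = SVC(kernel="linear", C=C).fit(X, y, sample_weight=w)
    return clf
```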
(7) The application of SVM to predicting protein-protein interactions is discussed. New SVM input vectors are constructed from protein domain information and from residue sequence information, which allows protein-protein interactions and interaction sites to be predicted effectively. Numerical experiments show that the SVM predictor built on the discussed feature representations performs considerably better than other reported results.
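A toy version of the feature construction in (7), assuming Pfam-style domain annotations: each protein pair is encoded by the domain occurrences of both partners and fed to an SVM (the identifiers and data below are made up; the thesis's actual representations also use residue sequence information):

```python
import numpy as np
from sklearn.svm import SVC

DOMAINS = ["PF00001", "PF00002", "PF00017", "PF00069"]   # hypothetical vocabulary

def pair_vector(domains_a, domains_b):
    """Binary occurrence vector over the domain vocabulary, per partner."""
    va = np.array([d in domains_a for d in DOMAINS], dtype=float)
    vb = np.array([d in domains_b for d in DOMAINS], dtype=float)
    return np.concatenate([va, vb])

# toy pairs: (domains of protein A, domains of protein B, interacts?)
pairs = [({"PF00017"}, {"PF00069"}, 1),
         ({"PF00001"}, {"PF00002"}, 0),
         ({"PF00017", "PF00001"}, {"PF00069"}, 1),
         ({"PF00002"}, {"PF00002"}, 0)]
X = np.array([pair_vector(a, b) for a, b, _ in pairs])
y = np.array([label for _, _, label in pairs])
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
```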
