Variable Samples Learning Least Square Support Vector Machine Algorithm
  • English title: Variable Samples Learning Least Square Support Vector Machine Algorithm
  • Authors: JIA Erkenbieke; YUAN Jie (加尔肯别克; 袁杰)
  • Affiliation: School of Electrical Engineering, Xinjiang University
  • Keywords: Least Squares Support Vector Machine (LS-SVM); sparseness; variable samples learning; pre-pruning; KKT condition
  • Journal: Computer Engineering (计算机工程; database code JSJC)
  • Publication date: 2019-01-15
  • Year: 2019
  • Volume/Issue: Vol. 45, No. 496, Issue 01
  • Pages: 198-204+211 (8 pages)
  • Article ID: JSJC201901032
  • CN: 31-1289/TP
  • Funding: National Natural Science Foundation of China (61863033); Natural Science Foundation of Xinjiang Uygur Autonomous Region (2016D01C032)
  • Language: Chinese
Abstract
To increase the sparseness of the solution of the Least Squares Support Vector Machine (LS-SVM) algorithm and improve its computational efficiency, a variable samples learning LS-SVM algorithm is proposed. Some samples are randomly drawn from the training set as the initial working set, and the training process is divided into two stages: sample increment and sample reduction. In the sample increment stage, specific samples are selected according to the KKT conditions, added to the working set, and trained. In the sample reduction stage, a negative slack variable pruning strategy and a pruning strategy based on the difference of the dual objective function are used to prune the working set. On this basis, the samples remaining in the working set are used to construct the classifier. Experimental results show that, compared with the SMO, SMO-new, and ISLS-SVM algorithms, the proposed algorithm offers higher sparsity and faster operation with no loss of accuracy.
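The two-stage procedure described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the paper's KKT-based selection rule is approximated here by picking the strongest violators of the margin condition y_i f(x_i) >= 1, and its two pruning strategies are replaced by classical smallest-|alpha| pruning; all function names and the parameters gamma, sigma, add_k, and prune_frac are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM dual linear system
    #   [0    y^T              ] [b    ]   [0]
    #   [y    Omega + I/gamma  ] [alpha] = [1]
    # with Omega_ij = y_i y_j K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    return sol[0], sol[1:]          # b, alpha

def decision(Xw, yw, alpha, b, X, sigma=1.0):
    # Decision values f(x) = sum_i alpha_i y_i K(x_i, x) + b
    return rbf_kernel(X, Xw, sigma) @ (alpha * yw) + b

def variable_sample_lssvm(X, y, init_frac=0.3, add_k=5, prune_frac=0.1,
                          max_rounds=20, gamma=10.0, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    # Initial working set: a random fraction of the training set
    work = set(rng.choice(n, size=max(2, int(init_frac * n)),
                          replace=False).tolist())
    for _ in range(max_rounds):
        W = np.array(sorted(work))
        b, alpha = lssvm_fit(X[W], y[W], gamma, sigma)
        # Decrement stage: drop the samples contributing least to the
        # solution (smallest |alpha|) -- a crude stand-in for the two
        # pruning strategies described in the abstract.
        k = int(prune_frac * len(W))
        if k:
            keep = np.argsort(-np.abs(alpha))[: len(W) - k]
            work = set(W[keep].tolist())
        # Increment stage: add the strongest violators of the margin
        # condition y_i f(x_i) >= 1 -- a stand-in for the KKT-based rule.
        rest = np.setdiff1d(np.arange(n), np.array(sorted(work)))
        viol = 1.0 - y[rest] * decision(X[W], y[W], alpha, b, X[rest], sigma)
        cand = rest[viol > 0]
        if cand.size == 0:
            break
        work |= set(cand[np.argsort(-viol[viol > 0])][:add_k].tolist())
    W = np.array(sorted(work))
    b, alpha = lssvm_fit(X[W], y[W], gamma, sigma)
    return W, alpha, b
```

Because the final classifier is built only from the samples remaining in the working set, the returned index array W is the sparse support of the model.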
References
[1]CHERKASSKY V.The nature of statistical learning theory[J].Technometrics,2002,38(4):409.
    [2]DENG Naiyang,TIAN Yingjie.Support vector machines:theory,algorithms and extensions[M].Beijing:Science Press,2009.(in Chinese)
    [3]DENG N,TIAN Y,ZHANG C.Support vector machines:optimization based theory,algorithms,and extensions[M].[S.l.]:Chapman and Hall/CRC,2012.
    [4]LI Qi,LI Xiaohang,XING Liping,et al.LS-SVR multiple kernel learning algorithm based on lp-norm constraint[J].Control and Decision,2015,30(9):1603-1608.(in Chinese)
    [5]BRABANTER K D,BRABANTER J D,SUYKENS J A K,et al.Optimized fixed-size kernel models for large data sets[J].Computational Statistics and Data Analysis,2010,54(6):1484-1504.
    [6]SUYKENS J A K,VANDEWALLE J.Least squares support vector machine classifiers[J].Neural Processing Letters,1999,9(3):293-300.
    [7]SUYKENS J A K,BRABANTER J D,LUKAS L,et al.Weighted least squares support vector machines:robustness and sparse approximation[J].Neurocomputing,2002,48(4):85-105.
    [8]SUYKENS J A K,LUKAS L,VANDEWALLE J.Sparse approximation using least squares support vector machines[C]//Proceedings of 2000 IEEE International Symposium on Circuits and Systems.Washington D.C.,USA:IEEE Press,2000:757-760.
    [9]SUYKENS J A K,LUKAS L,VANDEWALLE J.Sparse least squares support vector machine classifiers[C]//Proceedings of European Symposium on Artificial Neural Networks.Berlin,Germany:Springer,2000:37-42.
    [10]KRUIF B J D,VRIES T J A D.Pruning error minimization in least squares support vector machines[J].IEEE Transactions on Neural Networks,2003,14(3):696-702.
    [11]ZENG X,CHEN X W.SMO-based pruning methods for sparse least squares support vector machines[M].Washington D.C.,USA:IEEE Press,2005.
    [12]LOPEZ L J,BRABANTER K D,DORRONSORO J R,et al.Sparse LSSVMs with L0-norm minimization[C]//Proceedings of European Symposium on Artificial Neural Networks,Computational Intelligence and Machine Learning.Berlin,Germany:Springer,2011:189-194.
    [13]KAIZHU H.Sparse learning for support vector classification[J].Pattern Recognition Letters,2010,31(13):1944-1951.
    [14]QI L,LI X H,BA W.Sparse least squares support vector machine with L0-norm in primal space[C]//Proceedings of 2015 IEEE International Conference on Information and Automation.Washington D.C.,USA:IEEE Press,2015:2778-2783.
    [15]YANG Xiaowei,LU Jie,ZHANG Guangquan.An efficient pruning algorithm for least squares support vector machine classifiers[J].Journal of Computer Research and Development,2007,44(7):1128-1136.(in Chinese)
    [16]MA Yuefeng,LIANG Xun,ZHOU Xiaoping.A fast sparse algorithm for least squares support vector machine based on global representative points[J].Acta Automatica Sinica,2017,43(1):132-141.(in Chinese)
    [17]ZHOU Xinran,TENG Zhaosheng,YI Zhao.Fast pruning algorithm for constructing sparse least squares support vector machines[J].Electric Machines and Control,2009,13(4):626-630.(in Chinese)
    [18]CAUWENBERGHS G,POGGIO T.Incremental and decremental support vector machine learning[C]//Proceedings of International Conference on Neural Information Processing Systems.Cambridge,USA:MIT Press,2000:388-394.
    [19]KEERTHI S S,SHEVADE S.SMO algorithm for least squares SVM formulations[J].Neural Computation,2003,15(2):487-507.
    [20]ZHOU Zhihua.Machine learning[M].Beijing:Tsinghua University Press,2016.(in Chinese)
