Research on Key Problems of Twin Support Vector Machines
Abstract
Twin Support Vector Machines (TWSVM) is a new machine learning method developed on the basis of Support Vector Machines (SVM). For classification problems, TWSVM seeks a pair of nonparallel separating hyperplanes; for regression problems, it generates a pair of nonparallel functions on either side of the training points, which determine the insensitive upper and lower bounds of the regression function, respectively. TWSVM resembles SVM in form, but it learns roughly four times faster, because it solves two smaller quadratic programs, each involving the constraints of only about half the samples, instead of one large program. Thanks to this excellent learning performance, TWSVM has become a research hotspot in machine learning. However, being a relatively new theory in the field, it is still immature and incomplete in many respects and needs further study and improvement; the study of its learning algorithms, in particular, is one of the key and difficult points of the theory. This dissertation studies TWSVM mainly from the aspects of improving generalization performance, increasing learning speed, and enhancing the robustness of the learning process. The specific research contents are as follows:
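For reference, the two nonparallel hyperplanes x^T w_1 + b_1 = 0 and x^T w_2 + b_2 = 0 of the linear TWSVM classifier are obtained from the standard pair of quadratic programs below, where the rows of A and B are the samples of the two classes, e_1 and e_2 are vectors of ones, and c_1, c_2 > 0 are penalty parameters:

\[
\min_{w_1, b_1, \xi}\ \frac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 e_2^{\top}\xi
\quad \text{s.t.} \quad -(B w_1 + e_2 b_1) + \xi \ge e_2,\ \xi \ge 0,
\]
\[
\min_{w_2, b_2, \eta}\ \frac{1}{2}\|B w_2 + e_2 b_2\|^2 + c_2 e_1^{\top}\eta
\quad \text{s.t.} \quad (A w_2 + e_1 b_2) + \eta \ge e_1,\ \eta \ge 0.
\]

A new point x is assigned to the class whose hyperplane is nearer, i.e. the one minimizing |x^{\top} w_i + b_i| / \|w_i\|. Each program involves the constraints of only one class, which is the source of the speed advantage noted above.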
     1. New methods for smooth twin support vector machines are studied. To remedy the low approximation accuracy of the Sigmoid smoothing function used in existing smooth twin support vector classifiers, the Chen-Harker-Kanzow-Smale (CHKS) function, which has stronger approximation ability, is adopted as the smoothing function, and a smooth CHKS twin support vector classification model is proposed (the two smoothing functions are compared in the sketch after this item). Further, since existing smooth twin support vector classifiers do not account for the influence of sample position on performance, a membership function is designed that assigns each sample a different weight according to its position; on this basis, a weighted smooth CHKS twin support vector classification model is proposed, and its convergence is proved theoretically. Finally, the proposed algorithm is extended to regression problems: with discrete Particle Swarm Optimization (PSO) used to optimize the algorithm parameters and select features simultaneously, a smooth CHKS twin support vector regression machine based on discrete PSO model selection is proposed, and its arbitrary-order smoothness and convergence are proved theoretically.
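Both smoothing candidates approximate the plus function x_+ = max(x, 0), the source of non-smoothness in the unconstrained TWSVM objective. For smoothing parameters k > 0 and \mu > 0,

\[
p(x, k) = x + \frac{1}{k}\ln\bigl(1 + e^{-kx}\bigr) \quad \text{(Sigmoid-based smooth function)},
\]
\[
\phi(x, \mu) = \frac{x + \sqrt{x^2 + 4\mu^2}}{2} \quad \text{(CHKS function)}.
\]

Both recover x_+ in the limit (k \to \infty and \mu \to 0, respectively), and the CHKS gap is uniformly bounded, |\phi(x, \mu) - x_+| \le \mu for all x, which is the sharper-approximation property exploited here.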
     2. An unconstrained nondifferentiable approximate solution method for the TWSVM model is studied. Based on the Karush-Kuhn-Tucker (KKT) complementarity conditions of optimization theory, an unconstrained nondifferentiable optimization model of the twin support vector classifier is established, and an adaptively adjusted maximum entropy function method, which can solve nondifferentiable optimization problems directly, is adopted to solve the proposed model. This method can approach the optimal solution with a relatively small parameter value, overcoming the drawback of the traditional maximum entropy function method, which must take a very large parameter value to approach the optimum and may consequently suffer numerical overflow (see the sketch after this item). Finally, the algorithm is extended to regression problems, and a twin support vector regression model based on the adaptively adjusted maximum entropy function method is proposed.
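Concretely, the maximum entropy function replaces a nondifferentiable maximum by a smooth log-sum-exp surrogate: for functions f_1, ..., f_m and parameter p > 0,

\[
F_p(x) = \frac{1}{p}\ln\sum_{i=1}^{m} e^{\,p f_i(x)},
\qquad
\max_i f_i(x) \le F_p(x) \le \max_i f_i(x) + \frac{\ln m}{p},
\]

so the approximation error decays like (\ln m)/p, and the large p needed for accuracy is exactly what makes e^{p f_i} overflow. A minimal overflow-safe evaluation in Python (the standard shifted log-sum-exp form, shown for illustration; this is not the dissertation's adaptive adjustment scheme itself):

    import numpy as np

    def entropy_max(f_vals, p):
        """Smooth maximum entropy approximation of max(f_vals), shifted so
        that exp() never receives a large positive argument (no overflow)."""
        f = np.asarray(f_vals, dtype=float)
        f_max = f.max()
        return f_max + np.log(np.exp(p * (f - f_max)).sum()) / p

    # e.g. entropy_max([1.0, 2.0, 3.0], p=200.0) ~ 3.0, at any sample scale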
     3. The least squares twin support vector regression machine and its feature selection algorithm are studied. To improve the computational efficiency of twin support vector regression (TSVR), the least squares idea is introduced: the inequality constraints of the two TSVR quadratic programs are replaced by equality constraints and substituted into the objective functions, which converts the two quadratic programs of TSVR into two systems of linear equations; the resulting learning algorithm is the least squares TSVR (Least Squares TSVR, LSTSVR). Theoretical analysis shows that in the linear case the computational complexity of LSTSVR depends only on the dimension of the samples, so LSTSVR provides an effective method for large-sample problems (a numpy sketch follows this item). On this basis, to make LSTSVR more efficient on high-dimensional problems, an LSTSVR feature selection algorithm is proposed. First, replacing the 2-norm terms of LSTSVR with 1-norm ones converts the two systems of linear equations into two linear programming problems. Second, by solving an exterior penalty problem of each linear program's dual with the fast-converging generalized Newton method, the original problem again reduces to solving systems of linear equations. Besides retaining the advantages of LSTSVR, the method is fast and yields very sparse solutions; for linear problems this means the method selects input features automatically, achieving dimensionality reduction.
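A minimal numpy sketch of this least squares reduction, assuming a ridge-regularized form of the two primal problems (the dissertation's exact objective may differ): each bound function comes from one (n+1)-by-(n+1) linear system, so in the linear case the dominant cost depends on the feature dimension n rather than the sample count m.

    import numpy as np

    def lstsvr_fit(A, y, eps1=0.1, eps2=0.1, c1=1.0, c2=1.0):
        """Least squares twin SVR sketch: two regularized linear systems
        in place of the two TSVR quadratic programs (assumed ridge-style
        formulation, for illustration only)."""
        m, n = A.shape
        G = np.hstack([A, np.ones((m, 1))])   # augmented data matrix [A, e]
        H = G.T @ G                           # (n+1) x (n+1) normal matrix
        # down-bound f1 tracks y - eps1, up-bound f2 tracks y + eps2
        u1 = np.linalg.solve(H + np.eye(n + 1) / c1, G.T @ (y - eps1))
        u2 = np.linalg.solve(H + np.eye(n + 1) / c2, G.T @ (y + eps2))
        return (u1[:-1], u1[-1]), (u2[:-1], u2[-1])   # (w1, b1), (w2, b2)

    def lstsvr_predict(X, w1, b1, w2, b2):
        """Standard TSVR estimator: the mean of the two bound functions."""
        return 0.5 * (X @ (w1 + w2) + b1 + b2)

    # usage: (w1, b1), (w2, b2) = lstsvr_fit(A, y)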
     4. The least squares twin parametric insensitive support vector regression machine is studied. First, by introducing the least squares method, the two quadratic programs of twin parametric insensitive support vector regression (Twin Parametric Insensitive Support Vector Regression, TPISVR) are converted into two systems of linear equations, yielding the least squares TPISVR (Least Squares TPISVR, LSTPISVR), and the computational complexity of LSTPISVR is analyzed theoretically. Second, since LSTPISVR involves a fairly large number of parameters, a chaotic cuckoo optimization algorithm with fast global search ability is proposed and used as the parameter selection method for LSTPISVR, improving the efficiency of its parameter tuning (a sketch follows this item).
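The abstract does not detail the chaotic variant; below is a minimal sketch assuming the common construction, in which a logistic chaotic map replaces uniform random initialization and the search itself follows Yang and Deb's cuckoo scheme of Levy-flight moves plus abandonment of poor nests. Names such as chaotic_cuckoo_min and cv_error are illustrative only.

    import numpy as np
    from math import gamma, sin, pi

    rng = np.random.default_rng(0)

    def levy_steps(shape, beta=1.5):
        """Levy-stable step lengths via Mantegna's algorithm."""
        sigma = (gamma(1 + beta) * sin(pi * beta / 2)
                 / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0.0, sigma, shape)
        v = rng.normal(0.0, 1.0, shape)
        return u / np.abs(v) ** (1 / beta)

    def chaotic_cuckoo_min(f, lo, hi, n_nests=15, iters=200, pa=0.25):
        """Minimize f over the box [lo, hi]; logistic-map initialization."""
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        dim = lo.size
        x = rng.uniform(0.1, 0.9, (n_nests, dim))
        for _ in range(50):                   # iterate the chaotic map
            x = 4.0 * x * (1.0 - x)
        nests = lo + x * (hi - lo)
        fit = np.apply_along_axis(f, 1, nests)
        for _ in range(iters):
            best = nests[fit.argmin()]
            # Levy-flight move toward the current best nest
            cand = np.clip(nests + 0.01 * levy_steps((n_nests, dim)) * (nests - best), lo, hi)
            cfit = np.apply_along_axis(f, 1, cand)
            improved = cfit < fit
            nests[improved], fit[improved] = cand[improved], cfit[improved]
            drop = rng.random(n_nests) < pa   # abandon a fraction of nests
            drop[fit.argmin()] = False        # keep the best nest (elitism)
            if drop.any():
                nests[drop] = rng.uniform(lo, hi, (int(drop.sum()), dim))
                fit[drop] = np.apply_along_axis(f, 1, nests[drop])
        return nests[fit.argmin()], fit.min()

    # e.g. tune (C, sigma) by minimizing a cross-validation error function:
    # params, err = chaotic_cuckoo_min(cv_error, lo=[1e-3, 1e-3], hi=[1e3, 1e2])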
