Research on Methods Related to Support Vector Machines
Abstract
Based on statistical learning theory, the support vector machine (SVM) rests on a solid mathematical foundation and excels in model generalization, global optimality, and nonlinear processing; it has become one of the most active research topics in machine learning.
     This thesis takes the traditional SVM and the emerging nonparallel-hyperplane SVMs as its main objects of study and investigates their related methods in depth. The work covers the construction of a new weighted classifier model, multi-class classification methods, parameter selection methods, and sample reduction methods. The main contributions are:
     1. The classifier model of the nonparallel-hyperplane SVM is studied. A new weighted least squares twin support vector machine classifier (WLSTSVM) is proposed; by placing a weight vector on the error variables, it remedies the poor robustness that the least squares twin support vector machine (LSTSVM) suffers from its SSE loss function. The problem formulations and detailed derivations are given for both the linear and nonlinear cases, together with the corresponding algorithm. Experiments show that WLSTSVM improves both noise immunity and classification accuracy.
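To make the linear case concrete, the closed-form twin-hyperplane solutions of a weighted LSTSVM can be sketched as follows. The per-sample weights `dA`/`dB` on the error variables are a hypothetical stand-in for the thesis's weight vector (all-ones recovers plain LSTSVM), not its exact construction:

```python
import numpy as np

def wlstsvm_fit(A, B, c1=1.0, c2=1.0, dA=None, dB=None, reg=1e-8):
    """Linear weighted LSTSVM sketch: returns two planes (w1, b1), (w2, b2).

    dA / dB weight the error variables of the opposite-class constraints;
    all-ones recovers plain LSTSVM. The weighting interface is illustrative."""
    mA, mB = len(A), len(B)
    E = np.hstack([A, np.ones((mA, 1))])   # augmented class-A matrix [A  e]
    F = np.hstack([B, np.ones((mB, 1))])   # augmented class-B matrix [B  e]
    DA = np.diag(np.ones(mA) if dA is None else dA)
    DB = np.diag(np.ones(mB) if dB is None else dB)
    I = reg * np.eye(E.shape[1])           # small ridge for invertibility
    # plane 1: close to class A, unit distance from the (weighted) B samples
    z1 = -np.linalg.solve(F.T @ DB @ F + E.T @ E / c1 + I,
                          F.T @ DB @ np.ones(mB))
    # plane 2: close to class B, unit distance from the (weighted) A samples
    z2 = np.linalg.solve(E.T @ DA @ E + F.T @ F / c2 + I,
                         E.T @ DA @ np.ones(mA))
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def wlstsvm_predict(X, plane1, plane2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

Down-weighting a suspected noise point in `dB` simply shrinks its pull on plane 1's linear system, which is the robustness mechanism the weighted formulation aims for.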
     2. Multi-class classification methods for nonparallel-hyperplane SVMs are discussed. With LSTSVM as the base binary classifier, a multi-class method based on an optimized directed acyclic graph (ODAG-LSTSVM) is presented. A class-separability criterion based on an average-distance measure estimates how hard each pair of classes is to separate, and a corresponding class-reordering scheme is given; the aim is to overcome the error accumulation that the conventional DAG structure can cause. Experiments show that the method performs well in both test accuracy and execution speed.
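The decision DAG itself is generic: it runs one pairwise classifier per step and eliminates the loser until a single class remains. The sketch below uses a simple centroid-distance separability score and a pluggable pairwise classifier; the thesis's exact average-distance criterion and LSTSVM base classifier are not reproduced here:

```python
import numpy as np

def separability_order(X, y):
    """Rank class labels by a simple average-distance separability score
    (mean centroid-to-centroid distance); a stand-in for the thesis's
    criterion. More separable classes are placed first in the DAG order."""
    labels = list(np.unique(y))
    cent = {c: X[y == c].mean(axis=0) for c in labels}
    score = {c: np.mean([np.linalg.norm(cent[c] - cent[d])
                         for d in labels if d != c]) for c in labels}
    return sorted(labels, key=lambda c: -score[c])

def ddag_predict(x, order, pairwise):
    """Walk the decision DAG: pairwise(i, j, x) returns the winning label,
    the loser is eliminated, and the last surviving class is returned."""
    alive = list(order)
    while len(alive) > 1:
        i, j = alive[0], alive[-1]          # compare the two extreme classes
        winner = pairwise(i, j, x)
        alive.remove(j if winner == i else i)
    return alive[0]
```

Only k-1 of the k(k-1)/2 pairwise classifiers are evaluated per test point, which is why DAG schemes are fast at prediction time; reordering the classes changes which comparisons happen early and hence how errors can accumulate.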
     3. Parameter selection methods for SVMs are discussed and studied. Given the strong performance of differential evolution (DE) in locating global optima and handling multimodal problems, its mutation strategy and control parameters are analyzed. First, a mutation strategy modeled on the natural law of quantitative-to-qualitative change is given, with a formula for the probability of qualitative change that self-adjusts with the generation count. Then, following the survival-of-the-fittest principle, an adaptive adjustment scheme for the control parameters is given, regulated by the fitness of the best individual in the current generation. Experiments on classic benchmark functions verify that the improved DE algorithm attains better average optima and converges faster. Finally, the improved algorithm is applied to parameter selection for the nonparallel plane proximal classifier (NPPC), with the corresponding implementation given; the results show good performance in both accuracy and speed.
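For reference, the baseline the thesis improves on is classic DE/rand/1/bin. A minimal sketch is below; the self-adjusting mutation and adaptive control-parameter rules described above are not reproduced, so `F` and `CR` stay fixed:

```python
import numpy as np

def de_optimize(f, bounds, pop_size=30, F=0.8, CR=0.9, gens=200, seed=0):
    """Classic DE/rand/1/bin minimizing f over box bounds.
    Returns (best_vector, best_value)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = [k for k in range(pop_size) if k != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True             # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                            # greedy selection
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

For SVM parameter selection, `f` would wrap a cross-validation error as a function of, say, `(C, gamma)`; the adaptive variants above replace the fixed `F` and `CR` with generation-dependent rules.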
     4. Sample reduction methods for SVMs are studied, covering two aspects: sample-number reduction and attribute reduction. First, a sample-number reduction method called KD-FFMVM is given; it eliminates the adverse influence of outliers and noise points while extracting as many boundary samples as possible, so that support vectors are not lost. Then, building on an analysis of existing attribute-reduction methods, the convergence speed of the kernel Hebbian algorithm is improved: the learning rate is corrected by the probability density function of the Cauchy distribution so that the iteration adjusts itself adaptively, and experiments confirm the method's effectiveness. Finally, an SVM classification model based on these reduction strategies is designed; compared with the standard SVM, it maintains comparable accuracy while greatly reducing training time.
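The kernel Hebbian algorithm generalizes Oja's Hebbian PCA rule to feature space. Its linear special case, with the learning rate decayed along a Cauchy-density profile, can be sketched as follows; the decay rule is only one plausible reading of the Cauchy-based correction above, not the thesis's exact formula:

```python
import numpy as np

def oja_first_pc(X, eta0=0.05, gamma=100.0, epochs=10, seed=0):
    """Estimate the first principal component with Oja's Hebbian rule.

    The step size follows a Cauchy-density-shaped decay
    eta_t = eta0 / (1 + (t / gamma)^2) — an assumed stand-in for the
    thesis's Cauchy-pdf learning-rate correction."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                 # center the data
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    t = 0
    for _ in range(epochs):
        for x in Xc[rng.permutation(len(Xc))]:
            eta = eta0 / (1.0 + (t / gamma) ** 2)   # Cauchy-shaped decay
            y = w @ x
            w += eta * y * (x - y * w)              # Oja's update
            t += 1
    return w / np.linalg.norm(w)
```

After projecting samples onto the leading components found this way, the reduced attribute set feeds the SVM, which is where the training-time savings of the combined reduction model come from.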
