Support Vector Machine Algorithm Design and Its Application to High-Resolution Radar Target Recognition
Abstract
The high-resolution range profile (HRRP) obtained by a high-resolution radar contains rich structural signatures of the target and offers unique advantages in the field of radar automatic target recognition (RATR). Support vector machines (SVMs), founded on the structural risk minimization (SRM) principle, possess many desirable properties, such as strong generalization, suitability for small training samples, nonlinear modeling capability, and the absence of local minima, which make them a powerful tool for HRRP-based automatic target recognition.
     Against this background, this thesis studies the design of a multiobjective parameter selection algorithm and a multistage selective ensemble algorithm for SVMs, together with their application to high-resolution radar target recognition.
     The main contents of the thesis are organized as follows:
     Chapter 1 briefly introduces the background and significance of the subject, as well as the development and current status of the related technologies.
     Chapter 2 analyzes the basic principles of the high-resolution range profile and of support vector machines. The formation of the HRRP is derived, and its properties are analyzed with respect to target-aspect sensitivity and time-shift (translation) sensitivity; the theoretical foundations, basic algorithms, and characteristics of SVMs are then presented in detail, laying the groundwork for the algorithm design and applications in the subsequent chapters.
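     For orientation, the soft-margin SVM reviewed in this chapter solves the following primal problem (textbook notation, not necessarily the thesis's own symbols):

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\ \frac{1}{2}\|\mathbf{w}\|^{2}+C\sum_{i=1}^{n}\xi_{i}
\quad\text{s.t.}\quad y_{i}\bigl(\mathbf{w}^{\top}\varphi(\mathbf{x}_{i})+b\bigr)\ge 1-\xi_{i},
\quad \xi_{i}\ge 0,\ i=1,\dots,n,
```

which yields the kernel decision function \(f(\mathbf{x})=\operatorname{sign}\bigl(\sum_{i}\alpha_{i}y_{i}K(\mathbf{x}_{i},\mathbf{x})+b\bigr)\). The regularization parameter \(C\) and the kernel parameters of \(K\) are exactly the quantities tuned in Chapter 3.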
     Chapter 3 studies the design of a multiobjective parameter selection algorithm for SVMs. The influence of the parameters on generalization performance is analyzed, demonstrating the necessity of parameter selection. Traditional parameter selection is based on a single generalization error bound estimated on the training set; experiments in this thesis show that this is insufficient and that parameter selection should be treated as a multiobjective optimization problem. A multiobjective parameter selection algorithm for SVMs based on the nondominated sorting genetic algorithm (NSGA-II) is therefore proposed. Experimental results show that, compared with single-objective parameter selection, the multiobjective approach yields more moderate parameter values and achieves a higher recognition rate.
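     As a minimal illustration of the multiobjective formulation (not the thesis's implementation), the Python sketch below scores each candidate (C, gamma) pair by two competing criteria, cross-validation error and the fraction of support vectors (a crude stand-in for a second generalization bound), and keeps the nondominated set. A plain grid with nondominated sorting replaces the full NSGA-II machinery, and scikit-learn with synthetic data stands in for the HRRP features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic data standing in for HRRP feature vectors (illustrative only).
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

def objectives(C, gamma):
    """Two competing criteria: 5-fold CV error and a model-complexity proxy."""
    cv_err = 1.0 - cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
    sv_ratio = SVC(C=C, gamma=gamma).fit(X, y).n_support_.sum() / len(X)
    return (cv_err, sv_ratio)

# Candidate (C, gamma) grid on a log2 scale.
grid = [(C, g) for C in 2.0 ** np.arange(-2.0, 9.0, 2.0)
               for g in 2.0 ** np.arange(-9.0, 1.0, 2.0)]
scores = [objectives(C, g) for C, g in grid]

def dominates(s, t):
    """True if t is no worse than s in every objective and better in one."""
    return (all(tj <= sj for sj, tj in zip(s, t))
            and any(tj < sj for sj, tj in zip(s, t)))

# Keep the Pareto-optimal (nondominated) parameter pairs.
pareto = [(p, s) for p, s in zip(grid, scores)
          if not any(dominates(s, t) for t in scores if t is not s)]
for (C, g), (err, sv) in pareto:
    print(f"C={C:g} gamma={g:g}  cv_err={err:.3f}  sv_ratio={sv:.3f}")
```

A single-objective method would pick the one pair minimizing cv_err alone; the Pareto set above makes the trade-off against model complexity explicit, which is the point the chapter argues.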
     Chapter 4 investigates the design of a multistage selective ensemble algorithm for SVMs. Ensemble learning can improve the generalization ability of a learning machine, and a selective ensemble can greatly reduce the number of ensemble members without degrading, and sometimes while improving, the ensemble's generalization ability, which is beneficial for HRRP-based radar automatic target recognition. A multistage selective ensemble algorithm for SVMs based on the genetic algorithm (GA) is proposed. Experimental results show that it achieves a higher recognition rate than the traditional Bagging ensemble and a single SVM while using fewer ensemble members.
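     The following Python sketch illustrates the selective-ensemble idea in a simplified single-stage form, again with synthetic data in place of HRRP features: a pool of SVMs is trained by Bagging, and a basic GA searches over binary membership vectors for a small subset whose majority vote scores well on a validation set (the thesis's multistage variant refines this selection over several rounds):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

# Bagging: train member SVMs on bootstrap resamples of the training set.
n_members = 20
members = [SVC(C=1.0, gamma="scale").fit(X_tr[idx], y_tr[idx])
           for idx in (rng.integers(0, len(X_tr), len(X_tr))
                       for _ in range(n_members))]
preds = np.array([m.predict(X_val) for m in members])  # (n_members, n_val)

def fitness(mask):
    """Validation accuracy of the majority vote over the selected members."""
    if mask.sum() == 0:
        return 0.0
    vote = (preds[mask.astype(bool)].mean(axis=0) > 0.5).astype(int)  # ties -> 0
    return (vote == y_val).mean()

# Simple GA: tournament selection, uniform crossover, bit-flip mutation.
pop = rng.integers(0, 2, size=(30, n_members))
for gen in range(50):
    fits = np.array([fitness(ind) for ind in pop])
    new_pop = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if fits[i] >= fits[j] else pop[j]
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if fits[i] >= fits[j] else pop[j]
        child = np.where(rng.integers(0, 2, n_members).astype(bool), p1, p2)
        child = np.where(rng.random(n_members) < 0.05, 1 - child, child)
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected members:", best.nonzero()[0], " val acc:", fitness(best))
```

The selected subset is typically much smaller than the full pool, matching the chapter's claim that fewer members can preserve or improve accuracy.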
     Chapter 5 concludes the thesis and points out directions for future work.