Research on Support Vector Machine Algorithms and Their Application in Radar Jamming Effect Evaluation
Abstract
With the development of electronic technology, radar countermeasures play an increasingly important role in warfare. Radar jamming is the main component of radar countermeasures, and the jamming effect is an important comprehensive index of the combat capability of jamming equipment: it manifests as the degree to which radar performance degrades under jamming, and it results from the joint action of many factors on the radar side, the jammer side, and in the countermeasure environment. Jamming effect evaluation addresses the problem of estimating the jamming effect from these influencing factors. Scientific and rational evaluation of the jamming effect is of great significance to research on radar jamming/anti-jamming technology and to the development of radar and jamming equipment.
     The main methods for radar jamming effect evaluation include the early evaluation-factor methods, the later fuzzy comprehensive evaluation methods, and the intelligent evaluation methods that have emerged in recent years, which are based on machine learning theories such as artificial neural networks (ANN). Intelligent evaluation uses machine learning to learn, from samples obtained in jamming experiments, the relationship between the jamming effect and its influencing factors, and then evaluates the jamming effect under given factors. Because it is grounded in experimental data and little affected by subjective judgment, intelligent evaluation is regarded as a very promising approach to jamming effect evaluation. From this perspective, jamming effect evaluation can be treated as a learning problem with limited samples. With limited samples, neural networks are prone to overfitting and hence to a loss of generalization ability, whereas the support vector machine (SVM), developed on the basis of statistical learning theory, is a machine learning method designed specifically for limited-sample problems, so SVM is better suited to the jamming effect evaluation problem. Owing to the diversity of jamming techniques, this dissertation studies only the self-screening jamming carried out by shipborne active jamming equipment against the terminal guidance radar of anti-ship missiles. The main work of this dissertation is to study support vector machine algorithms suited to radar jamming effect evaluation and its wartime prediction, and to solve the related application problems in this field, as follows:
     1. On the algorithm side, the least squares support vector machine (LS-SVM), which has low computational complexity, is studied. To meet the offline learning requirement of jamming effect evaluation and the need, in wartime prediction, to learn online from samples with large prediction errors, and to overcome the non-sparseness of the standard LS-SVM solution, two online LS-SVM variants are investigated: an online LS-SVM based on updating the inverse kernel matrix and an online LS-SVM based on sequential minimal optimization (SMO), each with classification and regression versions. Both complete the learning process iteratively following the "predict → incremental learning → decremental learning" idea and adaptively obtain sparse solutions for the specific problem; they not only learn quickly offline but can also be applied to online learning problems (a minimal sketch of this loop is given after item 4 below).
     2. Parameter selection and feature selection for SVM are studied next. For parameter selection, since the performance of an SVM is generally a multi-modal function of its parameters, the differential evolution (DE) algorithm, which has strong global optimization ability, is applied to select the SVM parameters. In addition, because some machine learning problems have high-dimensional inputs and therefore require feature selection, a method that uses DE to select SVM parameters and features simultaneously is studied. Simulation experiments show that, compared with similar methods based on particle swarm optimization (PSO), the DE-based parameter selection and simultaneous selection methods not only search faster but also select parameters and features more effectively (a hedged DE tuning example follows item 4 below).
     3. To obtain the experimental samples from which the intelligent evaluation methods learn, the measurement of the jamming effect in experiments is studied. By analyzing the working process of the terminal guidance radar and the characteristics of the active jamming it faces, and by combining the "time criterion" and the "efficiency criterion", a comprehensive quantitative measurement method is proposed that takes the search-time ratio and the tracking-error ratio as indexes (an illustrative form of these ratios is given after item 4 below). Jamming experiments were carried out on the hardware-in-the-loop simulation system of a naval electronic countermeasures simulation center; the results show that the measured jamming effect accords with theory and engineering practice, so the measurement method is effective.
     4. Because jamming effect evaluation is required to be quantitative, it is treated as a regression problem: on the basis of an analysis of the main factors influencing the active jamming effect on the terminal guidance radar, an evaluation method based on LS-SVM regression is studied. Hardware-in-the-loop simulation experiments show that this method evaluates the jamming effect fairly accurately from the influencing factors and achieves higher accuracy than the ANN-based evaluation method. In addition, wartime prediction of the jamming effect is tentatively studied: since the influencing factors are not fully known in wartime and high prediction accuracy is not required, wartime prediction is treated as a classification problem and a prediction method based on LS-SVM classification is studied (brief sketches illustrating items 1-4 follow).
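
For item 1, the following minimal sketch illustrates the standard LS-SVM regression linear system and the "predict → incremental learning → decremental learning" loop in Python. It is not the thesis's inverse-kernel-matrix or SMO-based algorithm: the names (rbf_kernel, lssvm_fit, online_loop), the error threshold tol, and the oldest-sample pruning rule are illustrative assumptions, and the loop simply refits the model rather than updating it incrementally.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Standard LS-SVM regression: solve the (n+1)x(n+1) KKT linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b and coefficients alpha

def lssvm_predict(Xnew, X, b, alpha, sigma=1.0):
    """f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(Xnew, X, sigma) @ alpha + b

def online_loop(stream, tol=0.1, max_sv=50, gamma=10.0, sigma=1.0):
    """Sketch of 'predict -> incremental learning -> decremental learning':
    only badly predicted samples are admitted, and the oldest stored sample is
    pruned once the working set exceeds max_sv, keeping the solution sparse."""
    Xsv, ysv, b, alpha = [], [], 0.0, np.array([])
    for x, y in stream:                             # stream yields (x, y) pairs
        x = np.atleast_2d(x)
        pred = lssvm_predict(x, np.array(Xsv), b, alpha, sigma)[0] if Xsv else 0.0
        if abs(y - pred) > tol:                     # 1) predict, check the error
            Xsv.append(x.ravel()); ysv.append(y)    # 2) incremental learning
            if len(ysv) > max_sv:                   # 3) decremental learning
                Xsv.pop(0); ysv.pop(0)
            b, alpha = lssvm_fit(np.array(Xsv), np.array(ysv), gamma, sigma)
    return np.array(Xsv), b, alpha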
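
For item 2, the sketch below shows DE-based hyperparameter selection in its most generic form, using SciPy's differential_evolution and scikit-learn's SVC as stand-ins for the thesis's own DE variant and LS-SVM; 5-fold cross-validation accuracy is the assumed fitness, and the data set and search ranges are placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Placeholder data standing in for the jamming-experiment samples.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def fitness(params):
    """Negative 5-fold CV accuracy of an RBF-SVM; DE minimizes this value.
    Both parameters are searched in log10 space to span several decades."""
    C, gamma = 10.0 ** params[0], 10.0 ** params[1]
    return -cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

# Search log10(C) in [-2, 3] and log10(gamma) in [-4, 1].
result = differential_evolution(fitness, bounds=[(-2, 3), (-4, 1)],
                                maxiter=30, popsize=10, seed=1)
best_C, best_gamma = 10.0 ** result.x
print(f"C = {best_C:.3g}, gamma = {best_gamma:.3g}, CV accuracy = {-result.fun:.3f}")
```

The simultaneous parameter-and-feature selection described in item 2 could be obtained in this framework by appending one continuous gene per input feature to the DE search vector and thresholding it (for example at 0.5) to form a feature mask before each training run.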
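
For item 3, the exact index definitions belong to the thesis itself; one illustrative form, assuming each ratio compares the radar's performance with jamming on against its jamming-free baseline, is:

```latex
% Illustrative only -- assumed forms of the two indexes, not the thesis's exact definitions.
r_s = \frac{T_s^{\mathrm{jam}}}{T_s^{0}},
\qquad
r_e = \frac{\sigma^{\mathrm{jam}}}{\sigma^{0}}
```

Here T_s denotes the radar's search (target-acquisition) time and σ its tracking error; the superscript "jam" marks values measured with jamming on and the superscript 0 the jamming-free baseline, so under this convention larger ratios indicate a stronger jamming effect.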
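
For item 4, the minimal sketch below makes the two formulations concrete: evaluation as regression from a factor vector to a continuous jamming-effect value, and wartime prediction as classification into effect grades. KernelRidge and SVC are used purely as stand-ins for LS-SVM regression and classification, and the factor vectors, effect values, and tertile grading are placeholder assumptions rather than the thesis's data or grading scheme.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge   # stand-in for LS-SVM regression
from sklearn.svm import SVC                    # stand-in for LS-SVM classification

# Assumed data layout: each row of `factors` is the influencing-factor vector of one
# jamming experiment; `effect` is the measured jamming-effect value (placeholder data).
rng = np.random.default_rng(0)
factors = rng.random((120, 6))
effect = factors @ rng.random(6) + 0.05 * rng.standard_normal(120)

# (a) Evaluation as regression: factor vector -> continuous jamming-effect value.
reg = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(factors, effect)
evaluated = reg.predict(factors[:5])

# (b) Wartime prediction as classification: the continuous effect is discretized into
#     a few grades (tertiles here, purely for illustration) and a kernel classifier
#     predicts the grade from the factors available in wartime.
grades = np.digitize(effect, np.quantile(effect, [1/3, 2/3]))
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(factors, grades)
predicted_grade = clf.predict(factors[:5])
```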