Research on Support Vector Machine Algorithms Based on Statistical Learning Theory
Abstract
Traditional statistics studies the asymptotic theory that assumes the number of samples tends to infinity, and most existing machine learning methods are built on this assumption. In many practical problems, however, the number of samples is finite, and methods based on traditional statistics often fail to achieve satisfactory results in this setting. Statistical Learning Theory (SLT) is a statistical framework established specifically for finite samples; it provides a solid theoretical foundation for the systematic study of machine learning with small samples. The Support Vector Machine (SVM) is a novel and highly effective machine learning method developed within the SLT framework. It largely overcomes the practical difficulties that trouble many other learning methods, such as small samples, nonlinearity, overfitting, high dimensionality, and local minima, and it generalizes well. As the leading theory for learning from small samples, SLT and SVMs are attracting more and more researchers and are becoming an active research area in artificial intelligence and machine learning. This dissertation studies SVM classification algorithms, multi-output Support Vector Regression (SVR), SVM training algorithms, multiclass SVM classification, and applications of support vector classification and regression. The main results of the dissertation are as follows:
    1. The optimal separating hyperplane of the standard SVM is equidistant from the positive and negative classes, which is inadequate for certain special classification problems. After studying and analyzing the original formulation of the standard SVM, a new learning algorithm, the Non-equidistant Margin Hyperplane SVM (NM-SVM), is proposed to handle such cases in pattern classification and recognition. The separating hyperplane of NM-SVM need not be equidistant from the closest positive and the closest negative examples; a brief theoretical derivation and simulations are given.
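The effect described in point 1 can be approximated with standard tools: unequal per-class error penalties also pull the separating hyperplane away from the midpoint between the classes. The sketch below uses scikit-learn's `class_weight` for this purpose; it illustrates the idea of an unequal margin but is not the NM-SVM algorithm itself, and the synthetic two-class data are assumed for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Two overlapping Gaussian classes (assumed toy data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

plain = SVC(kernel="linear").fit(X, y)                                # equidistant margin
biased = SVC(kernel="linear", class_weight={0: 10.0, 1: 1.0}).fit(X, y)

# With unequal penalties, margin violations on class 0 cost ten times more,
# so the boundary typically moves away from class 0 toward class 1.
print(plain.intercept_[0], biased.intercept_[0])
```

The same effect is sometimes obtained by shifting the decision threshold of a trained SVM directly; the per-class penalty version has the advantage of being part of the optimization itself.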
    2. Support vector regression builds a model of a process that depends on a set of factors. It is traditionally formulated for a single output, and the multi-output case is handled by modeling each output independently of the others, so no advantage can be taken of the correlations that may exist between outputs, and the sum of the output errors is not guaranteed to be minimized. The dissertation extends SVR to multi-output systems by adding a constraint on the sum of errors and estimating all outputs within a single optimization formulation. This makes it possible to exploit the possible correlations between the outputs and to improve the accuracy of the overall regression model.
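The conventional baseline criticized in point 2, one independent single-output SVR per output, can be sketched with scikit-learn's `MultiOutputRegressor`; the joint formulation proposed in the dissertation (all outputs in one optimization with an error-sum constraint) has no off-the-shelf equivalent here, so only the baseline is shown. The correlated two-output data are an assumed illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 3))
# Two correlated outputs: y2 depends on y1 -- a link that per-output models ignore.
y1 = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
y2 = y1 + X[:, 1] ** 2
Y = np.column_stack([y1, y2])

# Fits one independent SVR per output column.
model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0)).fit(X, Y)
pred = model.predict(X)          # shape (200, 2): one column per output
print(pred.shape)
```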
    3. Training a support vector machine requires solving a very large Quadratic Programming (QP) optimization problem. Traditional QP methods cannot be applied directly to large data sets because of their memory requirements. Several approaches now exist that circumvent this shortcoming and work well. The dissertation explores the possibility of using the Particle Swarm Optimization (PSO) algorithm for SVM training.
    4. For large-scale sample sets, an integrated classification method named RS-SVM, combining Rough Set (RS) theory and SVM, is presented. Using the knowledge-reduction algorithm of RS theory, the method eliminates redundant condition attributes and conflicting samples from the working sample set and evaluates the significance of the remaining condition attributes. Eliminating redundant condition attributes reduces the dimensionality of the SVM's sample space, so the SVM generalizes better; deleting conflicting samples reduces the number of working samples and shortens the training time.
    5. Constructing and combining several binary SVMs with a binary tree can solve multiclass problems and resolve the unclassifiable regions that arise in conventional multiclass SVMs. Because some existing tree-based methods use no effective algorithm for constructing the binary tree, several improved binary-tree multiclass SVM methods are proposed, based on class distances and the class covering of clustering.
    6. Applications of SVM and SVR. A voice-recognition approach based on SVMs is proposed for identifying stored-product insects. An Adaline adaptive noise canceller is used as the voice-preprocessing unit; feature vectors extracted from the preprocessed audio signals of known insect samples are used to train multiple SVMs for insect recognition. The method is convenient to operate, requiring only the insects' audio signals collected by sensors, without insect images or specimens.
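A minimal sketch of the binary-tree multiclass scheme discussed in point 5: at each node the remaining classes are split into two groups using distances between class centers, and one binary SVM separates the groups. The grouping rule below (the farthest pair of centers as seeds, each other class joined to its nearer seed) is an illustrative assumption, not the dissertation's exact construction algorithm, and the four-cluster data are synthetic.

```python
import numpy as np
from sklearn.svm import SVC

def build_tree(X, y, classes):
    """Recursively split the class set in two and train one binary SVM per node."""
    if len(classes) == 1:
        return classes[0]                              # leaf: a single class label
    sel = np.isin(y, classes)                          # samples of the remaining classes
    Xs, ys = X[sel], y[sel]
    centers = {c: Xs[ys == c].mean(axis=0) for c in classes}
    # Seed the two groups with the farthest-apart pair of class centers.
    pairs = [(a, b) for i, a in enumerate(classes) for b in classes[i + 1:]]
    a, b = max(pairs, key=lambda p: np.linalg.norm(centers[p[0]] - centers[p[1]]))
    left = [c for c in classes
            if np.linalg.norm(centers[c] - centers[a])
            <= np.linalg.norm(centers[c] - centers[b])]
    right = [c for c in classes if c not in left]
    clf = SVC(kernel="rbf").fit(Xs, np.isin(ys, left).astype(int))  # 1 = left group
    return (clf, build_tree(X, y, left), build_tree(X, y, right))

def predict_one(node, x):
    if not isinstance(node, tuple):
        return node                                    # reached a leaf class
    clf, left, right = node
    return predict_one(left if clf.predict([x])[0] == 1 else right, x)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, (30, 2)) for m in [(0, 0), (3, 0), (0, 3), (3, 3)]])
y = np.repeat([0, 1, 2, 3], 30)
tree = build_tree(X, y, [0, 1, 2, 3])
accuracy = np.mean([predict_one(tree, x) == t for x, t in zip(X, y)])
print(accuracy)
```

A k-class tree of this kind needs only k - 1 binary machines and about log2(k) evaluations per prediction, which is the practical appeal of the tree structure over one-against-one schemes.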
Addressing the difficulty of scattered-data approximation, two SVR-based surface-approximation methods are also presented and applied to reconstructing the temperature fields of large granaries.
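The surface-approximation idea can be sketched by treating the scattered measurements as a regression problem z = f(x, y) and fitting an RBF-kernel SVR. The synthetic Gaussian "temperature field" below is an assumed stand-in for real granary sensor data, and the hyperparameters are illustrative rather than those used in the dissertation.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, (300, 2))                 # scattered sensor positions
# Assumed field: a warm spot in the middle of a 20-degree ambient.
temp = 20.0 + 5.0 * np.exp(-8.0 * ((pts[:, 0] - 0.5) ** 2 + (pts[:, 1] - 0.5) ** 2))

surf = SVR(kernel="rbf", C=100.0, epsilon=0.05).fit(pts, temp)

# Evaluate the reconstructed surface on a regular 20 x 20 grid.
gx, gy = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])
field = surf.predict(grid).reshape(20, 20)
print(field[10, 10] - field[0, 0])   # center of the grid vs. a corner
```

Because SVR imposes no grid structure on its inputs, the same fit works for arbitrarily scattered sensors, which is exactly the setting of the granary reconstruction problem.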
