Research on Modeling and Classification Algorithms Based on Kernel Learning Machines
Abstract
The main task of data-based machine learning is to discover regularities in observed data and to use those regularities to predict future data, so as to better serve human needs. Learning from data is, however, an ill-posed problem, so methods that are excellent in theory often prove unsatisfactory in practice. Using kernel transformations and the idea of regularization, this thesis carries out a series of forward-looking studies on learning from data.
 1) The basic theories used in this thesis are introduced: machine learning and kernel learning machines; the general structure of kernel algorithms, Mercer's theorem, and mapping functions; and statistical learning theory and regularization networks. Progress on kernel-related algorithms at home and abroad is reviewed. Finally, the framework of the thesis, its main contributions, and related research fields are outlined.
 2) A new Priori Kernel Principal Component Analysis (PKPCA) algorithm incorporating prior class information is proposed. By folding the within-class and between-class scatter of the samples into the overall variance, better classification is achieved. The concept of a reconstructed sample library and an algorithm for building it are proposed, yielding a sparse sample library and reducing the dimension of the eigenvectors. It can be proved that KPCA and KFD are two limiting special cases of PKPCA with respect to its parameters; at the same time, PKPCA overcomes the limitation of KFD that only (number of classes - 1) eigenvectors can be obtained. Finally, simulations of the classification ability of the first principal component on constructed function classes, together with experiments on credit-card, astronomical, and disease data, show that the algorithm clearly outperforms KPCA and achieves satisfactory classification results.
 3) A nonlinear kernel transformation is applied to recursive least squares, and the objective function is rewritten with regularization, giving a Regularized Kernel Recursive Least Squares (RKRLS) algorithm. Expressions for the coefficients and error of the RKRLS model are obtained. On this basis, a recursive support vector machine (RSVM) algorithm is also proposed. The concept of a dynamic support vector (DSV) is given, together with three conditions for identifying a DSV: the ε-insensitivity, ν-sensitivity, and nonsingularity conditions. Recursive formulas for RKRLS and RSVM in fixed-, growing-, and shrinking-memory modes are derived, none of which requires matrix inversion. The algorithms are shown to have good engineering properties: suitability for small samples, controllable generalization ability, robustness, and speed.
 4) A vector base learning algorithm is proposed. By analyzing the angle between a sample vector and the solution space, a criterion for judging base vectors is derived, and recursive formulas for the identified parameters in growing and correcting modes are obtained. On this basis, a vector base learning network is further proposed, with rules for automatically generating network bases. Recursive formulas are derived for the weight updates when network nodes are added, for correcting the network base parameters, and for correcting the network weights. Dynamic identification modeling of the chaotic oscillation of glycolysis shows that the algorithm has good identification performance and convergence.
Zhejiang University Doctoral Dissertation
 5) The basic structure of a MIMO vector base learning network is proposed; the network can perform both modeling and pattern classification. The network weights are trained by gradient descent, and a growth algorithm for the base vector set (BVS) as well as fixed-memory recursive formulas for network training are derived. Simulations of parameter identification and double-spiral classification give good identification and classification results.
 6) The vector base network algorithm is generalized to a higher level, and a vector base model of human cognition is proposed. This cognitive model is used to simulate cognition of chaotic sequences, with good results. The simulations also suggest that the structure is close to human cognitive patterns and can offer a new frame of reference for the development of cognitive science; conversely, the analysis of human cognition provides philosophical guidance for the vector base algorithm, pushing it toward deeper development at a higher level.
 7) The algorithms of this thesis are applied in practice to the internal mixing process of the rubber industry. With abnormal sample points excluded, the engineering properties of SVM are exploited to model the discharge point, with good application results. Using the dynamic RKRLS and RSVM algorithms to model and predict the Mooney viscosity index of rubber mixing quality shows that the algorithms track and predict well. Using the vector base learning network for identification modeling and prediction of the Mooney viscosity of the mixing process also gives good results, leading to better control of Mooney fluctuation. Finally, the "amphibious intelligent mixing system" was successfully developed, and its software operation and functions are described.
 Finally, the work of the thesis is summarized and prospects are discussed from the viewpoint of human cognition.
Machine learning seeks rules in observed data and uses them to predict future data, in order to better serve human beings. Learning from data, however, is an ill-posed problem, so methods that are excellent in theory often perform poorly in practice. This thesis studies machine learning using kernel functions and regularization theory.
    1) The basic theories used in this thesis are introduced, including machine learning, kernel machines, statistical learning theory, and regularization theory. Research on kernel machines is reviewed. Finally, the structure of the thesis, its contributions, and related fields are given.
    2) The thesis proposes Priori Kernel Principal Component Analysis (PKPCA), which integrates the between-class and within-class variances into KPCA so that classification performance is enhanced. To obtain a sparse sample library and reduce the eigenvector dimension, the concept of reconstructing the sample library and a corresponding algorithm are introduced. Further, both KPCA and Kernel Fisher Discriminant (KFD) can be proved to be special cases of PKPCA, and PKPCA avoids the disadvantage of KFD, which can obtain only (number of classes - 1) eigenvectors. Simulation results on two numerical function classes, as well as experimental results on real-world credit-card, astronomical, and disease datasets, show that the proposed algorithm is valid and its classification performance is satisfactory.
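The baseline that PKPCA extends is standard kernel PCA (Schölkopf et al., 1998): eigendecomposition of the doubly centered Gram matrix. A minimal numpy sketch of that baseline follows; the class-prior weighting that distinguishes PKPCA is not reproduced, and the RBF kernel and its width are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=1.0):
    """Standard KPCA: eigendecompose the centered Gram matrix and project."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # centering in feature space
    w, V = np.linalg.eigh(Kc)                    # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]     # keep the leading components
    w, V = w[idx], V[:, idx]
    alphas = V / np.sqrt(np.maximum(w, 1e-12))   # normalize expansion coefficients
    return Kc @ alphas                           # projections of the samples
```

PKPCA would modify the variance being diagonalized with within- and between-class scatter before this eigendecomposition step.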
    3) The thesis proposes a Regularized Kernel Recursive Least Squares (RKRLS) algorithm. The coefficients and error of the RKRLS model are obtained and its generalization ability is analyzed. Furthermore, a Recursive Support Vector Machine (RSVM) algorithm is proposed. The notion of a Dynamic Support Vector (DSV) is put forward, and three conditions for identifying one (the ε-insensitivity, ν-sensitivity, and nonsingularity conditions) are proved. Recursive algorithms in fixed-, growing-, and shrinking-memory modes are deduced without matrix inversion. RKRLS and RSVM have four properties that make them applicable to industrial cases: suitability for small samples, controllable generalization ability, good robustness, and rapidity.
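The solution that RKRLS maintains recursively is, in batch form, regularized kernel least squares. The sketch below shows that batch closed form only; the thesis's inversion-free recursive and memory-limited updates are not reproduced, and the RBF kernel and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class RegularizedKernelLS:
    """Batch regularized kernel least squares: alpha = (K + lam*I)^{-1} y.
    RKRLS would update alpha recursively as samples arrive, avoiding the
    explicit solve; here a direct solve stands in for those updates."""
    def __init__(self, lam=0.1, gamma=1.0):
        self.lam, self.gamma = lam, gamma
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        # Regularization (lam > 0) makes the system well posed.
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self
    def predict(self, Xq):
        return rbf_kernel(Xq, self.X, self.gamma) @ self.alpha
```

The regularization term lam controls the trade-off between fitting the data and smoothness, which is the "controllable generalization ability" the abstract refers to.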
    4) The thesis proposes the Vector Base Learning (VBL) algorithm. The method for constructing the base vector set is designed by judging the angle between a sample and the solution space. Growing-form and adjusting-form algorithms are deduced. Furthermore, a Vector Base Learning network is proposed, based on the concepts of the Base Vector Set (BVS) and the Network Base (NB), and a method for automatically generating NBs is developed. Algorithms for growing network nodes, adjusting NB parameters, and adjusting the weights are presented. The dynamic chaotic oscillation process of glycolysis is identified using the VBL network, which shows that the algorithm has better convergence than competing algorithms.
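One plausible reading of the base-vector criterion, close in spirit to the approximate-linear-dependence tests used in the kernel recursive least squares literature, is sketched below: a sample joins the basis when the residual of its feature-space image projected onto the span of the current basis exceeds a threshold. This is an assumption about the angle test, not the thesis's exact rule; the kernel, threshold, and jitter constant are all illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_basis(X, gamma=1.0, tol=1e-2):
    """Greedy basis selection: keep a sample only if its feature-space image
    is not (approximately) a linear combination of the current basis images."""
    basis = [0]                                    # seed with the first sample
    for t in range(1, len(X)):
        B = X[basis]
        Kbb = rbf_kernel(B, B, gamma)
        kbt = rbf_kernel(B, X[t:t + 1], gamma).ravel()
        # Best least-squares combination of basis images approximating phi(x_t).
        a = np.linalg.solve(Kbb + 1e-10 * np.eye(len(basis)), kbt)
        # Squared residual of phi(x_t) against span{phi(x_b)}, via the kernel trick.
        resid = rbf_kernel(X[t:t + 1], X[t:t + 1], gamma)[0, 0] - kbt @ a
        if resid > tol:                            # nearly orthogonal component left
            basis.append(t)
    return basis
```

A small residual corresponds to a small angle between the sample and the span of the basis, so redundant samples are discarded and the basis stays sparse.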
    5) The thesis proposes a MIMO Vector Base Learning network, which can be used for both modeling and classification. The weights are trained by gradient descent, and the growth algorithm for the BVS as well as the restricted-memory training algorithm are deduced. Finally, parameter identification and double-spiral classification are simulated, and good identification and classification results are obtained.
    6) The thesis proposes a Vector Base Cognition Model; its basic idea, structure, and algorithm are introduced. A simulation using this cognitive model shows that the model is valid. Three propositions are put forward based on the vector base network; human cognition and the Vector Base Cognition Model are compared, and the corresponding connection is established.
    7) The application to the rubber mixing process is given. With abnormal modeling samples removed first, SVM is applied to build the discharge model that establishes the rubber discharge condition, and long-term production practice has validated the discharge modeling method. Using the dynamic RKRLS and RSVM algorithms, the Mooney time series is modeled and predicted, showing better prediction ability than RLS. Using the vector base learning network, the Mooney viscosity of the mixing process is identified and predicted with good results, enabling better control of Mooney fluctuation. Finally, the "amphibious intelligent mixing system" was developed, and its software operation and functions are introduced.
