Improved Support Vector Regression Machines and Their Applications in Process Modeling and Control
Abstract
The support vector machine (SVM) is a general-purpose machine learning method developed within the framework of statistical learning theory. Compared with neural networks, SVMs handle small samples, avoid overfitting, cope with high dimensionality, and do not suffer from local minima, while exhibiting strong generalization performance; they have been widely applied to pattern recognition, regression estimation, density estimation, system identification, process modeling, and control. However, SVMs still face high computational complexity on large samples and difficulty in selecting model parameters and kernel functions. Starting from both the theory and the applications of support vector regression (SVR), this thesis systematically studies large-scale sample learning, model selection, process modeling, and predictive control for SVR. The main contents are as follows:
     (1) Effective data reduction algorithms are proposed to improve learning on high-dimensional, large-sample problems. For the large-sample regression estimation problem, a sample reduction algorithm for SVR based on boundary vector extraction is proposed. By introducing an ε′-band, shifting every target output up and down by ε′, the original regression problem is converted into a two-class classification problem in a higher-dimensional space; the ε′-band is adjusted until the two classes become separable, and boundary vectors are then selected with an adaptive projection algorithm, achieving sample reduction for SVR. For high-dimensional data, a feature selection algorithm for SVR based on quantum clustering is proposed; an inertia threshold is introduced into quantum clustering to determine the feature subset with the best performance. Numerical simulations confirm the effectiveness of both data reduction algorithms.
     (2) Three model parameter optimization algorithms for SVR are proposed. The influence of parameter selection on the generalization performance of SVR is studied, and a model parameter selection algorithm based on adaptive particle swarm optimization is proposed, in which each particle's inertia weight varies with the iteration count. A fuzzy cultural algorithm based on fuzzy acceptance and fuzzy influence is proposed to optimize the SVR model parameters and improve the parameter search. A weighting algorithm for the SVR penalty parameter C is proposed, assigning each sample a weight according to its training error. Numerical simulations verify that the generalization accuracy of each model is effectively improved.
     (3) An improved conformal transformation kernel and a model-selection-based multi-scale wavelet kernel are constructed. By defining a new factor function that reduces the factor of sample points far from the support vectors in feature space, a support vector regression algorithm based on the improved conformal transformation kernel is proposed; simulations on standard regression datasets and on product composition prediction for an ethylene distillation column verify that the algorithm improves regression model accuracy. A method for constructing multi-scale Morlet wavelet kernels is proposed, addressing the fitting of complex, multi-peak functions; simulations on standard regression datasets and on propylene concentration data from a propylene distillation column confirm its feasibility.
     (4) A dynamic SVR modeling method for nonlinear multivariable systems is proposed. To handle the time-varying characteristics of real industrial processes, the input variables and past outputs are nonlinearly mapped and linearly superposed, yielding a dynamic model of nonlinear multi-input multi-output systems based on linear programming SVR, which addresses the modeling of complex industrial processes. Simulations on poly(ethylene terephthalate) process data and on a second-order difference nonlinear multi-input single-output system confirm the method's feasibility.
     (5) A nonlinear predictive control method based on SVR is proposed. The inputs and outputs of the nonlinear system are nonlinearly mapped to build a predictive model based on linear programming SVR; the model is linearized with a Taylor expansion, and the optimal control law of the predictive control system is derived, forming a nonlinear predictive control method based on SVR. Simulations on weakly and strongly nonlinear systems show that the method has strong disturbance rejection.
The support vector machine has become a general-purpose machine learning method within the framework of statistical learning theory. It copes well with small samples, overfitting, high dimensionality, and local minima, and exhibits good generalization. It has therefore been widely used for pattern recognition, regression estimation, density estimation, system identification, and process modeling and control. From the perspectives of theory and application, this thesis makes a systematic study of large-scale sample learning, model selection, process modeling, and predictive control for support vector regression. The main contents are as follows.
     (1) Effective data reduction algorithms are proposed to improve large-scale learning. For the large-dataset regression estimation problem, a sample reduction approach based on boundary vector extraction is presented. It shifts all target outputs up and down by introducing an ε′-band, converting the regression problem into a two-class classification problem in a higher-dimensional space. The ε′-band is adjusted until the training data become just linearly separable in the feature space, and boundary vectors are then extracted with an adaptive projection algorithm, realizing sample reduction for the regression problem. A dimension reduction technique based on quantum clustering is proposed for the high-dimensional learning problem; it introduces an inertia threshold into quantum clustering and thereby selects the feature subset with the best generalization ability.
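The ε′-band construction above can be illustrated with a minimal sketch. The thesis does not spell out the adaptive projection algorithm here, so the boundary-vector step below is replaced by a simple nearest-opposite-class margin heuristic; that substitution, and all names and data, are illustrative assumptions rather than the thesis's method.

```python
import math

def make_two_classes(X, y, eps):
    """Shift each target up and down by eps to turn regression data into a
    two-class set in the joint (x, y) space (the eps'-band idea above)."""
    up = [(xi + [yi + eps], +1) for xi, yi in zip(X, y)]
    down = [(xi + [yi - eps], -1) for xi, yi in zip(X, y)]
    return up + down

def boundary_vectors(points, k):
    """Stand-in for the adaptive projection step: keep the 2k points whose
    distance to the opposite class is smallest (closest to the boundary)."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    scored = []
    for z, lab in points:
        margin = min(dist(z, w) for w, l in points if l != lab)
        scored.append((margin, z, lab))
    scored.sort(key=lambda t: t[0])
    return [(z, lab) for _, z, lab in scored[: 2 * k]]

# Toy 1-D regression set: 4 samples become 8 shifted points, reduced to 4.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.1, 0.9, 2.1, 2.9]
pts = make_two_classes(X, y, eps=0.5)
reduced = boundary_vectors(pts, k=2)
```

Only the retained boundary points would then be passed to SVR training, which is where the reduction in training cost comes from.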
     (2) Three optimization algorithms are developed to tune the model parameters of support vector regression, since parameter selection directly affects its generalization. An adaptive particle swarm optimization algorithm is developed to select the model parameters; it adjusts the inertia weight of each particle at each iteration to speed up convergence. A fuzzy cultural algorithm using fuzzy acceptance and fuzzy influence is proposed to optimize the model parameters and improve the optimization process. A weighted support vector regression on the penalty parameter C is presented, assigning each sample a different weight according to its training error. Numerical simulations show that the generalization accuracy of support vector regression is improved.
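The adaptive particle swarm search can be sketched as follows. This is a minimal, generic PSO with an iteration-decaying inertia weight; the thesis's per-particle adaptation rule is not given here, so the decay schedule, acceleration constants, and the quadratic toy objective (standing in for SVR cross-validation error over hyperparameters such as log C and log γ) are all illustrative assumptions.

```python
import random

def adaptive_pso(objective, bounds, n_particles=10, n_iter=30, seed=0):
    """Minimize `objective` over box `bounds` with PSO; the inertia
    weight w decays from 0.9 to 0.4 over the iterations."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    gi = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[gi][:], pbest_f[gi]
    for t in range(n_iter):
        w = 0.9 - 0.5 * t / max(1, n_iter - 1)  # decaying inertia weight
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy stand-in for SVR validation error over two hyperparameters:
err = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best, best_f = adaptive_pso(err, [(-5, 5), (-5, 5)])
```

In the thesis's setting, `objective` would train an SVR at the candidate hyperparameters and return a validation error, so each PSO evaluation is one model fit.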
     (3) An improved conformal mapping kernel function and a multi-scale wavelet kernel function are constructed. A new factor function is defined to reduce the factor, in feature space, of data points far from the support vectors; the resulting improved conformal mapping kernel is validated by simulations on standard regression datasets and on ethylene concentration data, where it improves the generalization precision of support vector regression. A multi-scale Morlet wavelet kernel is developed to estimate complex, multi-peak functions. Simulations on standard regression datasets and on the propylene concentration in the bottom product of a propylene distillation column verify the feasibility of this method.
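A common form of the Morlet wavelet kernel (used, e.g., in Zhang et al.'s wavelet SVM) is a product over dimensions of cos(1.75·u/a)·exp(−u²/(2a²)) with u = x_d − z_d. The sketch below combines several such scales by a weighted sum, which is one natural reading of "multi-scale"; the specific scales and weights shown are illustrative assumptions, not the thesis's model-selection result.

```python
import math

def morlet_kernel(x, z, a):
    """Single-scale Morlet wavelet kernel at dilation a:
    product over dimensions of cos(1.75*u/a) * exp(-u^2 / (2*a^2))."""
    k = 1.0
    for xd, zd in zip(x, z):
        u = xd - zd
        k *= math.cos(1.75 * u / a) * math.exp(-u * u / (2.0 * a * a))
    return k

def multiscale_morlet(x, z, scales, weights):
    """Weighted sum of Morlet kernels at several dilations: small scales
    resolve sharp peaks, large scales fit smooth regions."""
    return sum(w * morlet_kernel(x, z, a) for a, w in zip(scales, weights))

# Weights sum to 1, so k(x, x) = 1 and |k| <= 1 everywhere.
k = multiscale_morlet([0.0, 0.0], [0.5, -0.5], scales=[0.5, 2.0], weights=[0.4, 0.6])
```

Such a kernel would replace the Gaussian kernel in the SVR kernel matrix; the scales themselves become extra hyperparameters for the model selection discussed in item (2).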
     (4) A dynamic modeling approach based on support vector regression is presented for nonlinear multivariable systems. To handle the time-varying characteristics of industrial processes, a dynamic modeling approach for multi-input multi-output systems based on linear programming support vector regression is proposed, so that accurate models of the controlled objects can be established. It nonlinearly maps the original input variables and the past output variables, linearly superposes them, and solves the resulting problem with support vector regression; the model is then applied to complex industrial processes. Simulations on a poly(ethylene terephthalate) production process and on a second-order difference nonlinear multivariable system verify its validity.
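The "inputs plus past outputs" regressor described above is the standard NARX construction, which can be sketched as follows for a single-output channel. The lag orders and the toy series are illustrative assumptions; in the thesis these pairs would be fed to a linear programming SVR rather than to any particular solver shown here.

```python
def build_regressors(u, y, nu=2, ny=2):
    """Form NARX-style training pairs from input series u and output
    series y: regressor x(t) = [u(t-nu..t-1), y(t-ny..t-1)], target y(t).
    These pairs are what the dynamic SVR model is trained on."""
    X, T = [], []
    start = max(nu, ny)
    for t in range(start, len(y)):
        X.append(u[t - nu:t] + y[t - ny:t])
        T.append(y[t])
    return X, T

# Toy single-input single-output series: 6 samples give 4 training pairs,
# each regressor holding 2 past inputs and 2 past outputs.
u = [0.0, 1.0, 0.5, -0.5, 0.2, 0.8]
y = [0.0, 0.1, 0.6, 0.4, 0.0, 0.3]
X, T = build_regressors(u, y)
```

For a multi-output system the same construction is repeated per output channel, with each channel's regressor stacking the lags of all inputs and all past outputs.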
     (5) A nonlinear predictive control method based on support vector regression is given, built on the dynamic model above. Since support vector regression can approximate an arbitrary nonlinear system to arbitrary accuracy, a nonlinear predictive control method based on linear programming support vector regression with a Gaussian kernel is presented. It nonlinearly maps the original inputs and outputs, takes the linear programming SVR dynamic model as the predictive model of the control system, linearizes this model with a Taylor expansion, and derives the optimal control law of the predictive control system. Experiments on weakly and strongly nonlinear systems show that the approach exhibits excellent disturbance-rejection ability.
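The Taylor-linearization step can be sketched for a one-step horizon. A Gaussian-kernel SVR predictor ŷ = Σᵢ αᵢ K(x, xᵢ) + b is differentiated analytically in the current control input; minimizing (r − ŷ)² + λ·Δu² over the linearized model gives the closed form Δu = g·(r − ŷ₀)/(g² + λ). All coefficients, the one-step cost, and the toy model below are illustrative assumptions; the thesis derives its control law for a full prediction horizon.

```python
import math

def svr_predict(x, centers, alpha, b, gamma):
    """Gaussian-kernel SVR predictor: y = sum_i alpha_i * K(x, x_i) + b."""
    def K(a, c):
        return math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(a, c)))
    return sum(ai * K(x, ci) for ai, ci in zip(alpha, centers)) + b

def one_step_control(x0, u_idx, r, centers, alpha, b, gamma, lam=0.1):
    """Linearize the SVR model in the control input u = x0[u_idx] and solve
    min (r - y)^2 + lam*du^2 -> du = g*(r - y0) / (g^2 + lam)."""
    y0 = svr_predict(x0, centers, alpha, b, gamma)
    g = 0.0  # analytic gradient dy/du of the Gaussian-kernel model at x0
    for ai, ci in zip(alpha, centers):
        k = math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(x0, ci)))
        g += ai * k * (-2.0 * gamma * (x0[u_idx] - ci[u_idx]))
    return g * (r - y0) / (g * g + lam)

# Toy trained model: two support vectors, setpoint r = 0.9.
centers = [[0.0, 0.0], [1.0, 1.0]]
alpha, b, gamma = [0.8, -0.3], 0.1, 0.5
du = one_step_control([0.5, 0.2], u_idx=0, r=0.9,
                      centers=centers, alpha=alpha, b=b, gamma=gamma)
```

Applying the computed Δu moves the model output toward the setpoint; in a receding-horizon loop this calculation is repeated at every sampling instant around the updated regressor.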
