Least Squares Support Vector Regression and Its Application to Water Quality Prediction
Abstract
In water environment management and planning research, the prediction of river water quality is an important and widely studied problem. The diversity of water flow and the uncertainty of point-source pollution make the flow regime of river networks highly complex. Moreover, the structural parameters and boundary conditions of a water environment system are time-varying and complex, and the information we can obtain about the system is usually incomplete, which makes water quality prediction difficult. Statistical learning methods can build, from existing measured data, a mapping between the factors that influence river water quality and the water quality itself, and thereby predict river water quality.
     As a new-generation machine learning method grounded in statistical learning theory, the support vector machine handles practical problems involving small samples, nonlinearity, high dimensionality, and local minima well, and has been successfully applied to classification, regression, time-series prediction, and other fields. The least squares support vector machine (LS-SVM), proposed by Suykens on the basis of the support vector machine, not only achieves high modeling accuracy and good generalization ability on many problems but also effectively reduces the computational complexity of the algorithm. Nevertheless, some aspects of the LS-SVM algorithm and its application to water quality prediction remain insufficiently studied, for example the proper selection of inputs for a regression prediction model and the relatively large prediction error on peak samples.
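To make the LS-SVR model referenced throughout this abstract concrete, the following minimal sketch solves the standard LS-SVR dual problem, which reduces training to a single linear system in the bias b and the support values α. The RBF kernel, the γ and σ values, and all function names are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVR training is one linear system:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvr_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Unlike the standard SVM quadratic program, every training point becomes a support vector here, which is what motivates the sparseness and large-sample work discussed later in the abstract.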
     In the context of river water quality prediction, this dissertation studies several problems related to the least squares support vector regression (LS-SVR) algorithm. The main research contents and results are as follows:
     1) The proper selection of inputs for a regression prediction model is studied. Based on entropy in information theory, an input selection algorithm using partial mutual information is proposed. Given the input variables already selected, the algorithm decides whether to keep a candidate input by estimating the degree of association between that candidate and the model output. Input selection results on several linear and nonlinear test cases show that the algorithm correctly identifies the input variables of the prediction model and eliminates redundant inputs; moreover, the order in which inputs are selected reflects their importance to the model output. Application to a real problem shows that the selected input variables capture the behavior of the system.
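As an illustration of the greedy partial-mutual-information idea described above, the sketch below scores each candidate by the mutual information between its residual and the output's residual, both conditioned on the inputs chosen so far. The histogram MI estimator and the linear conditioning step are simplifying assumptions; the thesis's estimator is not specified in the abstract.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    # Plug-in histogram estimate of I(X;Y) in nats
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

def residual(v, Z):
    # Remove the part of v explained (here: linearly) by selected inputs Z
    if Z.shape[1] == 0:
        return v - v.mean()
    A = np.column_stack([np.ones(len(v)), Z])
    coef, *_ = np.linalg.lstsq(A, v, rcond=None)
    return v - A @ coef

def pmi_select(X, y, k):
    # Greedy forward selection: pick the candidate whose residual,
    # conditioned on the inputs chosen so far, shares the most
    # information with the residual of the output.
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        Z = X[:, selected]
        ry = residual(y, Z)
        scores = {j: mutual_information(residual(X[:, j], Z), ry)
                  for j in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected  # order reflects estimated importance
```

The conditioning step is what separates this from plain mutual-information ranking: an input that merely duplicates an already-selected one contributes little residual information and is rejected as redundant.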
     2) A least squares support vector regression algorithm for peak prediction is proposed, which can improve the fitting and prediction accuracy of water quality in peak regions. After analyzing how the distribution of the training samples affects the fitting error of LS-SVR, the fitting error is corrected, following the weighted least squares idea, with a sample distribution density factor and a sample amplitude factor. This improves the fitting accuracy of LS-SVR on peak processes and reduces the prediction error on peak samples. The algorithm's performance is verified on several test cases and it is then applied to water quality prediction. The results show that, while maintaining overall prediction accuracy, the peak-prediction LS-SVR algorithm significantly improves accuracy on peak samples: the mean absolute percentage error on peak samples is reduced by more than 27% compared with the standard LS-SVR algorithm.
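The weighted LS-SVR step can be sketched as follows: scaling each squared error by a per-sample weight changes only the diagonal of the dual system, so samples with larger weights are fitted more tightly. The amplitude-based weight function below is a hypothetical stand-in for the thesis's density and amplitude factors, whose exact forms the abstract does not give.

```python
import numpy as np

def rbf(X1, X2, sigma=0.3):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def weighted_lssvr_fit(X, y, weights, gamma=100.0, sigma=0.3):
    # Weighted LS-SVR: error e_i^2 is scaled by v_i, so the dual becomes
    # [[0, 1^T], [1, K + diag(1/(gamma*v))]] [b; a] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.diag(1.0 / (gamma * weights))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def amplitude_weights(y, boost=5.0):
    # Illustrative amplitude factor (an assumption, not the thesis's):
    # samples far from the mean level get up to `boost` times the weight.
    s = np.abs(y - y.mean())
    return 1.0 + (boost - 1.0) * s / s.max()
```

Because the dual residual of sample i is a_i / (γ v_i), raising v_i in the peak region directly shrinks the fitting error there, at the cost of a slightly looser fit elsewhere.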
     3) The efficiency of the LS-SVR algorithm on large sample sets is studied, a fast large-sample LS-SVR algorithm is proposed, and it is applied to saltwater-intrusion water quality prediction. Using an unsupervised kernel hard clustering method, the algorithm first clusters the training set in the high-dimensional feature space, with a generalized Euclidean distance as the similarity measure. The cluster-center samples are then selected as support vectors, and the Nyström method is used to approximate the kernel Gram matrix of the original learning machine in the low-dimensional support vector sample space, yielding an approximate solution of the original problem. Function fitting tests and an application to predicting chloride content during saltwater intrusion show that the algorithm improves the computational efficiency of LS-SVR on large-sample learning problems by more than 50 times without a noticeable loss of fitting and prediction accuracy.
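A minimal sketch of the reduce-then-approximate idea: cluster the training set, keep the cluster centers as support vectors, and solve a Nyström-style subset-of-regressors system of size m ≪ n. Plain k-means in input space is used here for simplicity, whereas the thesis clusters in the kernel-induced feature space, so this (and the omitted bias term) is an approximation of the described method.

```python
import numpy as np

def rbf(X1, X2, sigma=0.2):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kmeans_centers(X, m, iters=10, seed=0):
    # Plain k-means in input space (the thesis clusters in feature space)
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=m, replace=False)].copy()
    for _ in range(iters):
        labels = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            members = X[labels == j]
            if len(members):
                C[j] = members.mean(axis=0)
    return C

def fast_lssvr_fit(X, y, m=20, gamma=100.0, sigma=0.2):
    # Keep m cluster centers as support vectors and solve the reduced
    # subset-of-regressors system of size m instead of n (bias omitted).
    C = kmeans_centers(X, m)
    Knm = rbf(X, C, sigma)   # n x m cross-kernel
    Kmm = rbf(C, C, sigma)   # m x m kernel among support vectors
    A = Knm.T @ Knm + Kmm / gamma + 1e-10 * np.eye(m)
    a = np.linalg.solve(A, Knm.T @ y)
    return C, a

def fast_lssvr_predict(C, a, X_new, sigma=0.2):
    return rbf(X_new, C, sigma) @ a
```

The full LS-SVR system is (n+1)×(n+1), so for n in the thousands the O(n³) solve dominates; the reduced system is only m×m, which is where the reported 50-fold speedup comes from.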
     4) To address the limited mapping capability and flexibility of single-kernel LS-SVR, a multiple-kernel LS-SVR algorithm with grouped features is proposed and applied to predicting the oxygen demand (CODMn) of water. The algorithm maps all input variables sharing homogeneous features into the high-dimensional feature space with a common mapping function before building the regression model, transforms the resulting fitting optimization problem into a semi-infinite linear program, and solves it with an exchange-set method. Function fitting tests and the CODMn prediction application show that the grouped-feature multiple-kernel LS-SVR reduces the mean absolute percentage error by more than 17% compared with the standard LS-SVR algorithm.
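The grouped-feature multiple-kernel model can be sketched by giving each homogeneous feature group its own RBF kernel and combining them as K = Σ_g μ_g K_g. For simplicity the combination weights μ_g are fixed and equal below, whereas the thesis learns them by semi-infinite linear programming; the group split, kernel widths, and weights are all illustrative assumptions.

```python
import numpy as np

def rbf(X1, X2, sigma):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def combined_kernel(groups_a, groups_b, sigmas, mus):
    # K = sum_g mu_g * K_g: one RBF kernel per homogeneous feature group
    return sum(mu * rbf(Ga, Gb, s)
               for Ga, Gb, s, mu in zip(groups_a, groups_b, sigmas, mus))

def mk_lssvr_fit(groups, y, sigmas, mus, gamma=200.0):
    # Standard LS-SVR dual system, but on the combined multi-kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = combined_kernel(groups, groups, sigmas, mus) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def mk_lssvr_predict(groups_train, b, alpha, groups_new, sigmas, mus):
    return combined_kernel(groups_new, groups_train, sigmas, mus) @ alpha + b
```

A sum of positive-weighted kernels is itself a valid kernel, so the solver is unchanged; only the Gram matrix construction differs from single-kernel LS-SVR, while each group keeps a kernel width suited to its own scale.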
