支持向量回归机模型选择研究及在综合力学环境预示中的应用 (Research on Model Selection for Support Vector Regression and Its Application in Combined Mechanical Environment Prediction)
Abstract

In fields such as aerospace and automobile manufacturing, effective prediction of structural response is of great practical significance. One of the key problems in predicting structural response is how to analyze and simulate the mechanical environment in which the structure operates. Because a structure in service is usually exposed to a combined mechanical environment containing vibration, shock, acoustic, aerodynamic and other loads, commonly used numerical methods such as the finite element method and statistical energy analysis cannot reproduce this complex environment accurately and therefore have difficulty predicting the vibration response, while direct experimental measurement is often hard to repeat because of the complexity of the environment.
     Because a mapping relationship exists between the responses of a linear system under different boundary conditions, the vibration response under one boundary condition can be used to predict the vibration response under another. The key to environment prediction based on this mapping relationship is the ability to predict unknown data, i.e. generalization ability; considering also the small sample size, support vector regression (SVR) is adopted to build the mapping-relationship model. To raise the prediction accuracy of the mapping relationship, this thesis starts from improving the generalization ability of the mapping-relationship model and studies SVR model selection algorithms in depth, aiming to improve the prediction of the mapping relationship and to obtain a more accurate and robust mapping-relationship model. The main work and contributions are as follows:
     1. A weighted SVR solution path algorithm is proposed to effectively extract the mapping-relationship model information contained in the response data. The method assigns each sample a weight according to its importance and adjusts the model according to these weights during model selection. To obtain reasonable weights, a PSO-based dynamic weight optimization method is further proposed, which turns the search for the optimal weights into a two-dimensional optimization problem whose objective is a generalization error bound, so that the optimal solution path model can be determined. Experimental results show that the algorithm effectively improves the generalization ability of the mapping-relationship model.
     2. For the parameter selection problem of the mapping-relationship model, a real-coded small-world optimization algorithm is proposed; it uses tabu search to construct the local search operator and has good global convergence. After analyzing the soundness of the LS-SVM leave-one-out generalization error bound as a measure of the generalization performance of regression models, an LS-SVM regression model selection algorithm based on the small-world optimization strategy is proposed. Experimental results show that the algorithm effectively prevents the hyperparameters from being trapped in local extrema and thus yields better model generalization ability.
     3. To improve prediction from noisy experimental data, the modeling of mapping relationships from noisy responses is studied from the perspective of transfer learning, and a regression transfer learning algorithm based on principal curves is proposed. The algorithm takes the principal curve as the part shared by several similar regression tasks and uses a weighted learning algorithm for model selection. With this algorithm, the mapping-relationship knowledge contained in response data of simple form can be transferred, effectively improving the modeling of response data with higher noise levels.
     4. In view of the multi-source transfer characteristics of structures, a multi-input multi-output modeling method based on SVR feature selection is proposed. The method treats mapping-relationship modeling as a multi-input multi-output problem, takes the response data over the whole time (or frequency) range as input features, and removes redundant response data with a feature selection algorithm based on an SVR generalization error bound. Experimental results show that the method effectively improves the generalization ability of the mapping-relationship model.
     5. So that test conditions can be derived from the predicted responses, a new load identification method is proposed that uses the SVR solution path algorithm for modeling. It is applicable to the inverse identification of loads with different spectral shapes and avoids the error perturbations caused by ill-conditioned matrices in the traditional frequency-domain method. Experimental results show that the method offers good identification accuracy and numerical stability.
Effective prediction of structural response plays an important role in the fields of aeronautics, astronautics, car manufacturing and so on. One of the key questions is how to analyze and simulate the true mechanical environment of the structure. Because structures in working condition are regularly excited by different kinds of loads, for example vibration, shock and noise, traditional numerical methods such as the finite element method and statistical energy analysis cannot predict the structural response effectively, since the combined dynamic environment is hard to reproduce. Moreover, direct experimental measurement is hard to repeat many times because of the complexity of the environment.
     There exists a mapping relationship between the responses of a linear system under different boundary conditions. Therefore, the structural vibration response under one boundary condition can be predicted from the response under another condition. The key part of environmental prediction based on the mapping relationship is the predictive power for new data, i.e. the generalization ability. Besides, considering the small sample size, this thesis uses support vector regression (SVR) to establish the mapping relationship (a minimal illustration is sketched after the abstract). Aiming at improving the predictive precision of the mapping-relationship model, this thesis studies the model selection algorithms of SVR, which is of great importance for improving both the generalization ability and the predictive performance of the mapping-relationship model. The main work and contributions are listed as follows.
     1. To extract the model information from the response data distribution, a weighted SVR solution path algorithm is proposed. According to each sample's importance, this algorithm sets a different weight on the error penalty parameter of each sample and optimizes the model by modifying the weights. Furthermore, a heuristic weight-setting optimization algorithm is proposed to compute the optimal weights using particle swarm optimization (PSO). The idea is to transform the primal problem into a global optimization problem with two variables; after solving this optimization problem, the optimal solution path model is determined (sketched after the abstract). Experimental results show that the proposed algorithm can improve the generalization ability of the mapping-relationship model effectively.
     2. To choose the optimal SVR parameters, a new real-coded small-world optimization algorithm is proposed. This algorithm employs tabu search to construct the local search operator and has good global convergence. Furthermore, based on an analysis of the effectiveness of the leave-one-out bound of LS-SVM on regression problems, a new model selection algorithm based on the small-world strategy is proposed for LS-SVM regression (sketched after the abstract). Experimental results show that the proposed algorithm keeps the hyperparameters from converging prematurely to local optima and achieves better generalization ability than traditional methods.
     3. To improve the predictive performance obtained from noisy experimental data, a new regression transfer learning algorithm based on principal curves is proposed from the perspective of transfer learning. This algorithm uses a non-parametric approach, computing a principal curve as the representation shared across multiple related regression tasks, and then weights the samples by means of this curve (sketched after the abstract). Numerical results demonstrate that the algorithm can extract the mapping-relationship model information from data with low noise and improve the model built from highly noisy response data.
     4. According to the transfer characteristics of multiple information sources, a new multiple-input multiple-output (MIMO) modeling method based on SVR feature selection is proposed. This method treats the modeling of the mapping relationship as a MIMO problem and reconstructs the input features from the whole-band frequency or time response data. To eliminate the negative influence of redundant response data, a feature selection algorithm based on an SVR generalization error bound is integrated into the MIMO modeling procedure (sketched after the abstract). Experimental results show that this method can improve the generalization ability effectively.
     5. To formulate experimental conditions by means of the predicted response data, a new dynamic load identification approach is proposed. This approach uses the SVR solution path algorithm as the modeling algorithm and can inversely identify a wide variety of mechanical loads with different spectral forms (sketched after the abstract). Numerical results demonstrate that this approach avoids the error disturbances caused by the ill-conditioned matrix in the traditional frequency-domain method and greatly outperforms common methods in terms of identification accuracy and numerical stability.
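To make the mapping idea concrete, the sketch below (in Python, with all signals and filter coefficients invented purely for illustration) trains a scikit-learn SVR that maps a short window of the response simulated under one boundary condition to the response of the same excitation under another; it shows only the regression setup, not the thesis's experimental procedure.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for measured vibration responses: one random base
# excitation passed through two different "boundary conditions", emulated
# here by two simple linear filters (an assumption made for the example).
n = 2000
excitation = rng.normal(size=n)
resp_a = np.convolve(excitation, [0.5, 0.3, 0.2], mode="same")       # condition A
resp_b = np.convolve(excitation, [0.2, 0.6, 0.1, 0.1], mode="same")  # condition B

# Each training point: a short window of the condition-A response as input,
# the condition-B response at the window centre as target.
win = 7
X = np.array([resp_a[i - win:i + win + 1] for i in range(win, n - win)])
y = resp_b[win:n - win]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The SVR learns the mapping between the two responses; C and epsilon would
# normally be chosen by the model selection procedures studied in the thesis.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```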
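A rough sketch of the weighted-SVR idea in contribution 1, with substitutions stated up front: scikit-learn's sample_weight argument plays the role of the per-sample error penalty (the solution path algorithm itself is not reproduced), a hold-out MSE stands in for the generalization error bound, and the weights come from an assumed two-parameter sigmoid of the residuals whose parameters are tuned by a minimal particle swarm.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)

# Toy data with a few heavily contaminated samples.
X = np.sort(rng.uniform(-3, 3, size=(200, 1)), axis=0)
y = np.sinc(X).ravel() + 0.05 * rng.normal(size=200)
outliers = rng.choice(200, size=15, replace=False)
y[outliers] += rng.normal(scale=1.0, size=15)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

# Residuals of an unweighted fit drive the per-sample weights.
base = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_tr, y_tr)
resid = np.abs(y_tr - base.predict(X_tr))

def weighted_val_error(params):
    """Hold-out MSE of a weighted SVR; (a, b) shape a sigmoid that
    down-weights samples with large residuals (a stand-in for the
    generalization error bound used in the thesis)."""
    a, b = params
    w = 1.0 / (1.0 + np.exp(a * (resid - b)))
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
    model.fit(X_tr, y_tr, sample_weight=w)
    return mean_squared_error(y_val, model.predict(X_val))

# Minimal particle swarm over the two weighting parameters (a, b).
n_particles, n_iter = 10, 20
pos = rng.uniform([0.5, 0.0], [20.0, 1.0], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([weighted_val_error(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, [0.5, 0.0], [20.0, 1.0])
    f = np.array([weighted_val_error(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("best (a, b):", gbest, "validation MSE:", pbest_f.min())
```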
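A sketch in the spirit of contribution 2's model selection, again with substitutions: LS-SVM is represented by its bias-free equivalent, kernel ridge regression, whose leave-one-out error has the exact PRESS closed form used below, and the small-world search is reduced to short local moves plus occasional long random jumps (the tabu-search local operator of the thesis is omitted).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy regression data.
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(2 * X).ravel() + 0.1 * rng.normal(size=120)

def rbf_kernel(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_press(log_params):
    """Exact leave-one-out error of kernel ridge regression (the bias-free
    form of LS-SVM) via the PRESS shortcut e_i = (y_i - yhat_i) / (1 - H_ii)."""
    gamma, lam = np.exp(log_params)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + lam * np.eye(len(X)))
    resid = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(resid ** 2))

# Small-world-flavoured search in log-parameter space: mostly short local
# moves around the current best point, with occasional long random jumps.
best = np.array([0.0, -2.0])                  # log(gamma), log(lambda)
best_f = loo_press(best)
for _ in range(200):
    if rng.random() < 0.15:                   # long-range jump
        cand = rng.uniform([-4.0, -8.0], [4.0, 2.0])
    else:                                     # short-range local move
        cand = best + 0.3 * rng.normal(size=2)
    f = loo_press(cand)
    if f < best_f:
        best, best_f = cand, f

print("gamma, lambda:", np.exp(best), "LOO MSE:", best_f)
```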
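A sketch of the transfer idea in contribution 3 under a strong simplification: the principal curve is replaced by a smoothing spline fitted to a low-noise source task, and the noisy target samples are weighted by their distance to this shared backbone before fitting a weighted SVR; the toy data and the Gaussian weighting rule are assumptions, not the thesis's algorithm.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Source task: the underlying trend observed with little noise.
x_src = np.sort(rng.uniform(0, 6, 150))
y_src = np.sin(x_src) + 0.03 * rng.normal(size=150)

# Target task: the same trend, much noisier and with fewer samples.
x_tgt = np.sort(rng.uniform(0, 6, 60))
y_tgt = np.sin(x_tgt) + 0.4 * rng.normal(size=60)

# A smoothing spline fitted to the source data serves as the shared
# "backbone" of the related tasks (a stand-in for the principal curve).
backbone = UnivariateSpline(x_src, y_src, s=0.5)

# Weight each noisy target sample by its distance to the backbone:
# samples far from the shared structure get small weights.
dist = np.abs(y_tgt - backbone(x_tgt))
w = np.exp(-(dist / dist.std()) ** 2)

model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
model.fit(x_tgt.reshape(-1, 1), y_tgt, sample_weight=w)

grid = np.linspace(0, 6, 200).reshape(-1, 1)
err = np.mean((model.predict(grid) - np.sin(grid.ravel())) ** 2)
print("MSE of the weighted target model against the true trend:", err)
```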
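A sketch of the feature selection step in contribution 4, with cross-validation error standing in for the SVR generalization error bound and plain greedy backward elimination standing in for the thesis's selection algorithm; the synthetic "bands" and the per-output SVR wrapper are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Synthetic multi-input multi-output data: 10 input "bands", only 4 of
# which actually drive the 3 outputs; the rest are redundant.
n, p = 300, 10
X = rng.normal(size=(n, p))
W = np.zeros((p, 3))
W[:4] = rng.normal(size=(4, 3))
Y = np.tanh(X @ W) + 0.05 * rng.normal(size=(n, 3))

def cv_error(feats):
    """Mean cross-validation error of a multi-output SVR restricted to
    `feats` (used here in place of the SVR generalization error bound)."""
    est = MultiOutputRegressor(SVR(kernel="rbf", C=5.0, epsilon=0.05))
    scores = cross_val_score(est, X[:, feats], Y, cv=3,
                             scoring="neg_mean_squared_error")
    return -scores.mean()

# Greedy backward elimination: drop the feature whose removal helps most,
# and stop when no removal improves the error estimate.
feats = list(range(p))
best = cv_error(feats)
improved = True
while improved and len(feats) > 1:
    improved = False
    for f in list(feats):
        trial = [g for g in feats if g != f]
        e = cv_error(trial)
        if e < best:
            best, feats, improved = e, trial, True
            break

print("selected bands:", feats, "CV error:", best)
```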
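A sketch of the load identification idea in contribution 5 on invented frequency-response data: an SVR is trained on response/load pairs so that the load spectrum is regressed directly from measured responses instead of inverting a possibly ill-conditioned FRF matrix; the FRFs, load spectra and noise level are assumptions, and a plain RBF SVR stands in for the solution path algorithm.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Hypothetical frequency response functions (FRFs) of two measurement
# points for a single load path; real FRFs would come from a modal test.
freq = np.linspace(1.0, 100.0, 200)
h1 = 1.0 / np.abs(40.0**2 - freq**2 + 1j * 4.0 * freq)
h2 = 1.0 / np.abs(70.0**2 - freq**2 + 1j * 6.0 * freq)

def responses(load_amp):
    """Noisy response amplitudes at the two points for a load spectrum."""
    y1 = h1 * load_amp * (1 + 0.02 * rng.normal(size=freq.size))
    y2 = h2 * load_amp * (1 + 0.02 * rng.normal(size=freq.size))
    return np.column_stack([freq, y1, y2])   # frequency line + two responses

# Training loads with simple spectral shapes; the test load has a new shape.
train_loads = [np.full(freq.size, 2.0), 1.0 + 0.02 * freq]
test_load = 1.0 + 3.0 * np.exp(-((freq - 50.0) / 8.0) ** 2)

X_tr = np.vstack([responses(l) for l in train_loads])
y_tr = np.concatenate(train_loads)

# The SVR learns the response-to-load map line by line, so no explicit
# inversion of a possibly ill-conditioned FRF matrix is needed.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.01))
model.fit(X_tr, y_tr)
pred = model.predict(responses(test_load))
print("mean relative error on the unseen load shape:",
      float(np.mean(np.abs(pred - test_load) / test_load)))
```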
