Research on Local-Model-Based Time Series Prediction Methods
Abstract
Time series analysis has long attracted broad attention from researchers worldwide and has become an active research topic of significant theoretical and practical value. Time series prediction is the principal task of time series analysis and is widely applied in fields such as industrial automation, hydrology, geology, stock markets, and military science.
At present, time series prediction relies mainly on global models, which suffer from low modeling efficiency, limited prediction performance, and high computational cost for real-time model updating. In recent years, techniques from data mining, pattern recognition, signal processing, and chaos theory have been incorporated into time series prediction: the series is partitioned in the time or frequency domain, and prediction models are built in each local time-frequency region. Local models not only improve prediction accuracy but also reduce both the complexity of the prediction model and the computational cost of building it. Nevertheless, many problems in local-model-based prediction remain open. This dissertation studies local modeling methods for time series prediction from two perspectives, the decomposition domain and the time domain, focusing on suppression of the end effect in empirical mode decomposition, local model selection and real-time updating in the decomposition domain, adaptive clustering of time series with arbitrarily shaped clusters, nonlinear feature extraction and fast attribute reduction for time series classification, and support vector prediction modeling in the local time domain with incremental updating. The main contributions of this dissertation are as follows:
First, to address the end (boundary) effect of empirical mode decomposition, this dissertation proposes a series-extension method based on similarity search. Exploiting the self-similarity of linear or nonlinear time series, the method searches the series for subsequences whose patterns are similar to those near the endpoints and uses the segments preceding or following the best matches to extend the series. The extended segments are therefore close to the series' likely continuation beyond the endpoints, which greatly reduces the end effect. Moreover, because a fast nearest-neighbor search algorithm is used to locate similar subsequences, the computational cost of the extension is low. Simulation experiments verify the effectiveness of this end-effect suppression method.
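To illustrate the idea, the following is a minimal sketch (not the dissertation's exact algorithm) of similarity-search-based extension at the right endpoint: the window adjacent to the endpoint is matched against earlier windows by Euclidean distance, and the samples that followed the best-matching window are appended as the likely continuation. The names `extend_by_similarity`, `pattern_len`, and `extend_len` are illustrative, and the brute-force scan would be replaced by a fast nearest-neighbor index in practice.

```python
import numpy as np

def extend_by_similarity(x, pattern_len=20, extend_len=10):
    """Extend a 1-D series beyond its right endpoint by reusing the
    continuation of the most similar earlier subsequence."""
    x = np.asarray(x, dtype=float)
    query = x[-pattern_len:]                      # pattern at the right endpoint
    best_dist, best_end = np.inf, None
    # candidate windows must leave room for `extend_len` samples after them
    for end in range(pattern_len, len(x) - extend_len):
        cand = x[end - pattern_len:end]
        d = np.linalg.norm(cand - query)
        if d < best_dist:
            best_dist, best_end = d, end
    follow = x[best_end:best_end + extend_len]    # samples that followed the best match
    follow = follow - follow[0] + x[-1]           # align the level with the series end
    return np.concatenate([x, follow])

# usage: extend before computing envelopes/IMFs, then discard the padding
t = np.linspace(0, 10, 500)
series = np.sin(2 * np.pi * 0.7 * t) + 0.1 * np.random.randn(500)
padded = extend_by_similarity(series, pattern_len=30, extend_len=15)
```

The same construction is applied symmetrically at the left endpoint, and the extension is discarded once the envelopes and IMFs have been computed.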
Second, in the decomposition domain this dissertation builds prediction models for each intrinsic mode function (IMF) component using radial basis function (RBF) neural networks and an incremental kernel-space independent-vector combination prediction algorithm; the decomposition, however, multiplies the burden of model parameter selection. To address this, parameter selection is carried out for only two components, and the parameters of the remaining components are computed from the relationship between the locally optimal parameter values and the individual IMFs, which greatly reduces the computational cost of prediction modeling in the decomposition domain. In addition, to overcome the slow real-time updating of RBF neural networks, the incremental kernel-space independent-vector combination prediction algorithm, whose computational complexity is low, is proposed for modeling each IMF component. Simulation experiments show that predicting each IMF component with the RBF neural network and the proposed incremental algorithm outperforms a single prediction model.
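As a concrete illustration of the decomposition-domain workflow, the sketch below assumes the IMF components are already available as the rows of an array `imfs` (for example, from an EMD implementation) and uses scikit-learn's `KernelRidge` with an RBF kernel as a stand-in for the dissertation's RBF network and incremental kernel-space combination predictor: each component is forecast separately from its own lagged vectors, and the component forecasts are summed. The helper names `embed` and `predict_by_components` and the hyperparameter values are illustrative.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge   # stand-in for the RBF/kernel predictor

def embed(x, order):
    """Turn a 1-D component into (lagged-vector, next-value) training pairs."""
    X = np.array([x[i:i + order] for i in range(len(x) - order)])
    y = x[order:]
    return X, y

def predict_by_components(imfs, order=6, steps=1):
    """Fit one kernel model per component and sum the component forecasts."""
    total = np.zeros(steps)
    for comp in imfs:                          # each row of `imfs` is one component
        comp = np.asarray(comp, dtype=float)
        X, y = embed(comp, order)
        model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5).fit(X, y)
        window = list(comp[-order:])
        for s in range(steps):                 # iterated one-step-ahead forecasting
            nxt = model.predict(np.array(window[-order:]).reshape(1, -1))[0]
            total[s] += nxt
            window.append(nxt)
    return total
```

In the dissertation's scheme the hyperparameters would be tuned on only two components and propagated to the rest through their relationship with the IMFs, rather than fixed per component as the constants above suggest.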
Third, because existing validity functionals for estimating the number of clusters often fail to identify the correct cluster number, this dissertation introduces the idea of regularization and proposes a penalty-based validity functional for cluster-number estimation. The curve of this functional versus the number of clusters is unimodal or approximately unimodal, which makes the estimated cluster number more accurate and more robust. Simulation experiments verify that the proposed validity functional can estimate the correct, or nearly correct, number of clusters.
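The following sketch shows the general shape of a penalty-based cluster-number criterion; the concrete penalty term is illustrative and is not the functional proposed in the dissertation. For each candidate number of clusters k, the within-cluster compactness is computed with k-means, a term that grows with k is added, and the k minimizing the penalized index is returned.

```python
import numpy as np
from sklearn.cluster import KMeans

def penalized_index(X, k_max=10, lam=0.05):
    """Estimate the cluster number by minimizing compactness plus a
    penalty that grows with k (illustrative penalty form)."""
    n = len(X)
    scores = {}
    for k in range(2, k_max + 1):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        compactness = km.inertia_ / n          # mean within-cluster scatter
        scores[k] = compactness + lam * k      # penalty discourages large k
    return min(scores, key=scores.get), scores

# usage on three well-separated 2-D blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0, 3, 6)])
k_hat, curve = penalized_index(X)
print(k_hat)   # expected to be 3 for this toy data
```

The point of the penalty is that the resulting curve over k is unimodal or nearly so, so the minimization is robust rather than drifting toward ever-larger cluster numbers.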
Fourth, to address the still-high computational complexity of existing rough-set attribute reduction algorithms, this dissertation proposes a fast rough-set attribute reduction algorithm based on function mapping. The algorithm uses a nearest-neighbor search with a gradually shrinking search space to map each sample quickly to its indiscernibility class, which substantially lowers the computational complexity of the original algorithms. Simulation results show that the proposed algorithm scales well with both the data size and the data dimensionality.
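For reference, the sketch below computes the basic rough-set quantities involved, indiscernibility classes and the dependency degree obtained from the positive region, on a toy decision table; a plain dictionary grouping stands in for the fast sample-to-class function mapping with shrinking-space nearest-neighbor search proposed in the dissertation. The names `indiscernibility_classes` and `dependency` are illustrative.

```python
from collections import defaultdict

def indiscernibility_classes(table, attrs):
    """Group row indices by their values on `attrs` (dictionary lookup
    stands in for the fast sample-to-class mapping)."""
    classes = defaultdict(list)
    for i, row in enumerate(table):
        key = tuple(row[a] for a in attrs)
        classes[key].append(i)
    return list(classes.values())

def dependency(table, cond_attrs, decision_attr):
    """Fraction of samples in the positive region of the decision attribute."""
    positive = 0
    for block in indiscernibility_classes(table, cond_attrs):
        decisions = {table[i][decision_attr] for i in block}
        if len(decisions) == 1:            # consistent block => in positive region
            positive += len(block)
    return positive / len(table)

# usage on a toy decision table: columns 0-2 are conditions, column 3 the decision
table = [
    (1, 0, 1, "yes"),
    (1, 0, 1, "yes"),
    (0, 1, 1, "no"),
    (0, 1, 0, "no"),
    (1, 1, 0, "yes"),
]
print(dependency(table, cond_attrs=[0, 1], decision_attr=3))   # 1.0: {0, 1} already determines the decision
```

The grouping step is the part the dissertation accelerates with its function mapping and shrinking-space nearest-neighbor search, which is what gives the reported scalability with data size and dimensionality.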
