Research on Manifold Learning Based Fault Diagnosis Methods for Rotating Machinery
Abstract
The basic task of fault diagnosis is to extract feature parameters from the machine condition signals collected by sensors, design decision functions, and finally determine the fault state; its core is therefore feature extraction and pattern recognition. Because machinery operates under complex conditions and harsh environments for long periods, its condition signals are characterized by large data volumes, strong nonlinearity, and heavy noise and interference. Handling and processing such massive, complex data is increasingly difficult: on the one hand the amount of data that can be acquired keeps growing, while on the other hand it is hard to obtain more information that actually supports decisions. Manifold learning methods, which have been developing since 2000, extend data analysis and state decision-making from Euclidean space to manifolds, so that the essential features of data sets distributed on high-dimensional manifolds can be mined efficiently, the underlying laws that generate the data can be uncovered, and accurate fault diagnosis can be achieved. The main research work of the thesis is as follows:
     For nonlinear noise reduction, three manifold-learning denoising methods are proposed. The local tangent space denoising method based on intrinsic dimension reduces the high-dimensional phase-space data directly to the intrinsic-dimension space in which the signal's principal manifold lies, and then recovers the one-dimensional signal; this removes the arbitrariness in choosing the target dimension when the principal manifold is extracted. The local tangent space mean-reconstruction denoising method reconstructs the low-dimensional data obtained after local tangent space denoising back into the original high-dimensional space by averaging each point over the global data set, which amounts to averaging the local coordinates of each point; it is in essence a second round of denoising and avoids the phase-point distortion that arises when the phase-space data are aligned globally. The local tangent space denoising method based on high-order cumulants exploits the property that high-order cumulants can, in theory, completely suppress Gaussian colored noise, and constructs the covariance matrix from a fourth-order cumulant function instead of the second-order moment function, improving the denoising of signals contaminated by colored noise.
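     As an illustration only, the following minimal Python sketch shows the basic step shared by these methods: delay embedding, projection of each phase point onto a local tangent space, and reconstruction of a one-dimensional signal. The embedding parameters (m, tau), the target dimension d, the neighborhood size k, and the simple coordinate-averaging reconstruction are illustrative assumptions; the mean-reconstruction and fourth-order-cumulant variants of the thesis are not shown.

import numpy as np

def delay_embed(x, m, tau):
    """Embed a 1-D signal x into an m-dimensional phase space (delay coordinates)."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def local_tangent_denoise(X, d, k):
    """Project every phase point onto the d-dimensional tangent space of its k neighbors."""
    Y = np.empty_like(X)
    for i, xi in enumerate(X):
        idx = np.argsort(np.linalg.norm(X - xi, axis=1))[:k]    # k nearest neighbors (incl. itself)
        Xi = X[idx]
        mu = Xi.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xi - mu, full_matrices=False)  # local PCA / tangent basis
        V = Vt[:d].T
        Y[i] = mu + V @ V.T @ (xi - mu)                          # keep only the principal-manifold part
    return Y

def reconstruct_signal(Y, m, tau, n_total):
    """Average the delay coordinates back into a one-dimensional signal."""
    rec = np.zeros(n_total)
    cnt = np.zeros(n_total)
    for i in range(m):
        sl = slice(i * tau, i * tau + Y.shape[0])
        rec[sl] += Y[:, i]
        cnt[sl] += 1
    return rec / np.maximum(cnt, 1)

# synthetic test signal: two sinusoids plus white noise
t = np.linspace(0, 1, 2000)
clean = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
noisy = clean + 0.4 * np.random.randn(t.size)

m, tau, d, k = 10, 3, 2, 30          # illustrative choices, not tuned values
X = delay_embed(noisy, m, tau)
denoised = reconstruct_signal(local_tangent_denoise(X, d, k), m, tau, noisy.size)
print("noise power before/after:", np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))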
     To address the problem that the projection basis vectors obtained by solving the asymmetric eigen-equation of local Fisher discriminant analysis are not orthogonal, which makes data reconstruction difficult, local Fisher discriminant methods based on iterative orthogonalization and on Schur orthogonalization are proposed. Constructing an orthogonal basis by iterative orthogonalization or Schur decomposition effectively preserves the neighborhood-distance structure of the fault-signal manifold and retains class information while the principal features are extracted, so that the extracted features keep or even reduce the within-class scatter while pushing the between-class feature distances as far apart as possible, leading to better fault diagnosis.
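     The minimal sketch below illustrates the idea of orthogonalizing the local Fisher projection basis. The heat-kernel affinity with a fixed sigma, the regularized generalized eigensolver, and the QR (Gram-Schmidt) step standing in for the iterative and Schur orthogonalization schemes are simplifying assumptions rather than the thesis formulation.

import numpy as np
from scipy.linalg import eigh

def lfda_scatters(X, y, sigma=1.0):
    """Locally weighted within-class and between-class scatter matrices."""
    n = X.shape[0]
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    A = np.exp(-D2 / (2 * sigma ** 2))                   # heat-kernel affinity between samples
    same = (y[:, None] == y[None, :]).astype(float)
    nc = np.array([np.sum(y == c) for c in y], float)    # size of each sample's class
    Ww = same * A / nc[:, None]                          # local within-class pair weights
    Wb = same * A * (1.0 / n - 1.0 / nc[:, None]) + (1.0 - same) / n
    def scatter(W):
        S = np.zeros((X.shape[1], X.shape[1]))
        for i in range(n):
            for j in range(n):
                diff = (X[i] - X[j])[:, None]
                S += 0.5 * W[i, j] * (diff @ diff.T)
        return S
    return scatter(Ww), scatter(Wb)

def orthogonal_lfda(X, y, dim):
    Sw, Sb = lfda_scatters(X, y)
    # generalized eigenproblem Sb v = lambda Sw v; its eigenvectors are Sw-orthogonal, not orthonormal
    vals, vecs = eigh(Sb, Sw + 1e-6 * np.eye(Sw.shape[0]))
    V = vecs[:, np.argsort(vals)[::-1][:dim]]
    Q, _ = np.linalg.qr(V)                               # orthonormalize the basis (Gram-Schmidt/QR)
    return Q

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(40, 5)) for c in (0.0, 1.0, 2.0)])  # three fault classes
y = np.repeat([0, 1, 2], 40)
T = orthogonal_lfda(X, y, dim=2)
print("orthonormal basis:", np.allclose(T.T @ T, np.eye(2)))
Z = X @ T                                                # low-dimensional fault features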
     Kernel methods are then introduced into orthogonal local Fisher discriminant analysis, yielding kernel-based iteratively orthogonal and Schur-orthogonal local Fisher discriminant methods. The signals are mapped into a high-dimensional feature space by a nonlinear kernel function, and orthogonal local Fisher discriminant analysis is carried out in that space for fault feature extraction. This turns the linear manifold learning method into a nonlinear one and achieves better fault diagnosis results than the linear orthogonal methods.
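     A minimal sketch of the kernel step follows: an RBF Gram matrix is centered and factorized into explicit sample coordinates (an empirical kernel map, equivalent to kernel PCA scores), after which the linear orthogonal local Fisher discriminant of the previous sketch can be applied unchanged in that space. The choice of gamma and the eigendecomposition-based map are assumptions, not the thesis's exact kernel formulation.

import numpy as np

def rbf_kernel_map(X, gamma=1.0, dim=10):
    """Explicit coordinates of the samples in the RBF-kernel feature space (empirical kernel map)."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-gamma * D2)                              # Gram matrix k(x_i, x_j)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                                       # center the data in feature space
    vals, vecs = np.linalg.eigh(Kc)
    keep = np.argsort(vals)[::-1][:dim]
    lam = np.clip(vals[keep], 1e-12, None)
    return vecs[:, keep] * np.sqrt(lam)                  # rows = kernel-space coordinates

rng = np.random.default_rng(1)
# two classes that are not linearly separable (concentric rings)
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)]) + 0.1 * rng.standard_normal(200)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.repeat([0, 1], 100)

Phi = rbf_kernel_map(X, gamma=0.5, dim=10)
print(Phi.shape)
# Phi now plays the role of X in the linear method of the previous sketch:
#     T = orthogonal_lfda(Phi, y, dim=2);  Z = Phi @ T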
     A local margin Fisher discriminant method is proposed in which the Fisher discriminant function is built from local boundary (margin) points in each neighborhood: the local within-class and between-class scatters are computed directly from boundary point pairs rather than from all points, which greatly improves efficiency. To avoid interference from pseudo boundary points, a fuzzy clustering procedure is designed to locate the true local boundary. A kernel version turns the local fuzzy-clustering margin Fisher discriminant from a linear into a nonlinear method, and the kernel-based method shows stronger fault diagnosis capability.
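     The sketch below illustrates one way such boundary points might be selected: fuzzy c-means memberships are computed, samples whose largest membership is low (i.e., that sit between clusters) are kept as margin candidates, and a margin-style between-class scatter is then assembled from opposite-class boundary pairs. The two-cluster setting, the fuzzifier m = 2, and the 0.6 membership threshold are illustrative assumptions, not the thesis procedure.

import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means; returns the membership matrix U and the cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                    # rows of U sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return U, centers

rng = np.random.default_rng(2)
X = np.vstack([rng.normal([0, 0], 0.5, (60, 2)), rng.normal([2, 0], 0.5, (60, 2))])
y = np.repeat([0, 1], 60)

U, _ = fuzzy_cmeans(X, c=2)
is_boundary = U.max(axis=1) < 0.6                        # ambiguous membership = margin candidate
B, yb = X[is_boundary], y[is_boundary]
print("boundary points kept:", int(is_boundary.sum()), "of", len(X))

# a margin-style between-class scatter built only from boundary pairs of different
# classes (the within-class version is assembled analogously from same-class pairs)
Sb = np.zeros((2, 2))
for i, b in enumerate(B):
    other = B[yb != yb[i]]
    if len(other):
        j = np.argmin(np.linalg.norm(other - b, axis=1))  # nearest opposite-class boundary point
        diff = (b - other[j])[:, None]
        Sb += diff @ diff.T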
     For supervised manifold learning, the incremental local tangent space alignment (ILTSA) and linear local tangent space alignment (LLTSA) methods are improved and combined with a nonlinear support vector machine classifier, giving supervised ILTSA-SVM and supervised LLTSA-SVM fault diagnosis methods. These methods remedy the limited generalization ability of nonlinear manifold learning and strengthen the fault diagnosis capability of manifold learning methods.
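     A minimal sketch of the manifold-features-plus-SVM pipeline is given below. scikit-learn's unsupervised LTSA embedding, its built-in out-of-sample transform, and the synthetic two-class vibration-like signals are stand-ins for the thesis's supervised LLTSA/ILTSA features and for real bearing data.

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def make_signal(fault, n=512):
    """Synthetic vibration-like signal; the 'faulty' class carries an extra modulated component."""
    t = np.arange(n) / 1000.0
    x = np.sin(2 * np.pi * 30 * t)
    if fault:
        x += 0.8 * np.sign(np.sin(2 * np.pi * 7 * t)) * np.sin(2 * np.pi * 180 * t)
    x += 0.3 * rng.standard_normal(n)
    return np.abs(np.fft.rfft(x))[:64]                   # simple spectral feature vector

X = np.array([make_signal(f) for f in [0] * 100 + [1] * 100])
y = np.array([0] * 100 + [1] * 100)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

ltsa = LocallyLinearEmbedding(n_neighbors=12, n_components=3, method="ltsa", eigen_solver="dense")
Ztr = ltsa.fit_transform(Xtr)                            # low-dimensional manifold features
Zte = ltsa.transform(Xte)                                # out-of-sample mapping for new signals

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(Ztr, ytr)
print("test accuracy:", clf.score(Zte, yte))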
