Research on Several Problems and Applications of Support Vector Data Description
Abstract
Statistical learning theory studies the characteristics of machine learning with finite samples and provides a complete, unified theoretical framework for finite-sample learning problems. The Support Vector Machine (SVM) is a learning method developed on this foundation; built on the structural risk minimization principle and readily combined with many other machine learning techniques, it has shown performance superior to many alternatives in practice. Support Vector Data Description (SVDD) is a data description method derived from statistical learning theory and SVM. Unlike SVM, which seeks an optimal separating hyperplane, SVDD seeks the smallest hypersphere enclosing all target samples. This supervised one-class classifier is widely applied to fault detection, industrial and medical inspection, network security, target class recognition, intrusion detection, face recognition, and other fields, and has become an active research topic in machine learning in recent years.
     However, since SVDD is a relatively new theory in machine learning, many aspects of it remain immature and call for further study. Among them, research on SVDD learning methods is one of the key and difficult issues; the commonly used SVDD methods are supervised. Aiming to improve its learning capability, this thesis studies several problems of SVDD from both the unsupervised and semi-supervised perspectives, covering new learning algorithms, improved learning accuracy, data preprocessing, and extended applications. The main results are as follows:
     (1) To address the inability of conventional SVDD to accurately describe the distribution of target data in the unsupervised setting, we propose AIKCSVDD (Artificial Immune Kernel Cluster-based SVDD), a support vector data description method based on artificial immune kernel clustering. AIKCSVDD takes the memory antibodies produced by artificial immune kernel clustering as the target data points and then learns with SVDD. On one hand, AIKCSVDD organically combines the strength of kernel clustering in separating classes with unclear boundaries with the global convergence of immune network clustering and its independence from prior knowledge; on the other hand, because learning is performed on memory antibodies rather than the raw data, the global distribution of the original data can be captured better without specifying the number of classes in advance.
     (2) To address the inability of conventional SVDD to accurately describe the distribution of target data in the semi-supervised setting, we propose a weighted support vector data description method based on semi-supervised learning. In practice, large amounts of data with known class information are usually hard to obtain. To describe an unknown data set accurately from little known information, we apply label propagation and weighting to SVDD: a semi-supervised label propagation algorithm first learns the information hidden in a large amount of unlabeled data from the known labels, and a weighted SVDD then learns the latent classification of the data set. Experimental results show that, with little known information, this method clearly outperforms conventional SVDD.
     (3) Building on the semi-supervised work above, we study semi-supervised learning in depth and, starting from the classical kNN (k-Nearest Neighbor) classifier, present a kNN classification method based on semi-supervised weighted distance metric learning. To find a suitable distance metric from limited labeled data, we use Relevant Component Analysis (RCA) to learn a Mahalanobis metric. However, conventional RCA depends heavily on the amount of labeled class information and may produce a biased metric when labels are scarce or erroneous, so we use semi-supervised learning to overcome this limitation. The method learns a Mahalanobis distance from a very small amount of labeled information via label propagation and a weighting algorithm, and then applies it to kNN classification. Experimental results show that, with very few labels, the method outperforms kNN with the Euclidean distance.
     (4) Considering the high dimensionality and non-uniform distribution of data in application areas such as fault diagnosis, this thesis presents a support vector data description method based on LLE with a kernel distance metric. To uncover meaningful low-dimensional structure hidden in high-dimensional observations and extract features that are easier to recognize, the method applies LLE for dimensionality reduction during data preprocessing. Because LLE requires dense sampling and the Euclidean distance often performs poorly in high-dimensional sparse spaces, the Euclidean metric in LLE is replaced with a kernel-space distance; the improved LLE then reduces the dimensionality of the data set so that the reduced data better preserve the original data manifold. Finally, SVDD is applied to the reduced data. Fault detection experiments based on SVDD show that the method is particularly suitable for high-dimensional, non-uniformly distributed application data.
     In summary, this thesis studies several problems and applications of support vector data description; the new methods proposed here have theoretical significance and practical value for improving the learning capability of SVDD. Future work will refine and deepen the present results and integrate them into engineering practice.
Statistical learning theory investigates the characteristics of learning from finite samples and provides a complete, consistent theoretical framework for such problems. Built on this theory, the Support Vector Machine (SVM) is a learning method based on the structural risk minimization principle; it combines readily with many other machine learning techniques and outperforms many alternatives in practice. Support Vector Data Description (SVDD) is a new method derived from statistical learning theory and SVM. Unlike SVM, which seeks a separating hyperplane, SVDD seeks the smallest hypersphere enclosing the target data. As a classical one-class classifier, or data description method, SVDD is widely applied in fault detection, industrial and medical diagnosis, network security, target class identification, intrusion detection, face recognition, and other fields, and has become a hot topic in machine learning in recent years.
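The smallest-enclosing-hypersphere idea can be made concrete through SVDD's decision rule. The sketch below is illustrative rather than the thesis's implementation: it assumes the dual coefficients alpha and the squared radius have already been obtained from a QP solver, and it evaluates the squared kernel-space distance from a test point to the sphere center a = Σ_i α_i φ(x_i).

```python
import math

def rbf(x, y, sigma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))

def svdd_sq_distance(z, X, alpha, kernel=rbf):
    """Squared kernel-space distance from z to the SVDD center
    a = sum_i alpha_i * phi(x_i):
        ||phi(z) - a||^2 = k(z,z) - 2*sum_i alpha_i k(z,x_i)
                           + sum_{i,j} alpha_i alpha_j k(x_i,x_j)
    """
    kzz = kernel(z, z)
    cross = sum(a_i * kernel(z, x_i) for a_i, x_i in zip(alpha, X))
    center = sum(a_i * a_j * kernel(x_i, x_j)
                 for a_i, x_i in zip(alpha, X)
                 for a_j, x_j in zip(alpha, X))
    return kzz - 2.0 * cross + center

def svdd_accept(z, X, alpha, radius_sq, kernel=rbf):
    """A test point is a target if it lies inside the hypersphere."""
    return svdd_sq_distance(z, X, alpha, kernel) <= radius_sq
```

For a single support vector with alpha = [1.0], the center coincides with that point in feature space, so its own distance is zero while a far-away point is rejected.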
     However, SVDD is still immature in many respects and requires further research, since it is a fairly new theory in machine learning. Among these directions, SVDD's learning algorithm is a key and difficult part. In this thesis we aim to improve SVDD's learning capability in the unsupervised and semi-supervised settings, exploring several problems of SVDD around new learning algorithms, improved learning accuracy, data preprocessing, and extended applications. The details are as follows:
     (1) To solve the inaccurate description produced by conventional SVDD in the unsupervised setting, we propose AIKCSVDD, a support vector data description method based on artificial immune kernel clustering. It takes the memory antibodies generated by the artificial immune kernel clustering algorithm as target data and then uses SVDD for multi-class classification. On one hand, immune kernel clustering requires no prior knowledge and better recognizes data with unclear boundaries; on the other hand, using memory antibodies as target data reflects the global distribution of the original data better, without knowing the number of clusters in advance.
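The core idea of training SVDD on cluster representatives instead of the raw data can be sketched as follows. Note the hedge: the thesis uses memory antibodies from an artificial immune kernel clustering network, whereas this toy sketch substitutes plain Lloyd k-means centroids as stand-in representatives, purely to illustrate the data-reduction step.

```python
def kmeans(points, centroids, iters=20):
    """Plain Lloyd k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster. The returned
    centroids play the role of the 'representatives' that would be fed
    to SVDD in place of the raw data."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c))
                     for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids
```

On two well-separated 1-D groups, the two centroids land on the group means, giving a much smaller target set that still reflects the global distribution.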
     (2) To improve the classification precision of conventional Support Vector Data Description when little class information is available, we propose a semi-supervised weighted support vector data description method. It uses graph-based semi-supervised learning to infer the potential class information of a large number of unlabeled data from a small amount of labeled data, then applies weighted Support Vector Data Description to learn a classifier for the whole data set. Experiments on UCI data sets show that the method is effective when very little class information is known.
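The graph-based semi-supervised step can be sketched as a minimal iterative label propagation: each unlabeled node repeatedly takes the average label distribution of its graph neighbors while labeled nodes stay clamped. This is a simplified sketch of the general technique, not the thesis's exact weighted algorithm.

```python
def propagate_labels(adj, labels, n_classes, iters=100):
    """Iterative graph label propagation.

    adj       : n x n 0/1 adjacency matrix
    labels    : dict {node_index: class_index} of known labels
    n_classes : number of classes
    Returns the predicted class index for every node."""
    n = len(adj)
    f = [[0.0] * n_classes for _ in range(n)]
    for i, y in labels.items():
        f[i][y] = 1.0                      # one-hot for labeled nodes
    for _ in range(iters):
        new_f = []
        for i in range(n):
            if i in labels:
                new_f.append(f[i][:])      # clamp known labels
                continue
            nbrs = [j for j in range(n) if adj[i][j]]
            new_f.append([sum(f[j][c] for j in nbrs) / len(nbrs)
                          for c in range(n_classes)])
        f = new_f
    return [max(range(n_classes), key=lambda c: fi[c]) for fi in f]
```

On a 4-node chain 0-1-2-3 with only the two endpoints labeled, the two interior nodes inherit the class of their nearer endpoint, which is exactly the "learn hidden information of unlabeled data from known labels" behavior described above.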
     (3) k-Nearest Neighbor (kNN) classification is one of the most popular machine learning techniques, but it often performs poorly because of scarce label information, an inappropriate distance metric, or the presence of many irrelevant features. To handle these issues, we introduce a semi-supervised distance metric learning method for kNN classification. The method uses a semi-supervised label propagation algorithm to obtain more label information from a tiny amount of initial labels, employs an improved weighted RCA to learn a Mahalanobis distance function, and finally replaces the Euclidean distance of the kNN classifier with the learned Mahalanobis metric. Experiments on UCI data sets demonstrate the effectiveness of the method.
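The final step, swapping the Euclidean distance in kNN for a learned Mahalanobis metric, can be sketched directly. Here the matrix M is assumed to come from the metric-learning stage (RCA or otherwise); the toy M below is hand-picked to down-weight an irrelevant feature and is not a learned value.

```python
def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) for a
    positive semi-definite matrix M; M = I recovers Euclidean."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    return sum(d[i] * M[i][j] * d[j] for i in range(n) for j in range(n))

def knn_predict(query, X, y, M, k=1):
    """kNN classification under the metric induced by M."""
    order = sorted(range(len(X)),
                   key=lambda i: mahalanobis_sq(query, X[i], M))
    votes = [y[i] for i in order[:k]]
    return max(set(votes), key=votes.count)
```

With one relevant feature and one noisy feature, Euclidean kNN is misled by the noise dimension, while a metric that down-weights it recovers the correct neighbor, which is the effect the learned Mahalanobis distance is meant to achieve.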
     (4) In real applications such as fault diagnosis, data often have very high dimensionality and non-uniform distributions. We propose a new method combining LLE with a kernel distance metric and SVDD to handle such data. To mine meaningful low-dimensional information hidden in high-dimensional data and extract better classification features, we use LLE for dimensionality reduction during preprocessing. Because LLE requires dense sampling and the Euclidean distance gives unsatisfactory results in high-dimensional sparse spaces, we replace it with a distance metric in kernel space, which preserves the original data manifold better in low dimensions. SVDD is then applied to the reduced data. Experimental results show that the proposed method performs better on high-dimensional, non-uniformly distributed data.
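The kernel-space distance that replaces the Euclidean metric in the modified LLE follows from the kernel trick: ||φ(x) − φ(y)||² = k(x,x) − 2k(x,y) + k(y,y). The sketch below assumes a Gaussian RBF kernel, a common choice; the abstract does not state which kernel the thesis actually uses.

```python
import math

def rbf(x, y, sigma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))

def kernel_distance_sq(x, y, kernel=rbf):
    """Squared distance between phi(x) and phi(y) in the kernel
    feature space, used for neighbor selection in the modified LLE:
        ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y)."""
    return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)
```

For an RBF kernel this distance is bounded in [0, 2] and still increases monotonically with the Euclidean gap, but it saturates for far-away points, which dampens the distortion caused by sparsely sampled regions of a high-dimensional space.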
     In summary, this thesis studies several problems and applications of the support vector data description method. The work has theoretical and practical significance for improving SVDD's learning capability. In future work, besides refining the current results, we plan to study SVDD more deeply and apply the results to real applications.