Research on Information Fusion Methods in Multiple Classifier Systems
Abstract
Classification problems are inherently uncertain. Although multiple classifier fusion has performed well in reducing the generalization error of classification systems and simplifying classifier design, and research at home and abroad has produced many results, many key theoretical and technical problems remain to be solved and refined. Moreover, because labeled samples are hard to obtain, multiple classifier fusion has in recent years expanded from the traditional "supervised" setting into two younger, highly uncertain fields: "unsupervised" fusion (also called "clustering fusion" or cluster ensembles) and "semi-supervised" fusion (where labeled samples are insufficient). As a result, more and more researchers worldwide have turned to multiple classifier fusion, making it a very active research topic.
     Focusing on information fusion in multiple classifier systems, and building on a comprehensive survey and analysis of the research status and working mechanisms of classifier fusion, this thesis proposes several algorithms that further improve the performance of classifier fusion systems and broaden their range of application. The main contributions are as follows:
     First, to overcome the limitation that traditional multiple classifier design frameworks apply only to labeled data, a general adaptive framework for multiple classifier design is proposed. The framework integrates design methods for all data conditions (no label information initially, a small amount of labeled data in the middle stage, and sufficient labeled data later) and adapts automatically: for a given application, it adjusts itself according to the stage the data is in (unsupervised, semi-supervised, or supervised).
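The adaptive switching described above amounts to a dispatch on the labelling state of the data. A minimal sketch (the function name and the 20% threshold are illustrative assumptions, not values from the thesis):

```python
def choose_fusion_stage(n_labeled: int, n_total: int, min_ratio: float = 0.2) -> str:
    """Pick a fusion stage from the labelling state of the data.

    The 0.2 cut-off between 'a small amount' and 'sufficient' labeled
    data is an illustrative assumption.
    """
    if n_labeled == 0:
        return "unsupervised"      # clustering fusion (cluster ensemble)
    if n_labeled < min_ratio * n_total:
        return "semi-supervised"   # co-training style fusion
    return "supervised"            # e.g. fuzzy-integral fusion
```

As more samples get labeled over the life of an application, repeated calls to this dispatcher move the system through the three stages without any manual redesign.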
     Second, for the supervised fusion method based on the fuzzy integral, the key problem of determining the fuzzy densities is studied, including the first detailed comparative study of two typical static determination methods. On this basis, a dynamic adaptive fuzzy integral fusion algorithm is proposed. It selects the initial member classifiers using a new diversity measure based on fuzzy measures, chooses reasonable initial fuzzy densities, and introduces correction coefficients that adjust the fuzzy densities adaptively and dynamically. This both reduces the scale of the fusion and improves overall performance.
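This contribution builds on fuzzy-integral fusion with a Sugeno lambda-measure. As a point of reference, a minimal generic combiner can be sketched as follows; the bisection solver and the interface are standard textbook machinery, not the adaptive algorithm proposed in the thesis:

```python
def lambda_measure(densities, iters=200):
    """Solve prod_i(1 + lam*g_i) = 1 + lam for the Sugeno lambda-measure."""
    s = sum(densities)
    if abs(s - 1.0) < 1e-12:
        return 0.0                   # densities already sum to one: additive case
    def f(lam):
        p = 1.0
        for g in densities:
            p *= 1.0 + lam * g
        return p - (1.0 + lam)
    if s < 1.0:                      # densities under-sum: root lies in lam > 0
        lo, hi = 1e-12, 1.0
        while f(hi) <= 0.0:
            hi *= 2.0
    else:                            # densities over-sum: root lies in -1 < lam < 0
        lo, hi = -1.0 + 1e-12, -1e-12
    for _ in range(iters):           # plain bisection on the sign change
        mid = 0.5 * (lo + hi)
        if (f(mid) < 0.0) == (f(lo) < 0.0):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def sugeno_integral(supports, densities):
    """Sugeno integral of one class's classifier supports w.r.t. the lambda-measure."""
    lam = lambda_measure(densities)
    order = sorted(range(len(supports)), key=lambda i: -supports[i])
    g_A, best = 0.0, 0.0
    for i in order:                  # grow the coalition of top-ranked classifiers
        g_A = densities[i] + g_A + lam * densities[i] * g_A
        best = max(best, min(supports[i], g_A))
    return best
```

For three classifiers with densities (0.3, 0.4, 0.2) and per-class supports, taking the argmax of the per-class integrals gives the fused decision; the fuzzy densities are exactly what the thesis's algorithm initializes and then corrects dynamically.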
     Third, to address the information distortion that can arise in unsupervised classifier fusion, an average mutual information method based on an information rolling mechanism is proposed and validated experimentally; however, as the ensemble grows, the method proves vulnerable to interference from noisy clustering members. The problem is therefore generalized, and a matching method based on the ant colony algorithm is proposed, whose advantage becomes more pronounced as the ensemble scales up, pointing toward a future solution for mismatched cluster labels. Building on this, a new unsupervised classifier fusion algorithm is proposed that balances clustering quality against member diversity: it adopts a new similarity measure, first edits the clustering members according to the measured similarity, then groups and selects them, and finally applies a new weighting function designed from each member's contribution to each category. Compared with other methods, it offers better stability and accuracy.
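For contrast with the thesis's algorithm (which adds its own similarity measure, member editing, and contribution-based weighting), a minimal generic cluster ensemble can be built from the classical evidence-accumulation idea: count how often each pair of samples is co-clustered, then cut that co-association graph. This sketch is generic background, not the proposed method:

```python
def co_association(partitions):
    """Evidence accumulation: C[i][j] is the fraction of base
    clusterings that place samples i and j in the same cluster."""
    m, n = len(partitions), len(partitions[0])
    C = [[0.0] * n for _ in range(n)]
    for labels in partitions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    C[i][j] += 1.0 / m
    return C

def consensus_labels(partitions, threshold=0.5):
    """Cut the co-association graph at `threshold` and return the
    connected components as consensus clusters (single-link style)."""
    C = co_association(partitions)
    n = len(C)
    labels, current = [-1] * n, 0
    for seed in range(n):
        if labels[seed] >= 0:
            continue
        stack = [seed]
        labels[seed] = current
        while stack:                 # flood-fill one component
            u = stack.pop()
            for v in range(n):
                if labels[v] < 0 and C[u][v] >= threshold:
                    labels[v] = current
                    stack.append(v)
        current += 1
    return labels
```

Note that co-association never compares label IDs across members, so it sidesteps the cluster-label mismatch that the ant colony matching method above is designed to resolve explicitly.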
     Fourth, for the two families of semi-supervised classifier fusion algorithms (multi-view and single-view), two statistics-based co-training algorithms are proposed. On the multi-view side, an improved statistical co-training algorithm applies kernel canonical correlation analysis (KCCA) to the variable groups of the two views and, during the KCCA computation, uses class label information to generate reward and penalty factors, so that the correlation between extracted features of same-class samples is maximized while that between different-class samples is minimized; labeling is finally performed by a doubly confirmed voting scheme. Experiments show the method is particularly effective when labeled samples make up a small proportion of the data. On the single-view side, a new co-training algorithm uses the least significant difference (LSD) hypothesis test to ensure that the three member classifiers differ significantly from one another in pairs, applies Dempster-Shafer (D-S) evidence theory to stabilize the labeling, and then uses the local outlier factor (LOF) detection algorithm to remove mislabeled noise samples, largely preserving the purity of the newly labeled set. Experiments confirm that the method achieves high classification accuracy and stability.
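Both variants extend the basic co-training loop, in which each view's learner labels the unlabeled points it is most confident about and feeds them back into the shared pool. A minimal two-view sketch with a toy one-feature nearest-centroid base learner (everything here is illustrative; the thesis's versions add KCCA, the LSD test, D-S fusion, and LOF editing on top of this skeleton):

```python
class Centroid:
    """Toy one-feature nearest-centroid learner standing in for a member
    classifier; assumes at least two classes in the training labels."""
    def fit(self, xs, ys):
        self.mu = {}
        for c in set(ys):
            vals = [x for x, y in zip(xs, ys) if y == c]
            self.mu[c] = sum(vals) / len(vals)
        return self
    def predict(self, x):
        return min(self.mu, key=lambda c: abs(x - self.mu[c]))
    def confidence(self, x):
        d = sorted(abs(x - m) for m in self.mu.values())
        return d[1] - d[0]           # margin between the two nearest centroids

def co_train(view1, view2, labels, rounds=5, per_round=1):
    """Blum-Mitchell style co-training: on each pass, each view's learner
    labels its most confident unlabeled point (label -1 = unlabeled)."""
    labels = list(labels)
    for _ in range(rounds):
        for view in (view1, view2):
            known = [(x, y) for x, y in zip(view, labels) if y != -1]
            clf = Centroid().fit(*zip(*known))
            pool = [i for i, y in enumerate(labels) if y == -1]
            if not pool:
                return labels
            pool.sort(key=lambda i: -clf.confidence(view[i]))
            for i in pool[:per_round]:
                labels[i] = clf.predict(view[i])
    return labels
```

The skeleton makes the failure mode plain: one confident mislabel poisons both learners on later rounds, which is exactly why the thesis edits the newly labeled set with LOF before accepting it.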
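The D-S evidence combination used to stabilize labeling in the single-view algorithm follows Dempster's rule of combination. A minimal sketch for two mass functions over frozenset focal elements (standard textbook form, not the thesis's implementation):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal elements, accumulate products,
    discard the conflicting mass, and renormalize."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + a * b
            else:
                conflict += a * b    # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}
```

In the co-training setting, each member classifier contributes a mass function over the candidate labels for an unlabeled sample; combining them concentrates belief on labels the members agree on, which is what makes the resulting annotation more stable than a plain vote.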
