Research on Classification Methods Based on Evidence Theory
Abstract

Given the huge volumes of data and information generated and accumulated today, people are no longer satisfied with simple querying and statistics; the demand for intelligent, automated data analysis keeps growing. Advances in data mining and machine learning make it possible to extract potentially valuable information and knowledge, automatically and intelligently, from large amounts of uncertain and noisy data. Classification, as a core method of data mining, machine learning and pattern recognition, helps people make predictions about acquired data instances and recognize unknown patterns for further analysis and use.
     Evidence theory is a powerful tool for representing and processing uncertain problems and knowledge. Combining evidence theory with classification methods improves a classifier's ability to analyze, process and represent uncertain data, and introducing its evidence-handling mechanisms and combination rules into classifier design can also raise classification accuracy. Current research on evidence-theoretic classification follows two main lines: first, embedding the evidence representation and combination machinery of evidence theory into the design of a single classifier to improve its performance; second, applying the combination rules of evidence theory to multi-classifier ensembles, fusing the outputs of several classifiers to achieve higher accuracy. This thesis studies classification methods based on evidence theory in depth; the main contributions are as follows:
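The combination rule at the heart of this framework, Dempster's rule, can be sketched as follows. This is a minimal illustration, not the thesis code: mass functions are represented as dicts mapping frozenset focal elements to masses.

```python
def dempster_combine(m1, m2):
    """Combine two basic belief assignments (BBAs) with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements -> mass.
    Products of masses with non-empty intersections are accumulated;
    mass assigned to conflicting (empty) intersections is renormalized away.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict  # normalization constant; zero means total conflict
    return {h: v / k for h, v in combined.items()}

# Two pieces of evidence over the frame {A, B}:
m1 = {frozenset('A'): 0.7, frozenset('AB'): 0.3}
m2 = {frozenset('A'): 0.5, frozenset('B'): 0.2, frozenset('AB'): 0.3}
m = dempster_combine(m1, m2)  # mass concentrates on {A}
```

Note how the rule rewards agreement: both sources lean towards A, so the combined mass on {A} exceeds either source's individual commitment.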
     (1) On combining the evidence representation and combination machinery of evidence theory with classifier design, and building on the evidence-theoretic k-NN rule and local mean-based classification, a Subspace Local Mean Evidence Classifier (SLMEC) is proposed. For each class, SLMEC treats the local mean vector of the test sample's nearest training neighbors, together with its distance to the test sample, as a piece of evidence for classification. To gather more useful evidence, it also collects local mean evidence in several feature subspaces obtained by repeated random, equal-sized partitions of the feature space. Each piece of evidence is converted into a basic belief assignment according to the distance between the local mean vector and the test sample; all assignments are then pooled with Dempster's rule, and the class label is assigned from the combined belief. The novelty lies in creating evidence not only in the whole feature space but also in its subspaces, and in treating the local mean vector and its distance to the test sample as the evidence. Because it uses local mean vectors as evidence, SLMEC is robust to noise and imbalanced data; because it fuses evidence from the full space and the subspaces, it reaches higher accuracy and performs well on high-dimensional data.
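A sketch of how this kind of distance-based evidence can be turned into basic belief assignments. This is a simplified illustration, not the thesis formulation: the parameterisation m({c}) = alpha * exp(-gamma * d^2), with alpha and gamma as assumed tuning constants, follows the general style of Denoeux's evidential k-NN.

```python
import numpy as np

def local_mean_bbas(x, X, y, k=3, alpha=0.9, gamma=1.0):
    """For each class, take the k nearest training samples to x, form
    their local mean vector, and convert its distance to x into a
    simple mass function over {class c} and the whole frame Theta.
    A small distance yields strong support for that class; the
    remaining mass expresses ignorance."""
    bbas = {}
    for c in np.unique(y):
        Xc = X[y == c]
        d = np.linalg.norm(Xc - x, axis=1)
        local_mean = Xc[np.argsort(d)[:k]].mean(axis=0)   # local mean vector
        dm = np.linalg.norm(local_mean - x)               # its distance to x
        support = alpha * np.exp(-gamma * dm ** 2)
        bbas[int(c)] = {'class': support, 'Theta': 1.0 - support}
    return bbas  # one BBA per class, ready to be pooled by Dempster's rule
```

In the full method these per-class (and per-subspace) mass functions would then be combined with Dempster's rule before deciding the label.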
     (2) Building on SLMEC and combining it with the k-nearest-neighbor local hyperplane classifier, a Random Subspace Evidence Classifier (RSEC) is further proposed. Like SLMEC, RSEC collects evidence in randomly generated subspaces as well as in the whole feature space to support the classification decision, but it uses the locally constructed hyperplane of each class, and its distance to the test sample, as the evidence. Experiments on UCI benchmark data, synthetic data and a high-dimensional face recognition task show that RSEC achieves good classification performance and, like SLMEC, handles imbalanced and high-dimensional data well. These results further confirm the effectiveness of combining evidence collected in the original feature space with evidence collected in subspaces.
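The local-hyperplane evidence can be illustrated by the distance from a test sample to the affine hull of one class's nearest neighbors, a minimal sketch in the spirit of the K-local hyperplane distance of Vincent and Bengio; solving for the projection by least squares is one standard way to compute it, not necessarily the thesis's.

```python
import numpy as np

def local_hyperplane_distance(x, neighbors):
    """Distance from test sample x to the local hyperplane (affine hull)
    spanned by one class's k nearest neighbors. The projection onto the
    hull is found by least squares."""
    N = np.asarray(neighbors, dtype=float)
    mean = N.mean(axis=0)
    V = (N - mean).T                          # directions spanning the hull
    coef, *_ = np.linalg.lstsq(V, np.asarray(x, float) - mean, rcond=None)
    projection = mean + V @ coef
    return float(np.linalg.norm(np.asarray(x, float) - projection))

# Three neighbors spanning the plane z = 0: a point at height 2 lies
# at distance 2 from that local hyperplane.
d = local_hyperplane_distance([0.3, 0.3, 2.0], [[0, 0, 0], [1, 0, 0], [0, 1, 0]])
```

A smaller distance means the class's local hyperplane explains the test sample better, so it yields stronger evidence for that class.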
     (3) On applying the combination rules of evidence theory to multi-classifier ensembles, two improved random forest algorithms are presented, in which an evidence-theoretic combination method replaces the simple voting used in the classical random forest. The first directly takes the measurement-level output of each decision tree base classifier (its class-probability estimate) as a basic belief assignment and combines the trees with Dempster's rule, which makes the combination straightforward; the second combines the decision trees using Rogova's evidence-theoretic classifier ensemble method. Experiments show that both evidence-based variants generalize noticeably better than the voting-based random forest.
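The first variant can be sketched as follows: each tree's class-probability vector is discounted so that part of its mass goes to Theta (total ignorance), and the resulting simple mass functions are fused with Dempster's rule. This is a hedged illustration with an assumed discount factor, not the thesis implementation.

```python
import numpy as np

def combine_tree_outputs(probas, discount=0.9):
    """Fuse measurement-level outputs (per-tree class-probability vectors)
    with Dempster's rule, where each source's focal elements are the
    singleton classes plus the frame Theta. 'discount' scales each vector
    so that 1 - discount of its mass expresses ignorance."""
    m = np.zeros(len(probas[0]))      # start from the vacuous BBA:
    m_theta = 1.0                     # all mass on Theta
    for p in probas:
        s = discount * np.asarray(p, dtype=float)
        s_theta = 1.0 - s.sum()
        # with singleton + Theta focal sets, Dempster's rule reduces to:
        new = m * s + m * s_theta + m_theta * s
        new_theta = m_theta * s_theta
        norm = new.sum() + new_theta  # = 1 - conflict
        m, m_theta = new / norm, new_theta / norm
    return m, m_theta                 # predicted class: m.argmax()

# Three trees, two of which favor class 0:
masses, ignorance = combine_tree_outputs([[0.8, 0.2], [0.7, 0.3], [0.4, 0.6]])
```

Unlike majority voting, this combination weighs how confident each tree is, so two confident trees can outweigh one mildly dissenting tree.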
     (4) Building on the improved random forests, and addressing two basic questions of ensemble learning, namely how to increase the accuracy and diversity of the individual base classifiers and how to combine them better, an Evidence Theory based Diversity Forest (EDF) is proposed. EDF uses decision trees as base classifiers and produces diversity through the stacked effect of the random subspace method, Bagging and an axis rotation based on principal component analysis; at the combination stage it replaces traditional voting with an evidence-theoretic combination method. Experiments on UCI benchmark data, synthetic data and a speech emotion recognition application show that EDF outperforms decision-tree ensembles such as random forest, decision forest and rotation forest.
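The stacked perturbations that generate diversity in this method can be sketched as one "view" generator: bootstrap the rows (Bagging), draw a random feature subspace, then rotate the retained features with PCA. This is a schematic sketch under assumed parameters (e.g. the subspace fraction), not the thesis implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def diverse_view(X, subspace_frac=0.6):
    """Build one diversified training view for a decision-tree base
    classifier by stacking three perturbations: bootstrap resampling,
    a random feature subspace, and a PCA-based axis rotation."""
    n, d = X.shape
    rows = rng.integers(0, n, size=n)                      # Bagging
    n_feat = max(2, int(subspace_frac * d))                # random subspace
    cols = rng.choice(d, size=n_feat, replace=False)
    Xs = X[rows][:, cols]
    Xc = Xs - Xs.mean(axis=0)                              # center, then rotate
    _, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))  # PCA axes
    return Xc @ eigvecs, rows, cols

# Each call yields a differently resampled, projected and rotated copy of X;
# training one tree per view produces a diverse ensemble whose outputs can
# then be fused with an evidence-theoretic combination method.
```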
References
[1] Wang Jue, Zhou Zhi-Hua, Zhou Ao-Ying. Machine Learning and Its Applications [M]. Beijing: Tsinghua University Press, 2006
[2] Duda Richard O., Hart Peter E., Stork David G. Pattern Classification [M]. 2nd ed. Beijing: China Machine Press, 2003
[3] Denoeux Thierry. A k-nearest neighbor classification rule based on Dempster-Shafer theory [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 1995, 25(5): 737-760
[4] Denoeux Thierry. An evidence-theoretic neural network classifier [A]. IEEE International Conference on Intelligent Systems for the 21st Century [C]. 1995: 712-717
[5] Denoeux Thierry. A neural network classifier based on Dempster-Shafer theory [J]. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, 2000, 30(2): 131-150
[6] Denoeux Thierry. Analysis of evidence-theoretic decision rules for pattern classification [J]. Pattern Recognition, 1997, 30(7): 1095-1107
[7] Denoeux Thierry. Reasoning with imprecise belief structures [J]. International Journal of Approximate Reasoning, 1999, 20(1): 79-111
[8] Zouhal Lalla Meriem, Denoeux Thierry. An evidence-theoretic k-NN rule with parameter optimization [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 1998, 28(2): 263-271
[9] Liu Ming, Yuan Bao-Zong, Tang Xiao-Fang. A new method for determining the similarity parameter in the evidence-theoretic k-NN rule [J]. Acta Electronica Sinica, 2005, (04): 766-768
[10] Wang Zhuang, Hu Wei-Dong, Yu Wen-Xian. A quick evidential classification algorithm based on k-nearest neighbor rule [A]. International Conference on Machine Learning and Cybernetics [C]. 2003: 3248-3252
[11] Zhu Hongwei, Basir Otman. A K-NN associated fuzzy evidential reasoning classifier with adaptive neighbor selection [A]. Third IEEE International Conference on Data Mining [C]. 2003: 709-712
[12] Zhu Hongwei, Basir Otman. An adaptive fuzzy evidential nearest neighbor formulation for classifying remote sensing images [J]. IEEE Transactions on Geoscience and Remote Sensing, 2005, 43(8): 1874-1889
[13] Su Zhi-gang, Wang Pei-hong. A robust adaptive version of evidence-theoretic k-NN classification rule [A]. Sixth International Conference on Fuzzy Systems and Knowledge Discovery [C]. 2009: 525-529
[14] Su Zhi-Gang, Wang Pei-Hong. Improved adaptive evidential k-NN rule and its application for monitoring level of coal powder filling in ball mill [J]. Journal of Process Control, 2009, 19(10): 1751-1762
[15] Su Zhi-gang, Wang Pei-hong. Minimizing neighborhood evidential decision error for feature evaluation and selection based on evidence theory [J]. Expert Systems with Applications, 2012, 39(1): 527-540
[16] Denoeux T, Bjanger M Skarstein. Induction of decision trees from partially classified data using belief functions [A]. IEEE International Conference on Systems, Man, and Cybernetics [C]. 2000: 2923-2928
[17] Guo Huawei, Shi Wenkang, Du Feng. EDTs: Evidential decision trees [J]. Fuzzy Systems and Knowledge Discovery, 2005: 487-488
[18] Xu Lei, Krzyzak Adam, Suen Ching Y. Methods of combining multiple classifiers and their applications to handwriting recognition [J]. IEEE Transactions on Systems, Man and Cybernetics, 1992, 22(3): 418-435
[19] Rogova Galina. Combining the results of several neural network classifiers [J]. Neural Networks, 1994, 7(5): 777-781
[20] François J, Grandvalet Y, Denoeux T, et al. Resample and combine: an approach to improving uncertainty representation in evidential pattern classification [J]. Information Fusion, 2003, 4(2): 75-85
[21] Altınçay Hakan. On the independence requirement in Dempster-Shafer theory for combining classifiers providing statistical evidence [J]. Applied Intelligence, 2006, 25(1): 73-90
[22] Altınçay Hakan. Ensembling evidential k-nearest neighbor classifiers through multi-modal perturbation [J]. Applied Soft Computing, 2007, 7(3): 1072-1083
[23] Johansson Ronnie, Bostrom H, Karlsson Alexander. A study on class-specifically discounted belief for ensemble classifiers [A]. IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems [C]. 2008: 614-619
[24] Yang Yi, Han Chongzhao, Han Deqiang. Classifier fusion based on evidence theory and its application in face recognition [J]. Journal of Electronics (China), 2009, 26(6): 771-776
[25] Denoeux Thierry, Masson M-H. EVCLUS: evidential clustering of proximity data [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2004, 34(1): 95-109
[26] Masson Marie-Hélène, Denoeux T. ECM: An evidential version of the fuzzy c-means algorithm [J]. Pattern Recognition, 2008, 41(4): 1384-1397
[27] Masson Marie-Hélène, Denoeux Thierry. RECM: Relational evidential c-means algorithm [J]. Pattern Recognition Letters, 2009, 30(11): 1015-1026
[28] Masson Marie-Hélène, Denoeux Thierry. Ensemble clustering in the belief functions framework [J]. International Journal of Approximate Reasoning, 2011, 52(1): 92-109
[29] Younes Zoulficar, Abdallah Fahed, Denoeux Thierry. An evidence-theoretic k-nearest neighbor rule for multi-label classification [J]. Scalable Uncertainty Management, 2009: 297-308
[30] Younes Zoulficar, Abdallah Fahed, Denoeux Thierry. Evidential multi-label classification approach to learning from data with imprecise labels [J]. Computational Intelligence for Knowledge-Based Systems Design, 2010: 119-128
[31] Yang Feng-Bao, Wang Xiao-Xia. Combination Methods for Conflicting Evidence in D-S Evidence Theory [M]. Beijing: National Defense Industry Press, 2010
[32] Smets Philippe, Kennes Robert. The transferable belief model [J]. Artificial Intelligence, 1994, 66(2): 191-234
[33] Smets Philippe, Ristic Branko. Kalman filter and joint tracking and classification based on belief functions in the TBM framework [J]. Information Fusion, 2007, 8(1): 16-27
[34] Li Ye. Research on Ensemble Learning Based on Support Vector Machines [D]. Shanghai Jiao Tong University, 2007
[35] Kuncheva L.I. Combining classifiers: Soft computing solutions [J]. Pattern Recognition: From Classical to Modern Approaches, 2001: 427-451
[36] Zhang Chun-Xia. Research on Algorithms in Ensemble Learning [D]. Xi'an Jiaotong University, 2010
[37] Mitani Y, Hamamoto Y. A local mean-based nonparametric classifier [J]. Pattern Recognition Letters, 2006, 27(10): 1151-1159
[38] Ho Tin Kam. Nearest neighbors in random subspaces [J]. Advances in Pattern Recognition, 1998: 640-648
[39] Ho Tin Kam. The random subspace method for constructing decision forests [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(8): 832-844
[40] UCI Machine Learning Repository [DB]. 2007. Available from: http://www.ics.uci.edu/mlearn/MLRepository.html
[41] Keller James M, Gray Michael R, Givens James A. A fuzzy k-nearest neighbor algorithm [J]. IEEE Transactions on Systems, Man and Cybernetics, 1985, (4): 580-585
[42] Li Boyu, Chen Yun Wen, Chen Yan Qiu. The nearest neighbor algorithm of local probability centers [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2008, 38(1): 141-154
[43] Vincent Pascal, Bengio Yoshua. K-local hyperplane and convex distance nearest neighbor algorithms [J]. Advances in Neural Information Processing Systems, 2002, 2: 985-992
[44] Nanni Loris. Hyperplanes for predicting protein-protein interactions [J]. Neurocomputing, 2005, 69(1): 257-263
[45] Nanni Loris. A novel ensemble of classifiers for protein fold recognition [J]. Neurocomputing, 2006, 69(16): 2434-2437
[46] Yang Tao, Kecman Vojislav. Adaptive local hyperplane classification [J]. Neurocomputing, 2008, 71(13): 3001-3004
[47] Yang Tao, Kecman Vojislav, Cao Longbing. Classification by ALH-Fast algorithm [J]. Tsinghua Science & Technology, 2010, 15(3): 275-280
[48] Bi Yaxin, Guan Jiwen, Bell David. The combination of multiple classifiers using an evidential reasoning approach [J]. Artificial Intelligence, 2008, 172(15): 1731-1751
[49] Bi Yaxin, Bell David, Guan Jiwen. Combining evidence from classifiers in text categorization [A]. Knowledge-Based Intelligent Information and Engineering Systems [C]. 2004: 521-528
[50] Bi Yaxin, Bell David, Wang Hui, et al. Combining multiple classifiers using Dempster's rule of combination for text categorization [J]. Modeling Decisions for Artificial Intelligence, 2004: 11-22
[51] Bi Yaxin, Dubitzky Werner. An evidential approach in ensembles [A]. Proceedings of the 2006 ACM Symposium on Applied Computing [C]. 2006: 1-6
[52] Han Jiawei, Kamber Micheline. Data Mining: Concepts and Techniques [M]. 2nd ed. Beijing: China Machine Press, 2007
[53] Breiman L. Random forests [J]. Machine Learning, 2001, 45(1): 5-32
[54] Webb G.I. MultiBoosting: A technique for combining boosting and wagging [J]. Machine Learning, 2000, 40(2): 159-196
[55] Rodriguez J.J., Kuncheva L.I., Alonso C.J. Rotation forest: A new classifier ensemble method [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(10): 1619-1630
[56] Abdel Hady Mohamed Farouk, Schwenker Friedhelm, Palm Günther. Semi-supervised learning for tree-structured ensembles of RBF networks with co-training [J]. Neural Networks, 2010, 23(4): 497-509
[57] Al-Ani Ahmed, Deriche Mohamed. A new technique for combining multiple classifiers using the Dempster-Shafer theory of evidence [J]. Journal of Artificial Intelligence Research, 2002, 17: 333-361
[58] Almeida Rui Jorge, Kaymak Uzay. TS-models from evidential clustering [J]. Information Processing and Management of Uncertainty in Knowledge-Based Systems: Theory and Methods, 2010: 228-237
[59] Alpaydin Ethem. Introduction to Machine Learning [M]. Beijing: China Machine Press, 2009
[60] Antoine Violaine, Quost Benjamin, Masson M-H, et al. CECM: Constrained evidential c-means algorithm [J]. Computational Statistics & Data Analysis, 2012, 56(4): 894-914
[61] Aregui Astride, Denoeux Thierry. Fusion of one-class classifiers in the belief function framework [A]. 10th International Conference on Information Fusion [C]. 2007: 1-8
[62] Aregui Astride, Denoeux Thierry. Constructing consonant belief functions from sample data using confidence sets of pignistic probabilities [J]. International Journal of Approximate Reasoning, 2008, 49(3): 575-594
[63] Aregui Astride, Denoeux Thierry. Novelty detection in the belief functions framework [A]. Proceedings of IPMU [C]. 2006: 412-419
[64] Banerjee Tribeni Prasad, Das Swagatam. Multi-sensor data fusion using support vector machine for motor fault detection [J]. Information Sciences, 2012
[65] Bell D, Guan J, Bi Y. An evidential approach to classification combination for text categorisation [J]. Knowledge Mining, 2005: 13-22
[66] Benmokhtar Rachid, Huet Benoit. Perplexity-based evidential neural network classifier fusion using MPEG-7 low-level visual features [A]. Proceedings of the 1st ACM International Conference on Multimedia Information Retrieval [C]. 2008: 336-341
[67] Bostrom H, Johansson Ronnie, Karlsson Alexander. On evidential combination rules for ensemble classifiers [A]. 11th International Conference on Information Fusion [C]. 2008: 1-8
[68] Breiman L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140
[69] Castro Juan Luis. Local distance-based classification [J]. Knowledge-Based Systems, 2008, 21(7): 692-703
[70] Chen Chao, Liaw Andy, Breiman Leo. Using random forest to learn imbalanced data [J]. University of California, Berkeley, 2004
[71] Chou Te-Shun, Yen Kang K, An Liwei, et al. Fuzzy belief pattern classification of incomplete data [A]. IEEE International Conference on Systems, Man and Cybernetics [C]. 2007: 535-540
[72] Côme Etienne, Oukhellou Latifa, Denoeux Thierry, et al. Learning from partially supervised data using mixture models and belief functions [J]. Pattern Recognition, 2009, 42(3): 334-348
[73] Dave Deepika, Vashishtha Sumit. Efficient intrusion detection with KNN classification and DS theory [A]. Proceedings of All India Seminar on Biomedical Engineering [C]. 2012: 173-188
[74] Denoeux Thierry. Reasoning with imprecise belief structures [J]. International Journal of Approximate Reasoning, 1999, 20(1): 79-111
[75] Denoeux Thierry. Conjunctive and disjunctive combination of belief functions induced by nondistinct bodies of evidence [J]. Artificial Intelligence, 2008, 172(2): 234-264
[76] Denoeux Thierry, Masson Marie-Hélène. Evidential reasoning in large partially ordered sets [J]. Annals of Operations Research, 2012, 195(1): 135-161
[77] Denoeux Thierry, Smets Philippe. Classification using belief functions: relationship between case-based and model-based approaches [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2006, 36(6): 1395-1406
[78] Denoeux Thierry, Younes Zoulficar, Abdallah Fahed. Representing uncertainty on set-valued variables using belief functions [J]. Artificial Intelligence, 2010, 174(7): 479-499
[79] Denoeux Thierry, Zouhal Lalla Meriem. Handling possibilistic labels in pattern classification using evidential reasoning [J]. Fuzzy Sets and Systems, 2001, 122(3): 409-424
[80] Du Pufeng, Cao Shengjiao, Li Yanda. SubChlo: Predicting protein subchloroplast locations with pseudo-amino acid composition and the evidence-theoretic K-nearest neighbor (ET-KNN) algorithm [J]. Journal of Theoretical Biology, 2009, 261(2): 330-335
[81] Elouedi Zied, Mellouli Khaled, Smets Philippe. Belief decision trees: theoretical foundations [J]. International Journal of Approximate Reasoning, 2001, 28(2): 91-124
[82] Fang Li, Yi Chen, Chong Wang. An evidence theory decision tree algorithm for uncertain data [A]. 3rd International Conference on Genetic and Evolutionary Computing [C]. 2009: 393-396
[83] Gao Haidi, Shen Xiangjun, Jiang Zhongqiu, et al. Image subcategory classification based on Dempster-Shafer evidence theory [A]. International Conference on Computer Science & Service System [C]. 2012: 2289-2292
[84] Gao Qing-Bin, Wang Zheng-Zhi. Center-based nearest neighbor classifier [J]. Pattern Recognition, 2007, 40(1): 346-349
[85] Garcia Eric K, Feldman Sergey, Gupta Maya R, et al. Completely lazy learning [J]. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(9): 1274-1285
[86] Geurts Pierre, Ernst Damien, Wehenkel Louis. Extremely randomized trees [J]. Machine Learning, 2006, 63(1): 3-42
[87] Geurts Pierre, Louppe Gilles. Learning to rank with extremely randomized trees [A]. JMLR: Workshop and Conference Proceedings [C]. 2011
[88] Gromisz Marcin, Zadrożny Sławomir. Combining the results in pairwise classification using Dempster-Shafer theory: a comparison of two approaches [A]. Artificial Intelligence and Soft Computing [C]. 2010: 339-346
[89] Haenni Rolf. Shedding new light on Zadeh's criticism of Dempster's rule of combination [A]. 8th International Conference on Information Fusion [C]. 2005
[90] Han Deqiang, Han Chongzhao, Yang Yi. Multiple k-NN classifiers fusion based on evidence theory [A]. IEEE International Conference on Automation and Logistics [C]. 2007: 2155-2159
[91] Huang Kaizhu, Yang Haiqin, King Irwin, et al. Maxi-min margin machine: Learning large margin classifiers locally and globally [J]. IEEE Transactions on Neural Networks, 2008, 19(2): 260-272
[92] Huynh Van-Nam, Nguyen Tri Thanh, Le Cuong Anh. Adaptively entropy-based weighting classifiers in combination using Dempster-Shafer theory for word sense disambiguation [J]. Computer Speech & Language, 2010, 24(3): 461-473
[93] Witten Ian H., Frank Eibe. Data Mining: Practical Machine Learning Tools and Techniques [M]. 2nd ed. Beijing: China Machine Press, 2006
[94] Jędrzejowicz Joanna, Jędrzejowicz Piotr. Constructing ensemble classifiers from GEP-induced expression trees [J]. Next Generation Data Technologies for Collective Computational Intelligence, 2011: 167-193
[95] Kavousi Kaveh, Sadeghi Mehdi, Moshiri Behzad, et al. Evidence theoretic protein fold classification based on the concept of hyperfold [J]. Mathematical Biosciences, 2012
[96] Kessentini Yousri, Burger Thomas, Paquet Thierry. Constructing dynamic frames of discernment in cases of large number of classes [J]. Symbolic and Quantitative Approaches to Reasoning with Uncertainty, 2011: 275-286
[97] Kosinov Serhiy, Titov Ivan, Marchand-Maillet Stéphane. Large margin multiple hyperplane classification for content-based multimedia retrieval [J]. Learning Techniques for Processing Multimedia Content, 2005
[98] Le Cuong Anh, Huynh Van-Nam, Shimazu Akira, et al. Combining classifiers for word sense disambiguation based on Dempster-Shafer theory and OWA operators [J]. Data & Knowledge Engineering, 2007, 63(2): 381-396
[99] Lee Honglak, Battle Alexis, Raina Rajat, et al. Efficient sparse coding algorithms [J]. Advances in Neural Information Processing Systems, 2007
[100] Lefevre Eric, Colot Olivier, Vannoorenberghe Patrick. Belief function combination and conflict management [J]. Information Fusion, 2002, 3(2): 149-162
[101] Lefevre Eric, Colot Olivier, Vannoorenberghe P, et al. A generic framework for resolving the conflict in the combination of belief structures [A]. Proceedings of the Third International Conference on Information Fusion [C]. 2000
[102] Leistner Christian, Saffari Amir, Santner Jakob, et al. Semi-supervised random forests [A]. IEEE 12th International Conference on Computer Vision [C]. 2009: 506-513
[103] Lepetit Vincent, Fua Pascal. Keypoint recognition using randomized trees [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(9): 1465-1479
[104] Leverington DW, Moon WM. An evaluation of consensus neural networks and evidential reasoning algorithms for image classification [A]. IEEE International Geoscience and Remote Sensing Symposium [C]. 2002: 3474-3476
[105] Li Bicheng, Wang Bo, Wei Jun, et al. Efficient combination rule of evidence theory [A]. Multispectral Image Processing and Pattern Recognition [C]. 2001: 237-240
[106] Liang Xun, Chen Rong-Chang, Guo Xinyu. Pruning support vector machines without altering performances [J]. IEEE Transactions on Neural Networks, 2008, 19(10): 1792-1803
[107] Liu Ye-Zheng, Jiang Yuan-Chun, Liu Xiao, et al. CSMC: A combination strategy for multi-class classification based on multiple association rules [J]. Knowledge-Based Systems, 2008, 21(8): 786-793
[108] Liu Zhun-ga, Pan Quan, Dezert Jean. A new belief-based K-nearest neighbor classification method [J]. Pattern Recognition, 2012
[109] Mejdoubi Mustapha, Aboutajdine Driss, Kerroum Mounir Ait, et al. Combining classifiers using Dempster-Shafer evidence theory to improve remote sensing images classification [A]. 2011 International Conference on Multimedia Computing and Systems [C]. 2011: 1-4
[110] Monney Paul-André, Chan Moses, Romberg Paul. A belief function classifier based on information provided by noisy and dependent features [J]. International Journal of Approximate Reasoning, 2011, 52(3): 335-352
[111] Ni Qingshan, Wang Zhengzhi, Wang Xiaomin. Kernel K-local hyperplanes for predicting protein-protein interactions [A]. Fourth International Conference on Natural Computation [C]. 2008: 66-69
[112] Olshausen Bruno A. Emergence of simple-cell receptive field properties by learning a sparse code for natural images [J]. Nature, 1996, 381(6583): 607-609
[113] Olshausen Bruno A, Field David J. Sparse coding of sensory inputs [J]. Current Opinion in Neurobiology, 2004, 14(4): 481-487
[114] Breiman L., Friedman J.H., Olshen R.A., Stone C.J. Classification and Regression Trees [M]. Wadsworth International Group, 1984
[115] Oukhellou Latifa, Debiolles Alexandra, Denoeux Thierry, et al. Fault diagnosis in railway track circuits using Dempster-Shafer classifier fusion [J]. Engineering Applications of Artificial Intelligence, 2010, 23(1): 117-128
[116] Ozuysal Mustafa, Calonder Michael, Lepetit Vincent, et al. Fast keypoint recognition using random ferns [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(3): 448-461
[117] Ozuysal Mustafa, Fua Pascal, Lepetit Vincent. Fast keypoint recognition in ten lines of code [A]. IEEE Conference on Computer Vision and Pattern Recognition [C]. 2007: 1-8
[118] Kontschieder Peter, Rota Bulò Samuel, Bischof Horst, et al. Structured class-labels in random forests for semantic image labeling [J]. 2011
[119] Pal Nikhil R., Ghosh Susmita. Some classification algorithms integrating Dempster-Shafer theory of evidence with the rank nearest neighbor rules [J]. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, 2001, 31(1): 59-66
[120] Petit-Renaud Simon, Denoeux Thierry. Nonparametric regression analysis of uncertain and imprecise data using belief functions [J]. International Journal of Approximate Reasoning, 2004, 35(1): 1-28
[121] Polzlbauer G, Lidy Thomas, Rauber Andreas. Decision manifolds: a supervised learning algorithm based on self-organization [J]. IEEE Transactions on Neural Networks, 2008, 19(9): 1518-1530
[122] Quost Benjamin, Denoeux Thierry, Masson Mylene, et al. One-against-all classifier combination in the framework of belief functions [A]. Proceedings of IPMU [C]. 2006: 356-363
[123] Quost Benjamin, Denoeux Thierry, Masson Marie-Hélène. Pairwise classifier combination using belief functions [J]. Pattern Recognition Letters, 2007, 28(5): 644-653
[124] Quost Benjamin, Masson Marie-Hélène, Denoeux Thierry. Classifier fusion in the Dempster-Shafer framework using optimized t-norm based combination rules [J]. International Journal of Approximate Reasoning, 2011, 52(3): 353-374
[125] Schwing Alexander G, Zach Christopher, Zheng Yefeng, et al. Adaptive random forest: How many "experts" to ask before making a decision? [A]. IEEE Conference on Computer Vision and Pattern Recognition [C]. 2011: 1377-1384
[126] Shoyaib Mohammad, Abdullah-Al-Wadud M, Zahid Ishraque S, et al. Facial expression classification based on Dempster-Shafer theory of evidence [J]. Belief Functions: Theory and Applications, 2012: 213-220
[127] Tabassian Mahdi, Ghaderi Reza, Ebrahimpour Reza. Combining complementary information sources in the Dempster-Shafer framework for solving classification problems with imperfect labels [J]. Knowledge-Based Systems, 2011
[128] Tabassian Mahdi, Ghaderi Reza, Ebrahimpour Reza. Combination of multiple diverse classifiers using belief functions for handling data with imperfect labels [J]. Expert Systems with Applications, 2012, 39(2): 1698-1707
[129] Tulyakov Sergey, Jaeger Stefan, Govindaraju Venu, et al. Review of classifier combination methods [J]. Machine Learning in Document Analysis and Recognition, 2008: 361-386
[130] Vannoorenberghe Patrick. Reasoning with unlabeled samples and belief functions [A]. The 12th IEEE International Conference on Fuzzy Systems [C]. 2003: 814-818
[131] Vannoorenberghe Patrick. On aggregating belief decision trees [J]. Information Fusion, 2004, 5(3): 179-188
[132] Vannoorenberghe Patrick, Denoeux Thierry. Likelihood-based vs. distance-based evidential classifiers [A]. The 10th IEEE International Conference on Fuzzy Systems [C]. 2001: 320-323
[133] Vannoorenberghe P, Denoeux T. Handling uncertain labels in multiclass problems using belief decision trees [A]. Proceedings of IPMU [C]. 2002: 1919-1926
[134] Villamizar Michael, Moreno-Noguer Francesc, Andrade-Cetto Juan, et al. Shared random ferns for efficient detection of multiple categories [A]. 20th International Conference on Pattern Recognition [C]. 2010: 388-391
[135] Wang Hui, Bell David. Extended k-nearest neighbours based on evidence theory [J]. The Computer Journal, 2004, 47(6): 662-672
[136] Wang Lei, Khan Latifur, Thuraisingham Bhavani. An effective evidence theory based k-nearest neighbor (KNN) classification [A]. IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology [C]. 2008: 797-801
[137] Wang Xiaodong, Liu F, Jiao LC, et al. An evidential reasoning based classification algorithm and its application for face recognition with class noise [J]. Pattern Recognition, 2012
[138] Wu Zhaofu, Gao Fei. Image classification based on Dempster-Shafer evidence theory and neural network [A]. Second WRI Global Congress on Intelligent Systems [C]. 2010: 296-298
[139] Xiao Zhongzhe, Dellandrea Emmanuel, Dou Weibei, et al. ESFS: A new embedded feature selection method based on SFS [J]. Rapports de recherche, 2008
[140] Guan Xin, Yi Xiao, He You. An improved Dempster-Shafer algorithm for resolving the conflicting evidences [J]. International Journal of Information Technology, 2005, 11(12): 68-75
[141] Yazdani Ashkan, Ebrahimi Touradj, Hoffmann Ulrich. Classification of EEG signals using Dempster-Shafer theory and a k-nearest neighbor classifier [A]. 4th International IEEE/EMBS Conference on Neural Engineering [C]. 2009: 327-330
[142] Zheng Wenming, Zhao Li, Zou Cairong. Locally nearest neighbor classifiers for pattern classification [J]. Pattern Recognition, 2004, 37(6): 1307-1309
[143] Zhou Xiaofei, Shi Yong. Nearest neighbor convex hull classification method for face recognition [J]. Computational Science - ICCS 2009, 2009: 570-577
[144] Zhou Zhi-Hua, Yu Yang. Ensembling local learners through multimodal perturbation [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2005, 35(4): 725-735
[145] Cheng Li-Li. Research on Ensemble Learning Algorithms for Support Vector Machines [D]. Harbin Engineering University, 2009
[146] Guo Shan-Qing, Gao Cong, Yao Jian, et al. An intrusion detection model based on an improved random forests algorithm [J]. Journal of Software, 2005, (08): 1490-1498
[147] Li Ye, Cai Yun-Ze, Yin Ru-Po, et al. Multi-class classification support vector machine ensemble based on evidence theory [J]. Journal of Computer Research and Development, 2008, (04): 571-578
[148] Sun Huai-Jiang, Hu Zhong-Shan, Yang Jing-Yu. Research on multi-classifier fusion methods based on evidence theory [J]. Chinese Journal of Computers, 2001, (03): 231-235
[149] Wang Ai-Ping, Wan Guo-Wei, Cheng Zhi-Quan, et al. An incremental extremely random forest classifier for online learning [J]. Journal of Software, 2011, (09): 2059-2074
[150] Wang Qing. Research on Several Key Problems in Ensemble Learning [D]. Fudan University, 2011
[151] Wang Xiao-Xia, Yang Feng-Bao. Evidence combination method based on conflict intensity and non-normalization [J]. Computer Engineering and Applications, 2006, (30): 78-80
[152] Xing Qing-Hua, Lei Ying-Jie, Liu Fu-Xian. An evidential reasoning combination rule with proportional allocation of conflict [J]. Control and Decision, 2004, (12): 1387-1390
