Rough Computation Models and Algorithms for Knowledge Discovery from Heterogeneous Data
Abstract
Machine learning and knowledge discovery are among the most important research directions in artificial intelligence, and the uncertainty and inconsistency of information in complex environments are the main difficulties knowledge discovery faces. Rough set theory models the granulation and approximation characteristic of human cognitive reasoning and is an effective mathematical tool for characterizing the degree of inconsistency in classification data. It has been applied successfully to knowledge discovery from symbolic data, but the widespread complex classification problems in which symbolic, numerical, and fuzzy variables coexist have not yet been studied systematically. This thesis proposes that six consistency assumptions underlie human decision making. Based on the ideas of granulation and approximation in the rough-computation methodology, mathematical models of these consistency assumptions are established and a general form is given. The work explores the following aspects.
     First, a neighborhood rough computation model and algorithms for multi-granularity classification learning in metric spaces are proposed. The δ-neighborhoods of points in a metric space form a granulation structure of the universe. Based on neighborhood granulation, a neighborhood rough set model for metric spaces is established, yielding a rough computation model of classification consistency in metric spaces. The neighborhood size can be regarded as the granularity of the analysis; varying it yields a multi-granularity tool for analyzing the classification consistency of heterogeneous data. Based on the neighborhood rough set model, a boundary-sample selection algorithm and an attribute reduction algorithm for heterogeneous data are designed.
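The core of the neighborhood model can be sketched as follows; this is an illustrative reconstruction, not the thesis's implementation, and the Euclidean metric and the function name are assumptions. A sample lies in the lower approximation (positive region) of its class when its δ-neighborhood is class-pure, and the dependency is the fraction of such samples:

```python
import numpy as np

def neighborhood_dependency(X, y, delta):
    """Fraction of samples whose delta-neighborhood is class-pure,
    i.e. the size of the positive region of a neighborhood rough set."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(X)
    consistent = 0
    for i in range(n):
        # Euclidean delta-neighborhood of sample i (includes i itself)
        dist = np.linalg.norm(X - X[i], axis=1)
        neigh = dist <= delta
        # i belongs to the lower approximation of its own class
        # iff every neighbor carries the same label
        if np.all(y[neigh] == y[i]):
            consistent += 1
    return consistent / n

# Two well-separated classes: small neighborhoods are pure
X = [[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]]
y = [0, 0, 0, 1, 1, 1]
print(neighborhood_dependency(X, y, 0.3))  # -> 1.0
print(neighborhood_dependency(X, y, 1.0))  # -> 0.0 (neighborhoods mix classes)
```

Shrinking or growing δ, as in the second printout, is exactly the multi-granularity analysis the paragraph describes: the same data looks consistent at a fine granularity and inconsistent at a coarse one.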
     Second, a kernelized fuzzy rough computation model and algorithms for classification analysis of heterogeneous data are proposed. Current research on fuzzy rough sets concentrates on constructing fuzzy approximation operators and neglects the analysis of fuzzy granulation structures. It is found that the kernel matrices computed by a large class of kernel functions satisfy the properties of fuzzy equivalence relations, so these kernels can be introduced to build fuzzy granulation structures for fuzzy rough computation. This thesis proposes a kernelized fuzzy rough set model based on kernel granulation and establishes a mathematical model for analyzing the fuzzy consistency of classification. A kernel-approximation-based significance measure for heterogeneous attributes is designed, the connection between the fuzzy dependency function and the feature evaluation algorithm ReliefF is explored, and a noise-resistant attribute reduction algorithm and a sample-weighted resampling method for large sample sets are proposed.
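A minimal sketch of kernel granulation, under the assumption (not taken from the thesis) that a Gaussian kernel supplies the fuzzy similarity relation and that crisp decision classes are approximated with the standard min/complement operators; the function name and σ default are illustrative:

```python
import numpy as np

def kernel_fuzzy_dependency(X, y, sigma=0.5):
    """Fuzzy dependency of the decision on the features, with a Gaussian
    kernel supplying the fuzzy (T-equivalence) similarity relation."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # Kernel matrix doubles as the fuzzy relation: R[i, j] in (0, 1]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    R = np.exp(-sq / sigma ** 2)
    # Lower-approximation membership of each sample to its own class:
    # infimum over out-of-class samples of (1 - similarity)
    n = len(X)
    member = np.empty(n)
    for i in range(n):
        other = y != y[i]
        member[i] = np.min(1.0 - R[i, other]) if other.any() else 1.0
    # Fuzzy positive region divided by |U|
    return member.mean()

# Well-separated classes give a dependency close to 1
print(kernel_fuzzy_dependency([[0.0], [0.1], [1.0], [1.1]], [0, 0, 1, 1]))
```

Attribute reduction in this setting would keep a feature only if dropping it (removing a column of X) lowers this dependency, which is the role of the significance measure mentioned above.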
     Third, a fuzzy preference rough analysis model for ordered decision problems described by heterogeneous data is proposed. Ordinal classification is a large class of learning tasks and plays an important role in multi-criteria decision analysis. This thesis introduces the fuzzy preference relations widely used in multi-criteria decision analysis and combines them with the generalized fuzzy rough set model, thereby establishing a fuzzy rough computation model for analyzing the ranking consistency of heterogeneous data.
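The following sketch shows one common way to build a fuzzy preference relation (a logistic function of the criterion difference) and a ranking-consistency score in its spirit; the logistic form, the slope k, and both function names are assumptions for illustration, not the thesis's exact definitions:

```python
import numpy as np

def fuzzy_preference_relation(values, k=4.0):
    """Logistic fuzzy preference degree r[i, j]: the degree to which
    sample i is preferred to (ranked above) sample j on one criterion."""
    v = np.asarray(values, dtype=float)
    return 1.0 / (1.0 + np.exp(-k * (v[:, None] - v[None, :])))

def ordinal_consistency(values, labels, k=4.0):
    """Mean lower-approximation membership of each sample to the set of
    samples it should dominate under the fuzzy preference relation."""
    r = fuzzy_preference_relation(values, k)
    y = np.asarray(labels)
    n = len(y)
    member = np.ones(n)
    for i in range(n):
        worse = y < y[i]  # samples that i should be preferred to
        if worse.any():
            member[i] = np.min(r[i, worse])
    return member.mean()
```

On data where the criterion increases with the decision class the score is near 1; reversing the criterion drives it down, which is the ranking inconsistency the model is meant to detect.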
     Fourth, a general form of this family of rough computation models is given, unifying Pawlak rough sets, neighborhood rough sets, kernel-granulated rough sets, and fuzzy preference rough sets, and thus establishing a unified view of rough data analysis. Based on the general model, a unified measure of the uncertainty of the various approximation spaces is proposed; the analysis shows that the degree of uncertainty of each of these approximation spaces can be characterized by this one information function. This yields a general information measure theory for general classification problems and ordered decision problems described by heterogeneous data.
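One way such a unified measure can work, sketched under the assumption that the information function is an entropy over granule sizes (this particular formula is illustrative, not the thesis's): every model above reduces to a relation matrix R with entries in [0, 1], each row of R is one sample's granule, and the entropy of the relative granule cardinalities measures the approximation space's uncertainty:

```python
import numpy as np

def granule_entropy(R):
    """Entropy of an approximation space given its relation matrix R
    (R[i, j] in [0, 1]).  Each row is a (possibly fuzzy) granule; its
    fuzzy cardinality is the row sum."""
    R = np.asarray(R, dtype=float)
    n = len(R)
    card = R.sum(axis=1)  # fuzzy cardinality of each granule
    return -np.mean(np.log2(card / n))

# Crisp two-block partition {a, b}, {c, d}: 1 bit, as Shannon entropy gives
blocks = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
print(granule_entropy(blocks))      # -> 1.0
print(granule_entropy(np.eye(4)))   # finest partition -> 2.0
```

The appeal of this shape of measure is that a crisp equivalence relation, a δ-neighborhood relation, and a kernel matrix all plug into the same formula, which is what a unified information theory over these models requires.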
     Fifth, the parameter and sample stability of various rough-set attribute evaluation measures is analyzed. The study shows that Shannon entropy and fuzzy information entropy are very stable evaluation measures, with small sample perturbations leaving the resulting attribute reducts unchanged, whereas neighborhood dependency and neighborhood consistency are unstable functions whose evaluations are easily disturbed by sample perturbation.
     Sixth, an algorithm platform for heterogeneous data reduction is designed, the performance of the algorithms on real classification tasks is tested, and a selective ensemble method is proposed to exploit the complementary information in multiple reducts. Some decision systems yield a set of reducts, each of which preserves the classification consistency of the original data and offers one view for understanding it. Drawing on results in selective multiple-classifier ensembles, this thesis proposes selectively combining classifiers trained on subsets of the reducts into a multiple-classifier system, with forward greedy selection and post-pruning strategies for choosing the classifiers. Experiments show that the method obtains a compact multiple-classifier system with strong classification performance.
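The forward greedy half of that selection strategy can be sketched as below (post-pruning is omitted); this is a generic reconstruction under assumed names, not the thesis's code. Each row of `predictions` holds one reduct-trained classifier's labels on a validation set, and classifiers are added while majority-vote accuracy improves:

```python
import numpy as np

def greedy_select(predictions, y_val):
    """Forward greedy classifier selection: starting from the empty
    ensemble, repeatedly add the classifier that most improves
    majority-vote accuracy on a validation set; stop when nothing helps."""
    preds = np.asarray(predictions)  # shape (n_classifiers, n_samples)
    y = np.asarray(y_val)
    chosen = []
    best_acc = 0.0
    while True:
        gains = []
        for c in range(len(preds)):
            if c in chosen:
                gains.append(-1.0)  # never re-add a chosen classifier
                continue
            votes = preds[chosen + [c]]
            # Majority vote of the tentative ensemble on each sample
            maj = np.apply_along_axis(
                lambda col: np.bincount(col).argmax(), 0, votes)
            gains.append(np.mean(maj == y))
        c = int(np.argmax(gains))
        if chosen and gains[c] <= best_acc:
            return chosen  # no candidate improves the ensemble
        best_acc = gains[c]
        chosen.append(c)
```

Because selection stops as soon as accuracy plateaus, the resulting ensemble stays small, matching the "compact yet strong" outcome reported above.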
     In summary, this work establishes rough computation models for hybrid decision systems in which symbolic and numerical data coexist. Based on the neighborhood rough set model and the kernel-granulated fuzzy rough set model, a unified computation model for general classification with heterogeneous data is established; the fuzzy preference rough model then provides a rough computation model for ordinal classification with heterogeneous data. Finally, the generalized rough model unifies the models for general and ordinal classification, and a unified information measure theory is proposed for the various rough computation models, forming a rough computation theory for a broad class of decision problems.
Machine learning and knowledge discovery are among the most important issues to be addressed in artificial intelligence, and uncertainty and inconsistency are the key problems in knowledge discovery from complex data. Rough set theory, which simulates the capability of granulation and approximation in human cognition, has proven to be an effective mathematical tool for characterizing inconsistency in classification data. The theory has been applied successfully to knowledge discovery from symbolic data; however, most data sets in real-world applications are numerical, fuzzy, or a mixture, and not much work has so far been devoted to knowledge discovery from heterogeneous data with rough sets. This work proposes that there are six types of consistency in human reasoning, builds mathematical models of these types of consistency based on granulation and approximation in rough sets, and develops uniform models and algorithms for knowledge discovery from heterogeneous data. The main contributions of the work are as follows.
     First, a neighborhood rough set model and algorithms for general metric spaces are constructed. Objects described by numerical attributes can be regarded as points in a metric space, and the neighborhoods of these points form a granulation structure of the universe. Based on neighborhood granulation, a rough set model is developed for classification analysis in metric spaces, providing a framework for analyzing the consistency of classification with numerical or symbolic features. If the neighborhood size is regarded as the granularity of the analysis, varying it yields a multi-granularity data analysis tool. Algorithms for sample and feature reduction are constructed on top of the neighborhood model.
     Second, a kernelized fuzzy rough set model is developed for rough computation with heterogeneous data. Current research on fuzzy rough sets focuses on constructing fuzzy rough approximation operators and pays little attention to fuzzy granulation. It is found that a class of kernel functions computes fuzzy T-equivalence relations between samples, so these kernels can be used to build fuzzy granular structures for fuzzy rough sets. Based on this observation, a kernelized fuzzy rough set model is proposed for analyzing consistency in the fuzzy case. The connection between fuzzy dependency and ReliefF is shown; the idea behind ReliefF is introduced to reduce the influence of noise in fuzzy-rough-set-based attribute reduction, and a generalized classification certainty measure is constructed.
     Third, a rough set model for fuzzy preference analysis is developed. Ordered classification is an important class of learning tasks in decision modeling and multi-criteria analysis. Fuzzy preference relations, which are widely used in multi-criteria analysis, are introduced and combined with the general fuzzy rough set model; a fuzzy preference rough set model is thus proposed, and algorithms for dependency analysis and attribute reduction are developed.
     Fourth, a general fuzzy rough set model is discussed that gives a uniform definition of lower and upper approximations for all of these rough set models, introducing a unified viewpoint for theoretical analysis and algorithm design. Based on the general model, a uniform information measure is designed for Pawlak rough sets, neighborhood rough sets, fuzzy rough sets, and fuzzy preference rough sets.
     Fifth, the stability of the attribute evaluation functions and attribute reduction algorithms proposed in this work is evaluated. It is found that Shannon entropy and fuzzy entropy are more robust than dependency and consistency, while neighborhood consistency and neighborhood dependency are the least stable.
     Sixth, a system is developed for rough-set-based knowledge discovery from heterogeneous data, and systematic comparative experiments validate the effectiveness of the proposed techniques. Moreover, a multiple-classifier system is designed by selectively combining a set of classifiers trained on rough-set-based reducts. In most cases a decision system yields a set of reducts rather than a single one; each reduct is one viewpoint on the classification task, and the information in different reducts is distinct and complementary. Building on theoretical results on classifier ensembles, a selective ensemble algorithm is developed using a strategy of forward greedy selection and post-pruning. Experiments show that the proposed algorithm obtains a compact and effective classification system.
     This work develops a uniform rough set model for the analysis of mixed symbolic and numerical data. Based on neighborhood rough sets and kernelized fuzzy rough sets, a uniform model is developed for classification learning from heterogeneous data; fuzzy preference rough sets then provide a uniform model for preference learning with heterogeneous data. Finally, a general rough set model and an information measure model are constructed for both classification and preference learning.
引文
1 E. Mjolsness, D. DeCoste. Machine learning for science: state of the art and future prospects. Science. 2001, 293(14): 2051~2055
    2 A. Asuncion, D. J. Newman. UCI Machine Learning Repository [http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA: University of California, School of information and computer science. 2007
    3 S. K. M. Wong, L. S. Wang, Y. Y. Yao. On modeling uncertainty with interval structures, Computational intelligence. 1995, 11(2): 406~426
    4人工智能:回顾与展望.涂序彦,韩力群编.科学出版社. 2006,北京
    5 R. O. Duda, P. E.Hart, D. G. Stork. Pattern classification. Wiley-Interscience, 2000
    6 A. L. Blum, P. Langley. Selection of relevant features and examples in machine learning. Artificial intelligence. 1997, 97(1-2) 245~271
    7 H. Stoppiglia, G. Dreyfus, R. Dubois, Y. Oussar. Ranking a random feature for variable and feature selection. Jounral of machine learning researches. 2003, 3: 1399~1414
    8 R. Caruana, D. Freitag, Greedy attribute selection. Proc. Int’l Conf. Machine learning. 1994: 28~36
    9 D. B. Skalak. Prototype and feature selection by sampling and random mutation hill climbing algorithms. In machine learning: proceedings of the eleventh international conference. Morgan Kaufmann, 1994
    10 P. Narendra, K. Fukunaga. Branch and bound algorithm for feature subset selection. IEEE Transactions on computers. 1977, 26 (9): 917~922
    11 P. Pudil, J. Novovicova, J. Kittler. Floating search methods in feature-selection, Pattern recognition letters. 1994, 15 (11): 1119~1125
    12 H. Liu and R. Setiono. A probabilistic approach to feature selection: A filter solution. In Machine Learning: Proceedings of the thirteenth international conference on machine learning. Morgan Kaufmann, 1996
    13 J. H. Yang, V. Honavar. Feature subset selection using a genetic algorithm, IEEE Intelligent systems & their applications. 1998, 13(2): 44~49
    14 H. B. Zhang, G. Y. Sun. Feature selection using tabu search method. Patternrecognition. 2002, 35(3): 701~711
    15 X. Y. Wang, J. Yang, X. L. Teng, et al. Feature selection based on rough sets and particle swarm optimization. Pattern recognition letters. 2007, 28(4): 459~471
    16 I. Guyon, A. Elisseeff. An introduction to variable and feature selection. Journal of machine learning research. 2003, 3: 1157~1182
    17 R. Kohavi and G. John. Wrappers for feature selection. Artificial intelligence. 1997, 97(1-2): 273~324
    18 T. K. Ho, M Basu. Complexity Measures of supervised classification problems. IEEE transactions on pattern analysis and machine intelligence. 2002, 24(3): 289~300
    19 R. Thawonmas, S. Abe. A novel approach to feature selection based on analysis of class regions. IEEE transactions on systems man and cybernetics Part B-cybernetics. 1997, 27( 2): 196~207
    20 S. Abe, R. Thawonmas, Y. Kobayashi. Feature selection by analyzing class regions approximated by ellipsoids. IEEE transactions on systems, man, and cybernetics—part c: applications and reviews. 1998, 28(2): 282~287
    21 K. Kira and L. A. Rendell. A practical approach to feature selection. In Machine learning: proceedings of the ninth international conference, 1992.
    22 M. Robnik-Sikonja, I. Kononenko. Theoretical and empirical analysis of ReliefF and RReliefF, Machine learning. 2003, 53(1-2): 23~69
    23 I. Guyon, J. Weston, S. Barnhill, et al. Gene selection for cancer classification using support vector machines, Machine learning. 2002, 46(1-3): 389~422
    24 M. A. Hall. Correlation-based feature selection for discrete and numeric class machine learning, Proc. 17th Int’l Conf. Machine Learning, 2000: 359~366
    25 R. Battiti. Using mutual information for selecting features in supervised neural-net learning, IEEE transactions on neural networks. 1994, 5 (4): 537~550
    26 H. Wang, D. Bell, F. Murtagh. Axiomatic Approach to Feature Subset Selection Based on Relevance. IEEE transactions on pattern analysis and machine intelligence. 1999, 21(3): 271~277
    27 L. Yu, H. Liu. Efficient feature selection via analysis of relevance andredundancy. Journal of machine learning research. 2004, 5: 1205~1224
    28 N. Kwak, C.-H. Choi. Input feature selection by mutual information based on Parzen window. IEEE transactions on pattern analysis and machine intelligence. 2002, 24(12): 1667~1671
    29 M. Hall. Correlation based feature selection for machine learning. PhD thesis, University of Waikato, Dept. of computer science, 1999
    30 K. Torkkola. Feature Extraction by non-parametric mutual information maximization. Journal of machine learning research. 2003, 3: 1415~1438
    31 H. Peng, F. Long, C. Ding. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE transactions on pattern analysis and machine intelligence. 2005, 27(8): 1226~1238
    32 X. Hu, Nick Cercone. Learning in Relational Databases: A Rough Set Approach. Computational intelligence.1995, 11 (2) 323~338
    33 R. W. Swiniarski, A. Skowron. Rough set methods in feature selection and recognition. Pattern recognition letters. 2003, 24(6): 833~849
    34 M. Dash, H. Liu Consistency based search in feature selection. Artificial intelligence. 2003, 151 (1-2): 155~176
    35 R. Setiono and H. Liu. Chi2: Feature selection and discretization of numeric attributes. In Proceedings of the Seventh IEEE international conference on tools with artificial intelligence, 1995
    36 G. A. Babich, O. I. Camps. Weighted Parzen windows for pattern classification. IEEE transactions on pattern analysis and machine intelligence.1996, 18(5): 567~570
    37 D. Randall Wilson, Tony R. Martinez. Improved Heterogeneous Distance Functions. Journal of artificial intelligence research.1997, 6: 1~34
    38 H. Wang. Nearest neighbors by neighborhood counting. IEEE transactions on pattern analysis and machine intelligence. 2006, 28 (6): 942~953
    39 W. Pedrycz, G. Vukovich. Feature analysis through information granulation and fuzzy sets. Pattern recognition. 2002, 35(4): 825~834
    40 S. J. Raudys, A. K. Jain. Small sample size effects in statistical pattern recognition: recommendations for practitioners. IEEE transactions on pattern analysis and machine intelligence. 1991, 13(3): 252~264
    41 A. Jain, D. Zongker. Feature selection: Evaluation, application, and small sample performance. IEEE transactions on pattern analysis and machine intelligence. 1997, 19(2): 153~158
    42 J. Reunanen. Overfitting in making comparisons between variable selection methods. Journal of machine learning research. 2003, 3: 1371~1382
    43 M. Sebban , R. Nock. A hybrid filter/wrapper approach of feature selection using information theory. Pattern recognition. 2002, 35 (4): 835~84
    44 Z. Zhu, Y.-S. Ong, M. Dash. Wrapper/filter feature selection algorithm using a memetic framework. IEEE transactions on systems, man, and cybernetics-part B: cybernetics. 2007, 37(1): 70~76
    45 ?. Uncu, I.B. Türk?en. A novel feature selection approach: Combining feature wrappers and filters. Information sciences. 2007, 177: 449~466
    46 R. B. Bhatt, M. Gopal. On fuzzy-rough sets approach to feature selection. Pattern recognition letters. 2005, 26(7): 965~975
    47 R. Jensen. Q. Shen. Fuzzy-rough sets assisted attribute selection. IEEE transactions on fuzzy systems. 2007, 15 (1): 73~89
    48 W. S. McCulloch; W. Pitts. A logical calculus of the ideas immanent in nervous activity. Bulletion of mathetical biophysic. 1943, 5(1): 115~133
    49 F. Rosenblatt. The perceptron: a perceiving and recognizing automaton. Project PARA, Technical Report 85-460-1, Cornell Aeronautical Laboratory. 1957
    50 R. Frank, The perceptron: a probabilistic model for information storage and organization in the brain, Cornell Aeronautical Laboratory, Psychological Review, 1958,65, (6): 386~408
    51 F. Rosenblatt. On the convergence of reinforecement procedures in simple perceptrons. Cornell Aeronautical Laboratory Report VG-1196-G-4, 1960
    52 D. S. Broomhead, D. Lowe. Multivariable functional interpolation and adaptive networks. Complex system. 1988 (2): 321~355
    53 V. Vapnik.统计学习论理的本质.张学工译.北京:清华大学出版社, 2000.
    54 E. B. Hunt, J. Marin, P. T. Stone. Experiments in induction. Academ in Press, 1966
    55 R. S. Michalski, J. Larson, Selection of most representative training examples and incremental generation of VL1 Hypotheses: The underlyingmethodology and the description of programs ESEL and AQ11,”Report no. 867, department of computer science, University of Illinois, Urbana, May 1978
    56 L. Breiman, J. H. Friedman, R. A. Olshen, C. J. Stone. 1984. Classification and regression trees. Monterey, Calif., U.S.A.: Wadsworth, Inc.
    57 L. Breiman, et al., Classification and regression trees, Chapman and Hall, Boca Raton, 1993
    58 J. R. Quinlan. Induction of decision trees. Machine learning. 1986, 1(1): 81~106
    59 J. R. Quinlan. C4.5: Programs for Machine learning. Morgan Kaufmann Publishers Inc, 1993
    60 J. Catlett. On changing continuous attributes into ordered discrete attributes, Proc. European working session on learning, 1991, 164~178
    61 J. Y. Ching, Class-dependent discretization of continuous attributes for inductive learning, Master Thesis, University of Waterloo, Canada, 1992
    62 U. M. Fayyad, K.B. Irani. On the handling of continuous-valued attributes in decision tree generation, Machine learning. 1992, 8: 87~102
    63 U. Fayyad, K.B. Irani. Multi-Interval discretization of continuos attributes as preprocessing for classification learning. The 13th international joint conference on artificial intelligence, Morgan Kaufmann, pp. 1022~1027
    64 H. Liu, F. Hussianm, C. L. Tan, M. Dash. Discretization: an enabling technique. Data mining and knowledge discovery, 2002, 6 (4): 393~423
    65 J. Ching, A. Wong, K. Chan. Class-dependent discretization for inductive learning from continuous and mixed-mode data, IEEE transactions on pattern analysis and machine inthlligence, 1995, 17(7): 641~651
    66 M. R. Chmielewski, J. W. Grzymala-Busse, 1994. Global discretization of continuous attributes as preprocessing for machine learning. In Third international workshop on rough sets and soft computing, pp. 294~301
    67 M. R. Chmielewski, J. W. Grzymala-Busse. Global discretization of continuous attributes as preprocessing for machine learning. International journal of approximate reasoning. 1996,15(4): 319~331
    68 H. Liu, R. Setiono. Feature selection via discretization. IEEE transactios on knowledge and data engineering. 1997, 9 (4): 642~645
    69 J. Catlett. On changing continuous attributes into ordered discrete attributes. In proc. fifth European working session on learning. Berlin: Springer-Verlag 1991: 164~177
    70谢宏,程浩忠,牛东晓.基于信息熵的粗糙集连续属性离散化算法.计算机学报. 2005, 28(9): 1570~1574
    71 R. Kerber, Chimerge: Discretization of numeric attributes. In Proc. AAAI92, Ninth national confrerence articial intelligence. AAAI Press/ The MIT Press,1992: 123~128
    72苗夺谦. Rough Set理论中连续属性的离散化方法,自动化学报. 2001, 27 (3): 296~302
    73石红,沈毅,刘志言.电机与控制学报.一种改进的连续属性全局离散化算法. 2004, 8 (3): 268~270
    74 M. A. Hall. Correlation-based feature selection for discrete and numeric class machine learning, In Proc. 17th ICML, 2000: 359~366.
    75 Il-Seok Oh, Jin-Seon Lee, Byung-Ro Moon. Hybrid genetic algorithms for feature selection. IEEE transactions on pattern analysis and machine intelligence. 2004, 26(11): 1424~1437
    76 Z. H. Zhou,C. Z. Qian. Hybrid decision tree. Knowledge based systems. 2002, 15 (8): 515~528
    77 W. Y. Tang, K. Z. Mao. Feature selection algorithm for mixed data with both nominal and continuous features. Pattern recognition letters. 2007, 28 (5): 563~571
    78 D. R. Wilson, T. R. Martinez. Improved heterogeneous distance functions. Journal of artificial intelligence research. 1997, 6: 1~34
    79 Z. Pawlak, 1991. Rough sets–theoretical aspects of reasoning about data. Kluwer Academic, Dordrecht
    80 Z. Pawlak, C. Rauszer, Dependency of attributes in information systems, Bull. Polish Acad. Sci. Math. 1985, 33: 551~559
    81 J. Wang, D. Q. Miao. Analysis on attribute reduction strategies of Rough set. Journal of computer science and technology. 1998, 13(2): 189~192
    82 S. Q. Han, J. Wang. Reduct and attribute order. Journal of computer science and technology. 2004, 19 (4): 429~449
    83 J. Wang, J. Wang. Reduction algorithms based on discernibility matrix: Theordered attributes method. Journal of computer science and technology. 2001, 16 (6): 489~504
    84 G. Y. Wang, J. Zhao, J. J. An, et al. A comparative study of algebra viewpoint and information viewpoint in attribute reduction. Fundamenta informaticae. 2005, 68 (3): 289~301
    85 Y. Leung, D. Y. Li. Maximal consistent block technique for rule acquisition in incomplete information systems. Information sciences, 2003, 153: 85~106
    86 Z. Zheng, G.Y. Wang. RRIA: A rough set and rule tree based incremental knowledge acquisition algorithm. Fundamenta informaticae. 2004, 59 (2-3): 299~313
    87 Y. Y. Yao. Information granulation and rough set approximation. International journal of intelligent systems. 2001, 16 (1): 87~104
    88 T. Y. Lin. Granular computing-Structures, representations, and applications. Lecture notes in computer science. 2003, 2639: 16~24
    89 L. Zadeh. Toward a theory of fuzzy information granulation and its centrality in human reasoning and fuzzy logic. Fuzzy sets and systems. 1997, 19(2): 111~127
    90 L. Zadeh. Toward a generalized theory of uncertainty-an outline. Information science. 2005, 172: 1~40
    91 Y. Y. Yao, S. K. M. Wong, P. Lingras, A decision-theoretic rough set model, Methodologies for Intelligent Systems, Proceedings of the 5th International Symposium on Methodologies for Intelligent Systems, Knoxville, Tennessee, USA, October 25-27, 1990, Ras, Z.W., Zemankova., M., and Emrichm M.L. (Eds.), North-Holland, pp. 17~25
    92 Y. Y. Yao, S. K. M. Wong. A decision theoretic framework for approximating concepts, International Journal of man-machine studies. 1992, 37 (6): 793~809
    93 W. Ziarko. Variable precision rough sets model. Journal of computer and system sciences.1993, 46 (1): 39~59
    94 S. K. M. Wong, W. Ziarko. Comparison of the probabilistic Approximate Classification and the fuzzy set Model, Fuzzy sets and systems. 1987, 21 (3): 357~362
    95 Y. Y. Yao. Probabilistic approaches to rough sets, Expert systems. 2003,20(5): 287~297
    96 W. Ziarko. Decision making with probabilistic decision tables. N. Zhong, A. Skowron, S. Ohsuga (Eds.): RSFDGrC’99, LNAI, 1999,1711: 463~471
    97 D. Slezak. Rough sets and Bayes factor, in: J. Peters, A. Skowron (Eds.), LNCS Transactions on Rough Sets III, Springer, Heidelberg, LNCS, 2005, 3400: 202~229
    98 D. Slezak, W. Ziarko. The investigation of the Bayesian rough set model. International journal of approximate reasoning. 2005, 40: 81~91
    99 D. Dubois, H. Prade. Rough fuzzy sets and fuzzy rough sets. International journal of general systems. 1990, 17(2–3): 191~209
    100 N. N. Morsi, M. M. Yakout, Axiomatics for fuzzy rough set, Fuzzy sets system. 1998, 100: 327~342
    101 J.-S. Mi, W.-X. Zhang. An axiomatic characterization of a fuzzy generalization of rough sets. Information sciences. 2004, 160: 235~249
    102 T.-J. Li et al., Generalized fuzzy rough approximation operators based on fuzzy coverings, International journal of approximate reasoning. 2008, 43(3): 836~856
    103 M. Kryszkiewicz. Rough set approach to incomplete information systems, Information sciences. 1998, 112(1-4): 39~49
    104 M. Kryszkiewicz. Rules in incomplete information systems, Information sciences. 1999, 113 (3-4): 271~292
    105 R. Slowinski, D. Vanderpooten. A generalized definition of rough approximations based on similarity. IEEE transactions on knowledge and data engineering. 2000, 12(2): 331~336
    106 J. Y. Liang, Z. B. Xu. The algorithm on knowledge reduction in incomplete information systems. International journal of uncertainty fuzziness and knowledge-based systems. 2002, 10(1): 95~103
    107 T. Y. Lin, Q. Liu, K J Huang. Rough sets neighborhood systems and approximation. In fifth international symposium on methodologies of intelligent systems. Selected papers 1990
    108 Y. Y. Yao. Relational interpretations of neighborhood operators and rough set approximation operators. Information sciences, 1998,111: 239~259
    109 W.–Z. Wu, W.–X. Zhang. Neighborhood operator systems andapproximations. Inf. sci. 2002, 144(1-4): 201~217
    110 Y. Y. Yao: Neighborhood systems and approximate retrieval. Inf. sci. 2006, 176(23): 3431~3452
    111 S. Greco, B. Matarazzo, R. Slowinski. Rough approximation of a preference relation by dominance relations. European journal of operational research. 1999, 117 (1): 63~83
    112 R. Slowinski, S. Greco, B. Matarazzo, Axiomatization of utility, outranking and decision-rule preference models for multiple-criteria classification problems under partial inconsistency with the dominance principle, Control & cybernetics. 2002, 31 (4): 1005~1035
    113 S. Greco, B. Matarazzo, R. Slowinski. Rough sets methodology for sorting problems in presence of multiple attributes and criteria. European Journal of operational research. 2002,138: 247~259
    114 W. Zhu, F.-Y. Wang. Reduction and axiomization of covering generalized rough sets. Inf. Sci. 2003, 152: 217~230
    115 W. Zhu, F.-Y. Wang. On three types of covering-based rough sets. IEEE trans. knowl. data eng. 2007, 19(8): 1131~1144
    116 C. Wang, C Wu, D. Chen. A systematic study on attribute reduction with rough sets based on general binary relations. Inf. Sci. 2008, 178(9): 2237~2261
    117 J. W. Grzymala-Busse. LERS—a system for learning from examples based on rough sets. In: R. Slowinski(ed.), Intelligent decision support-Handbook of applications and advances of the rough sets theorey, 331~362. Dordrecht: Kluwer.
    118 J. W. Grzymala-Busse. Knowledge acquisition under uncertainty—a rough set approach. Journal of Intelligent and Robotic Systems, 1988, 1: 3~16
    119 A. Skowron, C. Rauszer. The discernibility matrices and functions in information systems. In: R. Slowinski(ed.), Intelligent decision support-Handbook of applications and advances of the rough sets theorey, 331~362. Dordrecht: Kluwer
    120 S. Q. Han, J. Wang. Reduct and attribute order. Journal of computer science and technology. 2004, 19 (4): 429~449
    121 M. Zhao, S. Q. Han, J. Wang. Tree expressions for information systems.Journal of computer science and technology. 2007, 22(2): 297~307
    122 X. H. Hu, N. Cercone. Learning in relational databases: a rough set approach. Computational intelligence. 1995, 11 (2): 323~338
    123 Jelonek J, Krawiec K, Slowinski R. Rough set reduction of attributes and their domains for neural networks. Computational intelligence. 1995, 11 (2): 339~347
    124 D. Q. Miao, J. Wang, Information-based algorithm for reduction of knowledge, Proceedings of the IEEE international conference on intelligent processing systems, ICIPS, 1998, 2: 1155~1158
    125 D. Q. Miao, J. Wang. On the relationships between information entropy and roughness of knowledge in rough set theory. Pattern recognition and artificial intelligence. 1998, 11(1): 34~40
    126王国胤,杨大春.基于条件信息熵的决策表约简.计算机学报, 2002, 25 (7): 759~766
    127 J. Y. Liang, K. S. Chin, C.Y. Dang, R. C. M. Yam. A new method for measuring uncertainty and fuzziness in rough set theory. International journal of general systems. 2002, 31(4): 331~342
    128 M. Las, A. Kandel, O. Maimon. Information-theoretic algorithm for feature selection. Pattern recognition letters. 2001, 22 (6-7): 799~811
    129刘少辉,盛秋戬,吴斌,史忠植,胡斐. Rough集高效算法的研究.计算机学报. 2003, 26 (5): 1~6
    130徐章艳,刘作鹏,杨炳儒,宋威.一个复杂度为max(O(| C| | U| ), O(| C|2 | U/ C| ) )的快速属性约简算法.计算机学报. 2006, 29: 391~399
    131 N. Zhong, J. Dong, and S. Ohsuga, Using Rough sets with heuristics for feature selection, J. intelligent information systems. 2001, 16(3): 199~214
    132 J. Wroblewski, Finding minimal reducts using genetic algorithms, Proc. second ann. joint conf. information sciences. 1995, 186~189
    133 X. Wang, J. Yang, X. Teng, W. Xia, R. Jense. Feature selection based on rough sets and particle swarm optimization. Pattern recognition letters. 2007, 28 (4): 459~471
    134 L. Ke, Z. Feng, Z. Ren. An efficient ant colony optimization approach to attribute reduction in rough set theory. Pattern recognition letters. 2008, 29(9): 1351~1357
    135 X. Hu, N. Cercone: Mining knowledge rules from databases: a rough set approach. ICDE 1996: 96~105
    136苗夺谦,王珏.基于粗糙集的多变量决策树构造方法,软件学报. 1997, 8 (6): 425~429
    137常梨云,王国胤,吴渝.一种基于Rough Set理论的属性约简及规则提取方法.软件学报.1999,10(11):1206~1211
    138 L. I. Kuncheva. Fuzzy rough sets: application to feature selection. Fuzzy sets and systems.1992, 51 (2): 147~ 153
    139 R. Jensen, Q Shen. Fuzzy–rough attribute reduction with application to web categorization. Fuzzy sets and systems. 2004,141(3): 469~485
    140 Q. Shen,R. Jensen. Selecting informative features with fuzzy-rough sets and its application for complex systems monitoring. Pattern recognition. 2004, 37 (7): 1351~1363
    141 R. Jensen, Q. Shen. Semantics-preserving dimensionality reduction: Rough and fuzzy-rough-based approaches. IEEE transactions of knowledge and data engineering. 2004, 16 (12): 1457~1471
    142 R. Jensen, Q. Shen: Fuzzy-rough data reduction with ant colony optimization. Fuzzy sets and systems. 2005,149(1): 5~20
    143 R. Jensen, Q. Shen. Fuzzy-rough sets assisted attribute selection. IEEE transactions on fuzzy systems. 2007,15 (1): 73~89
    144 R. B. Bhatt, M. Gopal. On fuzzy-rough sets approach to feature selection, Pattern recognition letters. 2005, 26(7): 965~975
    145 R. B. Bhatt, M. Gopal. On the extension of functional dependency degree from crisp to fuzzy partitions, Pattern recognition letters. 2006, 27(5): 487~491
    146 T. -P. Hong, T. -T. Wang, S. -L. Wang and B. -C. Chien. Learning a coverage set of maximally general fuzzy rules by rough sets. Expert systems with applications. 2000, 19(2): 97~103
    147 Q. Shen, A. Chouchoulas. A rough-fuzzy approach for generating classification rules, Pattern recognition. 2002, 35(11): 2425~2438
    148 Y.-F. Wang. Mining stock price using fuzzy rough set system. Expert systems with applications. 2003, 24(1): 13~23
    149 R. B. Bhatt, M. Gopal. FRCT: fuzzy-rough classification trees, Patternanalysis and applications. 2008, 11(1): 73~88
    150 A.Mrozek. Rough sets in computer implementation in rule based control in industrial processes. In: R. Slowinski(ed.), Intelligent decision support-Handbook of applications and advances of the rough sets theorey, 19~32. Dordrecht: Kluwer, 1992
    151 A. J. Szladow, W. P. Ziako. Knowledge based process control using rough sets. In: R. Slowinski(ed.), Intelligent decision support-Handbook of applications and advances of the rough sets theorey, 49~60. Dordrecht: Kluwer, 1992
    152黄金杰,李士勇,左兴权.一种T-S型粗糙模糊控制器的设计与仿真.系统仿真学报. 2004, 16 (3): 480~484
    153宋申民,陈兴林,段广仁.基于粗糙集理论与进化计算的时滞系统Smith控制.系统仿真学报. 2006, 18(8): 2247~2249
    154倪敬,项占琴,潘晓弘,吕福在.管捆成形电液系统自学习粗糙~模糊PID控制研究.机械工程学报. 2006, 42(10): 224~228
    155 S. Tsumoto. Automated extraction of medical expert system rules from clinical databases based on rough set theory. Information sciences. 1998, 112(1-4): 67~84
    156 S. Tsumoto. Automated extraction of hierarchical decision rules from clinical databases using rough set model. Expert systems with applications. 2003, 24 (2): 189~197
    157 S. Hirano, S. Tsumoto. Rough representation of a region of interest in medical images. International journal of approximate reasoning. 2005, 40 (1-2): 23~34
    158 X. Wang, J. Yang, R. Jensen, X. Liu. Rough set feature selection and rule induction for prediction of malignancy degree in brain glioma. Computer methods and programs in biomedicine. 2006, 83 (2): 147~156
    159 P. Srinivasan, M. E. Ruiz, D. H. Kraft, J. Chen. Vocabulary mining for information retrieval: rough sets and fuzzy sets. Information processing & management. 2001, 37(1): 15~38
    160 A. Chouchoulas; Q. Shen. Rough set-aided keyword reduction for text categorization. Applied artificial intelligence. 2001, 15(9): 843~873
    161 X. L. Wang, Q. C. Chen, D. S. Yeung. Mining pinyin-to-character conversionrules from large-scale corpus: A rough set approach. IEEE transactions on systems man and cybernetics part B-cybernetics. 2004, 34(2): 834~844
    162胡清华,谢宗霞,于达仁.基于粗糙集加权的文本分类方法研究.情报学报,2005,24(1): 59~63
    163曹长修,孙颖楷,曹龙汉,张邦礼.基于粗糙集理论的内燃机故障诊断专家系统.重庆大学学报(自然科学版), 2001, 24(4): 45~47
    164杜海峰,王孙安,丁国锋.基于粗糙集与模糊神经网络的多级压缩机诊断.西安交通大学学报, 2001, 35(9): 940~944
    165束洪春,孙向飞,司大军.基于粗糙集理论的配电网故障诊断研究.中国电机工程学报. 2001, 21(10): 73~77
    166束洪春,孙向飞,司大军.电力变压器故障诊断专家系统知识库建立和维护的粗糙集方法.中国电机工程学报. 2002, 22(2): 31~35
    167孙海军,蒋东翔,钱立军,战祥森.基于粗糙集理论的旋转机械故障诊断方法. 2004, 24(2): 73~77
    168周瑞,杨建国.基于粗糙集与支持向量机的发动机故障诊断研究. 2006, 24(4): 379~383
    169吕宗平,牛国臣,于咏生.基于不完备知识的飞机发动机故障诊断研究.中国民航学院学报. 2006, (4): 16~19
    170刘金福,于达仁,胡清华,王伟.基于加权粗糙集的代价敏感故障诊断方法.中国电机工程学报, 2007, 27(23): 93~99
    171 W. -Z. Wu, J.-S. Mi, W.-X. Zhang. Generalized fuzzy rough sets. Information sciences. 2003, 151: 263~282
    172 D. S. Yeung, D.-G. Chen, E. C. C. Tsang, J. W. T. Lee, X.-Z. Wang. On the generalization of fuzzy rough sets. IEEE transactions on fuzzy systems. 2005, 13(3): 343~361
    173 T.-Q. Deng, Y. M. Chen, W. L. Xu, Q. H. Dai. A novel approach to fuzzy rough sets based on a fuzzy covering. Information sciences. 2007, 177: 2308~2326
    174 X. H. Hu, N. Cercone. Data mining via discretization, generalization and rough set feature selection. Knowledge and information systems. 1999,1(1): 33~60
    175 Wang Jue, Tao Qing. Rough set theory and statistical learning theory. In: Lu Ruqian (ed.), Knowledge Science and Computing Science. Tsinghua University Press, 2003
    176 L. Zadeh. The concept of a linguistic variable and its applications to approximate reasoning. Part I: information sciences, 1975, 8: 199~249; Part II: information sciences, 1975, 8: 301~357; Part III: information sciences, 1975, 9: 43~80
    177 L. Zadeh. Fuzzy sets and information granularity. In: M. Gupta, R. Ragade, R. Yager (eds.), Advances in fuzzy set theory and applications. Amsterdam: North-Holland, 1979: 3~18
    178 L. Zadeh, Fuzzy logic=computing with words. IEEE transactions on fuzzy systems. 1996, 4: 103~111.
    179 Miao Duoqian, Wang Guoyin, Liu Qing, Lin Zaoyang, Yao Yiyu (eds.). Granular Computing: Past, Present and Prospects. Science Press, 2007
    180 Zhang Ling, Zhang Bo. Theory and Applications of Problem Solving: Quotient Space Based Granular Computing (2nd edition). Tsinghua University Press, 2007
    181 D. Chen, Q. He, X. Wang. On linear separability of data sets in feature space. Neurocomputing. 2007, 70(13-15): 2441~2448
    182 H. Liu, L. Yu. Towards integrating feature selection algorithms for classification and clustering. IEEE transactions on knowledge and data engineering. 2005, 17 (4): 491~502
    183 S. Singh. Multiresolution estimates of classification complexity. IEEE transactions on pattern analysis and machine intelligence. 2003, 25(12): 1534~1539
    184 X. Z. Wang, E. C. C. Tsang, S. Y. Zhao, et al. Learning fuzzy rules from fuzzy samples based on rough set technique. Information sciences. 2007, 177 (20): 4493~4514
    185 A. Hassanien. Fuzzy rough sets hybrid scheme for breast cancer detection. Image and vision computing. 2007, 25 (2): 172~183
    186 M. G. Genton. Classes of kernels for machine learning: a statistics perspective. Journal of machine learning research, 2001, 2: 299~312
    187 B. Moser. On the T-transitivity of kernels. Fuzzy sets and systems. 2006, 157 (13): 1787~1796
    188 B. Moser. On representing and generating kernels by fuzzy equivalence relations. Journal of machine learning research. 2006, 7: 2603~2620
    189 K. Kira, L. Rendell. A practical approach to feature selection. In: D. Sleeman, P. Edwards (eds.), Proceedings of the International Conference on Machine Learning (Aberdeen, July 1992). Morgan Kaufmann, 1992: 249~256
    190 I. Kononenko. Estimating attributes: analysis and extensions of RELIEF. Proceedings of the European conference on machine learning, Lecture notes in computer science. 1994, 784: 171~182
    191 P. M. Narendra, K. Fukunaga, A branch and bound algorithm for feature subset selection, IEEE trans. computers. 1977, 26(9): 917~922
    192 Y. Sun, J. Li, Iterative RELIEF for feature weighting, Proc. 23rd Int’l Conf. machine learning. 2006: 913~920
    193 Y. Sun. Iterative RELIEF for feature weighting: algorithms, theories, and applications. IEEE Transactions on pattern analysis and machine intelligence. 2007, 29(6): 1035~1064
    194 P. Somol, P. Pudil, J. Kittler. Fast branch & bound algorithms for optimal feature selection. IEEE transactions on pattern analysis and machine intelligence. 2004, 26(7): 900~912
    195 T. M. Cover, P. E. Hart, Nearest Neighbor Pattern Classification. IEEE transactions on information theory. 1967, 13 (1): 21~27
    196 J. M. Keller, M. R. Gray, J. A. Givens. A fuzzy k-nearest neighbor algorithm. IEEE transactions on systems man and cybernetics. 1985, 15 (4): 580~585
    197 R. R. Yager. Using fuzzy methods to model nearest neighbor rules. IEEE transactions on systems man and cybernetics part B-cybernetics. 2002, 32 (4): 512~525
    198 D. Wettschereck, D. W. Aha, T. Mohri. A review and empirical evaluation of feature weighting methods for a class of lazy learning algorithms. Artificial intelligence review. 1997, 11 (1-5): 273~314
    199 A. K. Ghosh. On nearest neighbor classification using adaptive choice of k. Journal of computational and graphical statistics. 2007, 16 (2): 482~502
    200 Y. Li, S. C. K. Shiu, S. K. Pal. Combining feature reduction and case selection in building CBR classifiers. IEEE transactions on knowledge and data engineering. 2006, 18 (3): 415~429
    201 R. F. Sproull. Refinements to nearest neighbor searching in k-dimensional trees. Algorithmica. 1991, 6: 579~589
    202 B. L. Narayan, C. A. Murthy, S. K. Pal. Maxdiff kd-trees for data condensation. Pattern recognition letters. 2006, 27 (3): 187~200
    203 A. W. Fu, P. M. Chan, Y. L. Cheung, et al. Dynamic vp-tree indexing for n-nearest neighbor search given pair-wise distances. VLDB Journal. 2000, 9(2): 154~173
    204 B. Zhang, S. N. Srihari. Fast k-Nearest Neighbor classification using cluster-based trees. IEEE transactions on pattern analysis and machine intelligence. 2004, 26(4): 525~528
    205 P. E. Hart. The condensed nearest neighbor rule. IEEE transactions on information theory. 1968, 14(3): 515~516
    206 G. W. Gates. The reduced nearest neighbor rule. IEEE transactions on information theory. 1972, 18(3): 431~433
    207 D. L. Wilson. Asymptotic properties of nearest neighbor rules using edited data. IEEE transactions on systems, man, and cybernetics, SMC. 1972, 2(3): 408~421
    208 D. W. Aha, D. Kibler, M. K. Albert. Instance-based learning algorithms. Machine learning. 1991,6: 37~66
    209 D. W. Aha. Tolerating noisy, irrelevant and novel attributes in instance-based learning algorithms. International journal of man-machine studies. 1992, 36 (2): 267~287
    210 D. R. Wilson, T. R. Martinez. Reduction techniques for instance-based learning algorithms. Machine learning. 2000, 38(3): 257~286
    211 Y. Li, S. C. K. Shiu, S. K. Pal. Combining feature reduction and case selection in building CBR classifiers. IEEE transactions on knowledge and data engineering. 2006, 18 (3): 415~429
    212 C. C. Holmes, N. M. Adams. A probabilistic nearest neighbour method for statistical pattern recognition. Journal of the Royal Statistical Society, Series B. 2002, 64(2): 295~306
    213 J. Blaszczynski, S. Greco, R. Slowinski. Multi-criteria classification: a new scheme for application of dominance-based decision rules. European journal of operational research. 2007, 181: 1030~1044
    214 D. Dubois, H. Fargier, H. Prade. Ordinal and Probabilistic Representations of Acceptance. Journal of artificial intelligence research. 2004, 22: 23~56
    215 Y. -M. Wang, C. Parkan. Optimal aggregation of fuzzy preference relations with an application to broadband internet service selection. European journal of operational research. 2008, 187(3): 1476~1486
    216 Z. S. Xu. Multiple-attribute group decision making with different formats of preference information on attributes. IEEE transactions on systems, man, and cybernetics-part B: cybernetics. 2007, 37 (6): 1500~1511
    217 D. Petkov, O. Petkova, T. Andrew, T. Nepal. Mixing multiple criteria decision making with soft systems thinking techniques for decision support in complex situations. Decision support systems. 2007, 43(4): 1615~1629
    218 I. Contreras, A. M. Mármol. A lexicographical compromise method for multiple criteria group decision problems with imprecise information. European journal of operational research. 2007, 181(3): 1530~1539
    219 Z. Pawlak. Rough set approach to multi-attribute decision analysis. European Journal of operational research.1994, 72 (3): 443~459
    220 S. Greco, B. Matarazzo, R. Slowinski. Rough set approach to multi-attribute choice and ranking problems. ICS Research Report 38/95, Warsaw University of Technology, Warsaw, 1995; also in: G. Fandel, T. Gal (eds.), Multiple Criteria Decision Making: Proceedings of the Twelfth International Conference, Hagen, Germany. Springer, Berlin, 1997: 318~329
    221 S. Greco, B. Matarazzo, R. Slowinski. Rough sets theory for multicriteria decision analysis. European journal of operational research. 2001, 129 (1): 1~47
    222 S. Greco, B. Matarazzo, R. Slowinski. Rough approximation by dominance relations. International journal of intelligent systems. 2002, 17 (2):153~171
    223 M.-W. Shao, W.-X. Zhang. Dominance relation and rules in an incomplete ordered information system, International journal of intelligent systems. 2005, 20 (1): 13~27
    224 J. W. T. Lee, D. S. Yeung, E. C.C. Tsang. Rough sets and ordinal reducts. Soft computing. 2006, 10 (1): 27~33
    225 Xu Weihua. Knowledge reduction in information systems based on dominance relations. In: Zhang Wenxiu, Yao Yiyu, Liang Yi (eds.), Rough Sets and Concept Lattices. Xi'an Jiaotong University Press, July 2006
    226 X. Yang, J. Y. Yang, et al. Dominance-based rough set approach and knowledge reductions in incomplete ordered information system. Information sciences. 2008, 178(4): 1219~1234
    227 Z.-P. Fan, J. Ma, Q. Zhang. An approach to multiple attribute decision making based on fuzzy preference information on alternatives. Fuzzy sets and systems. 2002, 131(1): 101~106
    228 E. Herrera-Viedma, F. Herrera, F. Chiclana, A consensus model for multiperson decision making with different preference structures, IEEE trans. systems man and cybernet-part A: systems and humans. 2002, 32 (3): 394~402
    229 J. Ma, Z.-P. Fan, Y.-P. Jiang, J.-Y. Mao, Louis Ma. A method for repairing the inconsistency of fuzzy preference relations. Fuzzy sets and systems. 2006,157(1): 20~33
    230 D. D. Wu. Performance evaluation: An integrated method using data envelopment analysis and fuzzy preference relations. European journal of operational research.(2007), doi:10.1016/j.ejor.2007.10.009
    231 E. Herrera-Viedma, F. Herrera, F. Chiclana, M. Luque. Some issues on consistency of fuzzy preference relations. European journal of operational research. 2004, 154 (1): 98~109
    232 E. P. Klement, R. Mesiar, E. Pap. Triangular norms. Kluwer Academic Publishers. 2001
    233 V. Torra, Y. Narukawa. A view of averaging aggregation operators. IEEE transactions on fuzzy systems. 2007,15 (6): 1063~1067
    234 Y. H. Qian, J. Y. Liang, C. Dang. Consistency measure, inclusion degree and fuzzy measure in decision tables. Fuzzy sets and systems. 2008, 159 (18): 2353~2377
    235 C. Shannon, W. Weaver. The mathematical theory of communication. University of Illinois Press, Champaign, IL, 1964
    236 C. Sima, S. Attoor, U. Brag-Neto, J. Lowey, E. Suh, et al. Impact of error estimation on feature selection. Pattern recognition. 2005, 38(12): 2472~2482
    237 M. Kudo, J. Sklansky. Comparison of algorithms that select features for pattern classifiers. Pattern recognition. 2000, 33(1): 25~41
    238 A. Kalousis, J. Prados, M. Hilario. Stability of feature selection algorithms: a study on high-dimensional spaces. Knowledge and information systems. 2007, 12(1): 95~116
    239 Q. Wang, Y. Shen, Y. Zhang, et al. A quantitative method for evaluating the performances of hyperspectral image fusion. IEEE transactions on instrumentation and measurement. 2003, 52(4): 1041~1047
    240 Q. Wang, Y. Shen, J. Q. Zhang. Nonlinear correlation measure for multivariable data set. Physica D-nonlinear phenomena. 2005, 200(3-4): 287~295
    241 S. Zhao, E. C. C. Tsang. On fuzzy approximation operators in attribute reduction with fuzzy rough sets. Information sciences. 2008, 178(16): 3163~3176
    242 C. Lee, D. A. Landgrebe. Feature Extraction Based on Decision Boundaries. IEEE Transactions on pattern analysis and machine intelligence. 1993, 15, (4): 388~400
    243 X.-Z. Wang, J.-H. Zhai, S.-X Lu. Induction of multiple fuzzy decision trees based on rough set technique. Information sciences. 2008, 178 (16): 3188~3202
    244 H. Liu, H. Motoda, Feature Selection for Knowledge Discovery and Data Mining. Boston: Kluwer Academic, 1998.
    245 J.-S. Mi, Y. Leung, H.-Y. Zhao, T. Feng. Generalized fuzzy rough sets determined by a triangular norm. Information sciences. 2008, 178(16): 3203~3213
    246 Y. H. Qian, J. Y. Liang, C. Y. Dang. Converse approximation and rule extraction from decision tables in rough set theory. Computers & mathematics with applications. 2008, 55(8): 1754~1765
    247 X. Liu, A. Krishnan, A. Mondry. An entropy-based gene selection method for cancer classification using microarray data. BMC bioinformatics. 2005, 6
    248 Song Guojie, Tang Shiwei, Yang Dongqing, Wang Tengjiao. A spatial feature selection method based on the maximum entropy principle. Journal of Software. 2003, 14(9): 1544~1552
    249 M. Last, A. Kandel, O. Maimon. Information-theoretic algorithm for feature selection. Pattern recognition letters. 2001, 22 (6-7): 799~811
    250 T. K. Ho. The random subspace method for constructing decision forests. IEEE transactions on pattern analysis and machine intelligence. 1998, 20(8): 832~844
    251 K. Tumer, N. C. Oza. Input decimated ensembles. Pattern analysis and applications. 2003, 6(1): 65~77
    252 S. Gunter, H. Bunke. Feature selection algorithms for the generation of multiple classifier systems and their application to handwritten word recognition. Pattern recognition letters. 2004, 25 (11): 1323~1336
    253 R. E. Schapire, The strength of weak learnability, Machine learning.1990, 5(2): 197~227
    254 L. Breiman, Bagging predictors, Machine learning.1996, 24 (2): 123~140
    255 L. Breiman. Random forests. Machine learning. 2001, 45 (1): 5~32
    256 J. Kittler, M. Hatef, R. P.W. Duin, J. Matas. On combining classifiers, IEEE trans. pattern anal. machine intell. 1998, 20(3): 226~239
    257 P. M. Granitto, P. F. Verdes, H. A. Ceccatto, Neural network ensembles: evaluation of aggregation algorithms. Artificial intelligence. 2005, 163: 139~162
    258 T. Windeatt. Diversity measures for multiple classifier system analysis and design, Information fusion. 2005, 6 (1): 21~36
    259 M. Aksela, J. Laaksonen, Using diversity of errors for selecting members of a committee classifier, Pattern Recognition. 2006, 39 (4): 608~623
    260 L. I. Kuncheva, C. J. Whitaker. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy, Mach. learn. 2003, 51 (2): 181~207
    261 Z. H. Zhou, J. X. Wu, W. Tang, Ensembling neural networks: many could be better than all. Artificial intelligence. 2002, 137 (1-2): 239~263
    262 H. W. Shin, S.Y. Sohn, Selected tree classifier combination based on both accuracy and error diversity, Pattern recognition. 2005, 38 (2):191~197
    263 Q. Wu, D. Bell, M. Mcginnity, Multiknowledge for decision making, Knowledge and information systems. 2005, 7 (2): 246~266
    264 Z. Suraj, N. Gayar, P. Delimata. A rough set approach to multiple classifier systems. Fundamenta informaticae. 2006, 72 (1-3): 393~406
    265 T.-J. Li, W.-X. Zhang. Rough fuzzy approximations on two universes of discourse. Information sciences. 2008, 178(3): 892~906
    266 Y. Zhao, Y. Y. Yao, F. Luo. Data analysis based on discernibility and indiscernibility. Information sciences, 2007, 177(22): 4959~4976
    267 G. L Ritter, H. B. Woodruff, S. R. Lowry, T. L. Isenhour. An algorithm for a selective nearest neighbor decision rule, IEEE transactions on information theory. 1975, 21(6): 665~669
    268 Zhang Wenxiu, Liang Yi, Wu Weizhi. Information Systems and Knowledge Discovery. Science Press, 2003
    269 T. Beaubouef, F. E. Petry, G. Arora. Information-theoretic measures of uncertainty for rough sets and rough relational databases. Information sciences. 1998, 109(1-4): 185~195
    270 I. Duntsch, G. Gediga. Uncertainty measures of rough set prediction, Artificial intelligence, 1998, 106 (1): 109~137
    271 J. S. Mi, Y. Leung, W. Z. Wu. An uncertainty measure in partition-based fuzzy rough sets. International journal of general systems. 2005, 34 (1): 77~90
    272 He Zhengjia, Zi Yanyang, Zhang Xining. Modern Signal Processing and Engineering Applications. Xi'an Jiaotong University Press, Xi'an, 2007
    273 J. Shawe-Taylor, N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004
    274 T. Golub. Molecular classification of cancer: class discovery and class prediction by gene expression monitoring. Science. 1999, 286, 531~537
    275 Y. Saeys, I. Inza, P. Larranaga. A review of feature selection techniques in bioinformatics. Bioinformatics. 2007, 23(19): 2507~2517
