Research on Selective Neural Network Ensemble Algorithms
Abstract

Ensemble learning has become one of the hot topics in machine learning in recent years, and selective ensemble has attracted growing attention owing to its advantages in adaptability, generalization, and combinability across many learning machines. This dissertation investigates selective neural network ensembles by means of theories and methods from related fields such as information theory and computing science. Several approaches are proposed for constructing high-performance selective neural network ensembles, and their working mechanisms, performance, parameter selection, and diversity are discussed in detail. The major contributions of this dissertation are as follows.
Firstly, the category of ensemble methods based on a global optimization strategy is further explored. Two powerful optimization tools, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO), are employed to construct selective ensembles, yielding a selective optimum neural network ensemble based on ACO and one based on discrete binary PSO (BPSO). In the BPSO-based approach, each candidate ensemble corresponds to a position in an n-dimensional 0-1 space, and constructing the optimum ensemble becomes a particle swarm optimization problem of searching for the best position in that discrete binary space. In the ACO-based approach, the pheromone reflects the accuracy of the individual networks while the heuristic information reflects their diversity, which effectively improves both search efficiency and prediction accuracy.
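To make the BPSO formulation concrete, the following is a minimal Python sketch of subset selection with discrete binary PSO, not the dissertation's implementation. It assumes the member networks are already trained and are represented only by their predictions on a held-out validation set; the names (`bpso_select`, `member_preds`) and parameter values are illustrative.

```python
import numpy as np

def bpso_select(member_preds, y_val, n_particles=20, n_iter=50,
                w=0.7, c1=1.5, c2=1.5, seed=0):
    """Select a subset of trained networks with discrete binary PSO.

    member_preds: (n_nets, n_samples) predictions of each trained
                  network on a held-out validation set (assumed given).
    y_val:        (n_samples,) validation targets.
    Each particle position is a 0-1 vector; bit i = 1 means network i
    joins the ensemble, whose output is the simple average of members.
    """
    rng = np.random.default_rng(seed)
    n_nets = member_preds.shape[0]

    def fitness(bits):
        if not bits.any():                      # empty subset is invalid
            return np.inf
        ens = member_preds[bits.astype(bool)].mean(axis=0)
        return np.mean((ens - y_val) ** 2)      # validation MSE

    pos = rng.integers(0, 2, size=(n_particles, n_nets))
    vel = rng.uniform(-1, 1, size=(n_particles, n_nets))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    g = np.argmin(pbest_fit)
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for _ in range(n_iter):
        r1 = rng.random((n_particles, n_nets))
        r2 = rng.random((n_particles, n_nets))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # Binary PSO: sigmoid(velocity) gives the probability of bit = 1.
        pos = (rng.random((n_particles, n_nets))
               < 1.0 / (1.0 + np.exp(-vel))).astype(int)
        fit = np.array([fitness(p) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        g = np.argmin(pbest_fit)
        if pbest_fit[g] < gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest.astype(bool), gbest_fit
```

The returned boolean mask identifies the selected networks; averaging their predictions gives the final ensemble output.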
Secondly, clustering-based selective algorithms for constructing neural network ensembles are investigated, in which neural networks are clustered according to similarity and the most accurate individual network in each cluster is selected to make up the ensemble. Traditional k-means clustering is of limited use here because of its strict requirements on the data distribution, whereas spectral clustering (SC) imposes no prerequisites on the global structure of the data. A selective optimum neural network ensemble based on spectral clustering is therefore proposed: mutual information is used to measure the diversity between individual networks, and the grouping relationships among them are preserved as much as possible in a low-dimensional spectral representation, which improves the predictive accuracy of the resulting ensemble.
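The clustering-and-selection step might look as follows; this is a sketch under stated assumptions rather than the dissertation's algorithm. It estimates pairwise mutual information by discretizing the networks' validation outputs into histogram bins, builds a Ng-Jordan-Weiss style spectral embedding from the resulting affinity matrix, and keeps the most accurate network of each cluster (`spectral_select` and its parameters are invented for illustration).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import mutual_info_score

def spectral_select(member_preds, y_val, k=5, n_bins=10, seed=0):
    """Cluster networks by mutual-information similarity, keep one per cluster.

    member_preds: (n_nets, n_samples) validation outputs of trained
                  networks (assumed given); requires n_nets >= k.
    Returns indices of the most accurate network in each of the k clusters.
    """
    n_nets = member_preds.shape[0]
    # Discretize continuous outputs so a histogram-based MI estimate applies.
    binned = np.stack([np.digitize(p, np.histogram_bin_edges(p, n_bins))
                       for p in member_preds])
    W = np.zeros((n_nets, n_nets))
    for i in range(n_nets):
        for j in range(n_nets):
            W[i, j] = mutual_info_score(binned[i], binned[j])
    np.fill_diagonal(W, 0.0)          # NJW uses a zero-diagonal affinity

    # Spectral embedding: rows of the top-k eigenvectors of the normalized
    # affinity D^{-1/2} W D^{-1/2}, renormalized to unit length.
    d = np.maximum(W.sum(axis=1), 1e-12)
    L = W / np.sqrt(np.outer(d, d))
    _, vecs = np.linalg.eigh(L)       # eigenvalues in ascending order
    U = vecs[:, -k:]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)

    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(U)
    errs = np.mean((member_preds - y_val) ** 2, axis=1)
    return [int(np.where(labels == c)[0][np.argmin(errs[labels == c])])
            for c in range(k)]
```

High mutual information is treated here as high similarity, so networks that carry nearly the same information land in the same cluster and only one of them survives the selection.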
Thirdly, an idea named "ensemble of ensembles" (EoE) is proposed. Unlike an ordinary neural network ensemble, an ensemble of neural network ensembles is a two-layered architecture that employs weighted neural network ensembles as its individuals. Its advantage is that individual diversity can be manipulated by adjusting the weights of the weighted ensembles rather than the architecture or functions of the networks. Two approaches based on this idea, EoE-MIL (Ensemble of neural network Ensembles based on Minimum Information Loss) and EoE-AI (Ensemble of neural network Ensembles based on mAximum Independence), are designed and implemented. In EoE-MIL, following the principle of minimum information loss, the neural networks are linearly combined using the eigenvectors corresponding to the principal eigenvalues of the covariance matrix to form the individuals of the EoE, and the linear independence of the eigenvectors guarantees the diversity among those individuals. In EoE-AI, the Kullback-Leibler information distance is used to measure the statistical independence of the individuals, and weighted combinations of the networks given by the eigenvectors of the principal eigenvalues of the correlation matrix serve as the individuals according to the maximum-independence principle. Both approaches reduce predictive error while also showing some ability to select a model suited to the particular problem.
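A minimal sketch of the EoE-MIL construction under these assumptions: the eigenvectors belonging to the largest eigenvalues of the covariance matrix of the member networks' outputs supply the combination weights, and their mutual orthogonality is what keeps the combined individuals diverse. How the weights are normalized and how the resulting individuals are finally aggregated are not specified in the abstract, so the sketch leaves the weights as unit-norm eigenvectors and simply returns both.

```python
import numpy as np

def eoe_mil(member_preds, n_components=3):
    """Ensemble of ensembles via minimum information loss (a sketch).

    member_preds: (n_nets, n_samples) outputs of the trained networks
                  on some data set (assumed given).
    Each retained eigenvector of the output covariance matrix supplies
    the weights of one linear combination of networks, i.e. one
    individual of the EoE; orthogonal eigenvectors keep these
    individuals diverse.
    """
    centered = member_preds - member_preds.mean(axis=1, keepdims=True)
    cov = centered @ centered.T / member_preds.shape[1]
    _, vecs = np.linalg.eigh(cov)                # ascending eigenvalues
    weights = vecs[:, ::-1][:, :n_components].T  # top eigenvectors as rows
    individuals = weights @ member_preds         # one weighted ensemble per row
    return individuals, weights
```

The final EoE prediction could then aggregate the rows of `individuals`, for instance by a simple or weighted average; an EoE-AI variant would derive the weights from the correlation matrix instead, guided by the Kullback-Leibler distance between individuals.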
Furthermore, the diversity of neural network ensembles is also discussed.
Future work includes deeper theoretical study of selective ensembles, the exploration of new and more powerful construction algorithms, and the extension of selective methods to new application domains. As ensemble learning theory matures further and new methods emerge, the ideas and methods of selective ensemble will play an ever greater role in an ever wider range of fields.
