Research on the Computational Complexity of Evolutionary Algorithms
Abstract
Evolutionary algorithms (EAs) are a large class of stochastic optimization algorithms inspired by the theory of evolution. They apply the Darwinian idea of "survival of the fittest" to optimization problems and have received broad and lasting attention in computer science. EAs emerged in the 1960s, flourished in the 1980s and 1990s, and eventually formed a branch of artificial intelligence: evolutionary computation. On the whole, evolutionary computation is an empirical discipline; the vast majority of its work rests on simulation experiments and practical applications, because the complex stochastic behavior of these algorithms makes rigorous theoretical study (computational complexity analysis in particular) hard to carry out. To better understand the stochastic behavior of EAs and thereby design more efficient ones, this thesis studies the computational complexity of EAs from several directions, with the aim of strengthening the theoretical foundations of the field: we consider not only classic EAs (genetic algorithms) that are close to practice, but also novel EAs popular in academia; not only EAs for traditional static optimization problems, but also EAs for dynamic optimization problems; not only the theory itself, but also the practical implications of the theoretical results. The main contributions of this thesis include:
     (1) A study of the computational complexity of population-based classic EAs. Early theoretical work often focused on simple but impractical algorithm models, which are quite far from the classic EAs used in real applications. This thesis instead considers population-based classic EAs that are closer to practice in both form and mechanism, proposes a novel and systematic approach to their time complexity analysis, and applies it to classic EAs with different population sizes on several unimodal and multimodal optimization problems. This work gives, for the first time in evolutionary computation, a dedicated approach to analyzing the time complexity of (N+N) classic EAs (whose parent and offspring populations have equal size), and thereby a theoretical analysis of how population size affects the performance of population-based EAs.
     (2) A study of the computational complexity of a novel type of EA, estimation of distribution algorithms (EDAs). Unlike classic EAs, which evolve populations directly, EDAs evolve populations indirectly by adjusting a probabilistic model online. In the roughly two decades since EDAs emerged, the vast majority of studies have been limited to empirical observation; although a few researchers have considered the convergence of population-based EDAs, no one had successfully carried out a rigorous theoretical analysis of their time complexity. This thesis proposes a general approach to the time complexity analysis of EDAs, applies it to the univariate marginal distribution algorithm (UMDA, an instance of EDAs), and verifies theoretically that "relaxing" the probabilistic model is both necessary and effective. This work gives, for the first time in evolutionary computation, a general approach to the computational complexity analysis of EDAs together with a rigorous time complexity analysis of a population-based EDA, a breakthrough contribution to the computational complexity study of EDAs.
     (3) A study of how time-variable mutation rate schemes (in which the mutation rate may change as the algorithm runs) affect the performance of classic EAs on dynamic optimization problems. Many researchers in evolutionary computation expect adaptive and self-adaptive mutation rate schemes to improve algorithm performance. Our computational complexity analysis shows, however, that on certain dynamic optimization problems no time-variable mutation rate scheme is essentially better than a fixed mutation rate. Based on this result, the thesis suggests that, in the absence of prior knowledge, EAs solving dynamic optimization problems should avoid self-adaptive and other variable mutation rate schemes and instead, following Occam's razor, use a simple fixed mutation rate. This work is the first systematic analysis of how arbitrary time-variable (e.g., self-adaptive) mutation rates affect computational complexity; it substantially strengthens the theoretical foundations of evolutionary dynamic optimization and deepens the understanding of the relation between EA performance and the mutation rate.
Evolutionary Algorithms (EAs) are a class of stochastic optimization algorithms following Darwin's principle of "survival of the fittest". EAs have received extensive investigation since the 1960s, especially in the 1980s and 1990s. Years of effort spent designing and studying various EAs have established a branch of artificial intelligence, Evolutionary Computation (EC). Generally speaking, EC is considered a branch of computer science that relies mainly on empirical investigation, because carrying out rigorous theoretical analysis of the complex stochastic behaviors of EAs is very difficult. To gain more insight into the stochastic behaviors of EAs so as to design efficient algorithms, this thesis focuses on the computational complexity analysis of EAs. Concretely, we study the computational complexity of EAs from broad perspectives: we consider not only the classic EAs (Genetic Algorithms) that are close to practical algorithms, but also a novel type of EA that is popular in academia; we study EAs not only on traditional static optimization problems, but also on dynamic optimization problems; we are concerned not only with the theoretical investigation of the algorithms, but also with the practical implications of the theoretical results. The main contributions of this thesis are summarized as follows:
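As a concrete, heavily simplified illustration of the kind of algorithm studied in such runtime analyses, the sketch below implements a (1+1) EA on the OneMax benchmark. The function names and parameters are illustrative, not the thesis's own notation.

```python
import random

def one_max(x):
    """OneMax fitness: the number of 1-bits; the optimum is the all-ones string."""
    return sum(x)

def one_plus_one_ea(n, max_steps=200000, seed=0):
    """A (1+1) EA: one parent, one offspring per step, standard bitwise
    mutation with rate 1/n, and elitist ("survival of the fittest")
    selection. Returns the first hitting time of the optimum, or None
    if the step budget is exhausted."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for step in range(1, max_steps + 1):
        # flip each bit independently with probability 1/n
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        if one_max(y) >= one_max(x):  # accept the offspring if not worse
            x = y
        if one_max(x) == n:
            return step
    return None
```

On OneMax, the expected first hitting time of this algorithm is known to be Θ(n log n); the complexity of far less transparent algorithm/problem pairs is what the thesis analyzes.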
     (1) We study the computational complexity of population-based classic EAs. As shown in the first chapter, early theoretical time complexity investigations often focused on simplified but impractical EAs. Unlike these studies, we concentrate on population-based EAs that are closer to the practical algorithms utilized in real-world applications. To be specific, we propose a novel approach for analyzing the time complexities of population-based EAs, which is then utilized to analyze classic EAs with different population sizes on both unimodal and multimodal optimization problems. In the EC field, this is the first time that an approach dedicated to the time complexity analysis of (N+N) classic EAs (where the parent population size equals the offspring population size) has been proposed. It is also the first time that the impact of different population sizes on the performance of EAs has been investigated theoretically.
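The (N+N) scheme described above can be sketched as follows. This toy version (OneMax as the benchmark, truncation selection, illustrative names) only shows the parent/offspring structure, not the exact algorithms analyzed in the thesis.

```python
import random

def n_plus_n_ea(n, pop_size, max_gens=50000, seed=0):
    """Sketch of an (N+N) EA on OneMax: N parents create N offspring by
    bitwise mutation (rate 1/n); the best N of the combined 2N individuals
    survive to the next generation. Returns the first generation at which
    the optimum appears, or None."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for gen in range(1, max_gens + 1):
        offspring = [[1 - b if rng.random() < 1.0 / n else b for b in p]
                     for p in pop]
        # (N+N) truncation selection over parents plus offspring
        pop = sorted(pop + offspring, key=sum, reverse=True)[:pop_size]
        if sum(pop[0]) == n:
            return gen
    return None
```

Varying `pop_size` in such a sketch is the empirical counterpart of the theoretical question the thesis answers: how the population size influences the time complexity.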
     (2) We study the computational complexity of a novel type of EA named Estimation of Distribution Algorithms (EDAs). Unlike EAs, which evolve populations directly, EDAs evolve populations indirectly by employing online-adjustable probabilistic models. Over the past two decades, various EDAs have been proposed and studied; however, most investigations were carried out on the basis of empirical observations. Although some scholars have considered the convergence properties of EDAs, no investigation had successfully analyzed the time complexity of population-based EDAs through rigorous mathematical arguments. In this thesis, we propose a general approach to analyzing the time complexity of EDAs. Following this approach, we analyze the time complexity of an instance of EDAs, the Univariate Marginal Distribution Algorithm (UMDA). In addition to several time complexity results, it is discovered that the relaxation of the probabilistic model of UMDA is effective for promoting the performance of UMDA, and is necessary for avoiding premature convergence of the probabilistic model. Our investigation makes exceptional contributions to the computational complexity study of EDAs: this is the first time that a general approach has been proposed for analyzing the computational complexities of population-based EDAs, and also the first time that the time complexity of a population-based EDA has been analyzed rigorously.
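A minimal sketch of UMDA with a "relaxed" (margin-clamped) probabilistic model, again on OneMax, may help make the mechanism concrete. The margin interval [1/n, 1 − 1/n] is a common choice in the runtime-analysis literature; all names and parameter values here are illustrative, not the thesis's exact setting.

```python
import random

def umda_one_max(n, pop_size=100, mu=50, max_gens=5000, seed=0):
    """Sketch of UMDA with margins on OneMax. Each generation samples
    pop_size individuals from the product distribution p, selects the
    best mu of them, re-estimates the bitwise marginals, and clamps
    ("relaxes") each marginal into [1/n, 1 - 1/n] so that no bit
    probability can fixate at 0 or 1. Returns the first generation at
    which the optimum is sampled, or None."""
    rng = random.Random(seed)
    p = [0.5] * n  # initial probabilistic model: uniform marginals
    for gen in range(1, max_gens + 1):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)
        if sum(pop[0]) == n:
            return gen
        best = pop[:mu]  # truncation selection
        # re-estimate marginals from the selected individuals, with margins
        p = [min(1 - 1.0 / n, max(1.0 / n, sum(ind[i] for ind in best) / mu))
             for i in range(n)]
    return None
```

Without the clamping step, a marginal can fixate at 0, after which the corresponding bit can never again be sampled as 1; this is exactly the premature convergence that the relaxation is shown to prevent.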
     (3) We study the impact of time-variable mutation rate schemes on the performance of a classic EA in the context of dynamic optimization problems. In the EC field, many researchers expect that time-variable mutation rate schemes can promote the performance of classic EAs. However, according to our computational complexity analysis, on certain dynamic optimization problems no time-variable mutation rate scheme is significantly better than a fixed mutation rate. In light of this result, this thesis suggests that when solving dynamic optimization problems in the absence of prior knowledge, it is better to give up self-adaptive mutation rate schemes and to follow the well-known Occam's razor by utilizing a fixed mutation rate. In the EC field, this is the first time that general results covering any time-variable (e.g., self-adaptive) mutation rate scheme have been given and proven rigorously, which substantially increases the understanding of the relationship between time-variable mutation rate schemes and the performance of EAs, and of the role of mutation in the context of dynamic optimization problems.
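To make the notion of a "time-variable mutation rate scheme" concrete, the sketch below runs a (1+1) EA on a dynamic OneMax variant whose hidden target re-randomizes periodically; the mutation rate is passed in as a function of time, so a fixed rate and any time-variable rate plug into the same loop. The dynamic problem and the example schemes are illustrative, not the ones analyzed in the thesis.

```python
import random

def track_dynamic_one_max(n, rate_scheme, period=100, steps=2000, seed=0):
    """(1+1) EA on a dynamic OneMax variant: fitness is the agreement with
    a hidden target string that is re-randomized every `period` steps.
    `rate_scheme(t)` gives the mutation rate at step t. Returns how many
    steps the current optimum was held."""
    rng = random.Random(seed)
    target = [rng.randint(0, 1) for _ in range(n)]
    x = [rng.randint(0, 1) for _ in range(n)]

    def fit(z):
        return sum(int(a == b) for a, b in zip(z, target))

    hits = 0
    for t in range(1, steps + 1):
        if t % period == 0:  # the environment changes
            target = [rng.randint(0, 1) for _ in range(n)]
        r = rate_scheme(t)
        y = [1 - b if rng.random() < r else b for b in x]
        if fit(y) >= fit(x):
            x = y
        if fit(x) == n:
            hits += 1
    return hits

# a fixed rate versus one arbitrary, illustrative time-variable scheme
fixed_hits = track_dynamic_one_max(15, lambda t: 1.0 / 15)
varying_hits = track_dynamic_one_max(15, lambda t: min(0.5, (1 + t % 10) / 15))
```

Comparing such schemes empirically is the experimental counterpart of the theoretical comparison above; the thesis's point is that, on certain dynamic problems, no choice of `rate_scheme` yields an essential advantage over a fixed rate.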
