Research on Particle Swarm Optimization in Optimization Problems
Abstract
Optimization technology is an applied technique, grounded in mathematics, for solving various combinatorial optimization problems. Optimization problems arise frequently in engineering, scientific research, economic management, and many other fields: given a set of constraints, the task is to find a set of parameter values that maximizes or minimizes an objective function. Optimization problems can be divided into many types according to the nature of the objective function and constraints and the range of the optimization variables; for example, depending on whether the objective function and constraints are all linear expressions, a problem is classified as a linear programming problem or a nonlinear programming problem. Many different methods have been proposed for different optimization problems, such as Newton's method, the conjugate gradient method, the Polak-Ribière method, and the method of Lagrange multipliers. These are mature local optimization algorithms that can find local optima of a problem very effectively.
     However, as the scope of human activity and of our understanding and transformation of the world expands, it has become clear that, because of complexity, constraints, nonlinearity, and modeling difficulties, analytical optimization algorithms can no longer meet practical needs; an optimization algorithm suited to large-scale parallel computation and endowed with intelligent features is required.
     Modern evolutionary-type methods such as artificial neural networks, genetic algorithms, tabu search, simulated annealing, and the ant colony algorithm show great potential for solving large-scale problems: they can approach good feasible solutions of an optimization problem within a reasonable time limit. Among them, the genetic algorithm and the ant colony algorithm are known as intelligent optimization algorithms; their basic idea is to construct stochastic optimization algorithms by simulating the behavior of living things in nature.
     In recent years another intelligent optimization algorithm, particle swarm optimization (PSO), has attracted growing attention from researchers. PSO was proposed in 1995 by the American social psychologist James Kennedy and the electrical engineer Russell Eberhart; it was inspired by the social behavior of bird flocks and draws on the biological swarm model of the biologist Frank Heppner. It treats massless, volumeless particles as individuals, prescribes simple social behavior rules for each particle, and searches for the optimal solution of a problem through cooperation among the individuals of the swarm. Because the algorithm converges quickly, has few parameters to set, is easy to implement, and can solve complex optimization problems effectively, it has been widely applied to function optimization, neural network training, image processing, pattern recognition, and a number of engineering fields.
     Although PSO has been developing for more than a decade, it is not yet mature in either theory or practice. Like other global optimization algorithms, PSO has shortcomings such as a tendency to become trapped in local extrema, slow convergence in the late stage of evolution, and limited accuracy. How to accelerate the convergence of PSO and improve its convergence accuracy has therefore been a central concern of most researchers. Measures for accelerating convergence mainly involve choosing optimal algorithm parameters and combining PSO with other optimization algorithms to revise its main framework. For improving convergence accuracy and preventing premature convergence, the main measures are to maintain the diversity of the swarm or to introduce mechanisms for escaping local optima. Existing improved variants include the fuzzy adaptive PSO algorithm (FAPSO), the hybrid PSO algorithm (HPSO), the discrete binary PSO algorithm, the cooperative PSO algorithm, and the immune PSO algorithm.
     On the basis of a survey of PSO and its development, this thesis studies and analyzes the existing literature and proposes two improved algorithms, one for continuous problems and one for discrete problems.
     In the improved algorithm for continuous problems, an unconstrained random mutation operation replaces the inertia term in the velocity formula, and the best particle in a particle's neighborhood is allowed to influence the particle's behavior only conditionally; this increases the diversity among particles and thus improves the performance of the algorithm. Taking function optimization as the main example, the effectiveness of the algorithm is demonstrated by tests on several classical benchmark functions such as Sphere, Rosenbrock, and Griewank.
     Although PSO has been widely applied to the optimization of continuous problems, its use for discrete optimization problems is still a new attempt. In the analysis of discrete problems, this thesis takes the optimal nesting of rectangular parts as a concrete example and proposes an improved algorithm for discrete problems. The algorithm modifies the decoding scheme and incorporates the crossover and mutation ideas of the genetic algorithm, so that it reaches the optimization goal quickly.
     Finally, through the analysis of these two improved algorithms, several improvement strategies for PSO are identified. Applying these strategies yields better optimization results for both continuous and discrete problems. The strategies are as follows:
     Conditionally adding the influence of the best neighboring particle to a particle's behavior increases the diversity among particles.
     Adding a mutation operation: apply a mutation operation to every newly generated particle, using different mutation strategies to mutate the particles.
     Defining a threshold and updating the particles with different update strategies according to it.
     In summary, the thesis presents a fairly comprehensive and in-depth analysis and discussion of PSO and adopts several improvement strategies so that the algorithm can be applied effectively to both continuous and discrete problems. Finally, the thesis summarizes the work and points out directions for further research.
Optimization technology is based on mathematics and can solve various combinatorial optimization problems. Many problems, especially in engineering, scientific research, and economic management, involve a set of parameters to be optimized: optimization means looking for a set of parameter values, subject to given constraints, that minimizes or maximizes the objective function. According to the nature of the objective function and constraints and the range of the variables, optimization problems can be divided into many types. For example, if the objective function and all constraints are linear expressions, the problem is a linear programming problem; otherwise it is a nonlinear programming problem. Different methods have been presented to solve different kinds of problems, such as Newton's method, the conjugate gradient method, the Polak-Ribière method, and the method of Lagrange multipliers. These methods can reliably find local extrema of such problems.
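     For reference, the general constrained optimization problem described above can be written as

     \min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad g_i(x) \le 0,\; i = 1,\dots,m; \qquad h_j(x) = 0,\; j = 1,\dots,p,

     and it is a linear programming problem exactly when f, all g_i, and all h_j are linear (affine) functions; otherwise it is a nonlinear programming problem.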
     However, as human activity and the scope of understanding and transforming the world expand, people have found that, because of complexity, constraints, nonlinearity, and modeling difficulties, it is often impossible to find a satisfactory analytical solution. It is necessary to find an optimization algorithm suited to large-scale parallel computation and equipped with intelligent features.
     Modern evolutionary methods such as artificial neural networks, genetic algorithms, tabu search, simulated annealing, and the ant colony algorithm show strong potential for solving large-scale problems: they can approximate good feasible solutions of an optimization problem within a reasonable amount of time. The genetic algorithm and the ant colony algorithm are known as intelligent optimization algorithms; their basic idea is to construct stochastic optimization algorithms by simulating the behavior of the natural world.
     In recent years, another intelligent optimization algorithm, particle swarm optimization (PSO), has attracted increasing attention from scholars. PSO was proposed in 1995 by the American social psychologist James Kennedy and the electrical engineer Russell Eberhart; it was inspired by the social behavior of bird flocks and uses the biological group model of the biologist Frank Heppner. It treats massless, volumeless particles as individuals, provides simple social rules of conduct for each particle, and searches for the optimal solution of the problem through cooperation among individuals in the population. The algorithm converges quickly, needs few parameters, is easily implemented, and can effectively solve complex optimization problems. It has been widely used in function optimization, neural network training, image processing, pattern recognition, and some engineering fields.
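     In the canonical form of the algorithm (with the inertia weight later introduced by Shi and Eberhart), each particle i updates its velocity and position in every dimension d as

     v_{id}^{t+1} = \omega v_{id}^{t} + c_1 r_1 (p_{id} - x_{id}^{t}) + c_2 r_2 (p_{gd} - x_{id}^{t}), \qquad x_{id}^{t+1} = x_{id}^{t} + v_{id}^{t+1},

     where p_i is the best position found so far by the particle itself, p_g the best position found by the swarm (or by the particle's neighborhood), \omega the inertia weight, c_1 and c_2 acceleration coefficients, and r_1, r_2 uniform random numbers in [0, 1]. The improvements discussed below modify parts of this update rule.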
     Although the PSO algorithm has been developing for more than ten years, it is not yet mature in theory or in practice. Like other global optimization algorithms, PSO has shortcomings such as easily falling into local optima, slow convergence in the late evolutionary stage, and poor accuracy. How to accelerate the convergence of the particle swarm algorithm and improve its convergence accuracy are the topics on which most researchers have focused. There are mainly two measures for accelerating convergence: choosing the best algorithm parameters, and combining PSO with other optimization algorithms to amend its main framework. For improving accuracy and avoiding the common defect of premature convergence, the main measures are to maintain the diversity of the population and to introduce mechanisms for jumping out of local optima. Several improved PSO algorithms already exist, such as the fuzzy adaptive PSO algorithm (FAPSO), the hybrid PSO algorithm (HPSO), the discrete binary PSO algorithm, the cooperative PSO algorithm, and the immune PSO algorithm.
     In this thesis, the PSO algorithm is surveyed, the existing literature is analyzed, and two improved algorithms are then presented, one for continuous problems and one for discrete problems.
     For continuous problems, the improved algorithm uses an unconstrained random mutation operation in place of the inertia (previous-velocity) term, and lets the best position of a neighboring particle influence the particle's behavior only conditionally. This efficiently increases the diversity of the particles and improves the performance of the algorithm. Taking function optimization as an example, the algorithm is tested on several classic benchmark functions, such as Sphere, Rosenbrock, and Griewank, to illustrate its effectiveness.
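     The abstract does not give the exact operators, so the following is only a minimal sketch of one velocity update in the spirit of this modification; the mutation distribution, the condition for using the neighborhood best, and every parameter value (mutation_scale, neighbor_prob, c1, c2) are illustrative assumptions, not the thesis's actual settings.

import random

def improved_velocity(x, pbest, nbest, gbest, c1=1.5, c2=1.5,
                      mutation_scale=0.1, neighbor_prob=0.5):
    # One velocity update of the modified continuous PSO (hypothetical sketch).
    new_v = []
    for d in range(len(x)):
        # An unconstrained random mutation term replaces the inertia term w*v.
        mutation = random.gauss(0.0, mutation_scale)
        # Cognitive attraction towards the particle's own best position.
        cognitive = c1 * random.random() * (pbest[d] - x[d])
        # The neighborhood best guides the particle only conditionally
        # (here with a fixed probability); otherwise the global best is used.
        guide = nbest[d] if random.random() < neighbor_prob else gbest[d]
        social = c2 * random.random() * (guide - x[d])
        new_v.append(mutation + cognitive + social)
    return new_v

# Sphere benchmark used in the experiments: f(x) = sum(x_i^2), minimum at 0.
def sphere(x):
    return sum(xi * xi for xi in x)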
     Although PSO has been applied to continuous problems for several years, using it for discrete problems is still a new attempt. An improved particle swarm optimization algorithm is proposed for the rectangle packing (nesting) problem, which is a discrete problem. The algorithm modifies the decoding scheme and incorporates the crossover and mutation operators of the genetic algorithm in order to speed up the optimization process.
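     Again only as a sketch: a common way to realize such a discrete PSO is to encode each particle as a packing order (a permutation of the rectangles) and to replace the arithmetic velocity update with GA-style crossover towards the personal and global best plus swap mutation. The operator choices and probabilities below are assumptions for illustration, not necessarily the scheme used in the thesis, and the decoding of a permutation into a layout is omitted.

import random

def order_crossover(parent, guide):
    # Copy a random slice of the guide; fill the remaining slots in parent order.
    n = len(parent)
    i, j = sorted(random.sample(range(n), 2))
    segment = guide[i:j]
    rest = [g for g in parent if g not in segment]
    return rest[:i] + segment + rest[i:]

def update_permutation_particle(perm, pbest, gbest, pc=0.8, pm=0.2):
    # One position update of a permutation-encoded particle (hypothetical sketch).
    if random.random() < pc:
        perm = order_crossover(perm, pbest)   # move towards the personal best
    if random.random() < pc:
        perm = order_crossover(perm, gbest)   # move towards the global best
    if random.random() < pm:
        a, b = random.sample(range(len(perm)), 2)
        perm[a], perm[b] = perm[b], perm[a]   # swap mutation keeps it a permutation
    return perm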
     Finally, through the analysis of these two improved PSO algorithms, several improvement strategies have been identified. Whether the problem is continuous or discrete, using these strategies leads to better optimization. The improved strategies are as follows:
     Conditionally letting the best position of a neighboring particle influence a particle's behavior improves the diversity of the particles.
     Adding mutation: apply a mutation operation to each newly generated particle, and use different mutation strategies to mutate the particles.
     Defining a threshold and updating the particles with different update strategies in different stages, as in the sketch after this list.
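     As one hedged illustration of the last two strategies (the actual threshold quantity and the mutation operators are not specified in the abstract and are assumed here), the mutation applied to every new particle could switch between a coarse and a fine operator depending on a threshold on the search progress:

import random

def mutate(particle, progress, threshold=0.5, coarse=0.5, fine=0.05):
    # 'progress' is the fraction of generations already run, in [0, 1].
    if progress < threshold:
        # Exploration phase: large uniform perturbation keeps the swarm diverse.
        return [xi + random.uniform(-coarse, coarse) for xi in particle]
    # Exploitation phase: small Gaussian perturbation refines the solution.
    return [xi + random.gauss(0.0, fine) for xi in particle]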
     In short, this thesis presents a fairly comprehensive and in-depth analysis and discussion of the particle swarm algorithm, and several improvement strategies are applied to continuous and discrete problems. Finally, the whole research content is summarized, and further research directions are indicated.