Research on Fast Swarm Intelligence Optimization Algorithms
Abstract
To address the excessive time cost of the classic genetic algorithm and particle swarm optimization when solving optimization problems with complex fitness functions, this thesis studies the fitness estimation problem of swarm intelligence optimization algorithms in depth and improves the above algorithms with the affinity propagation clustering algorithm and support vector regression. The main contributions are:
     1. A genetic algorithm based on affinity propagation clustering is proposed: AP clustering first groups chromosomes with similar encodings, and the fitness of each cluster-center chromosome, together with the clustering information, is used to estimate the fitness of the remaining chromosomes, reducing the number of fitness evaluations and speeding up the GA;
     2. Based on the schema theory of GA, a fast genetic algorithm is proposed: building on the AP-clustering-based GA, a schema discovery method is introduced and used to further correct the estimated fitness values, improving not only the running speed of the GA but also the accuracy of its fitness estimates;
     3. Support vector regression is introduced into the GA: in each iteration, chromosomes with known true fitness train an SVR model, which then predicts the fitness of other chromosomes, reducing the number of fitness evaluations; this algorithm suits optimization problems whose fitness cannot be expressed by an explicit function;
     4. The above fitness estimation strategies are extended to PSO, yielding a PSO based on AP clustering, a fast PSO, and an SVR-based PSO; all of the improved algorithms significantly reduce the number of fitness evaluations of the classic algorithms;
     5. The above algorithms are validated on standard benchmark functions; the experimental results show that the proposed algorithms exceed the classic GA and PSO in speed as well as in the accuracy and stability of the optimization results;
     6. The fast PSO is applied to history matching of reservoir data, the fast GA to dome structure optimization under static load, and the SVR-based GA to predicting supercoiling sites on the E. coli chromosome. The results on these practical problems show that the proposed algorithms significantly reduce the number of fitness evaluations and converge stably, providing a new way to apply swarm intelligence optimization to problems with time-consuming fitness computations.
The optimization problem is one of the most important branches of applied mathematics and operations research. It originates from an ancient question: finding the minimum or maximum value of a function or problem. In simple terms, an optimization problem asks how to find the best solution to a problem. In the second half of the 20th century, with great advances in society and science, optimization theory became a discipline in its own right. Driven by the rapid development of computer technology, it has been applied in many areas, such as car body structure design, aircraft cabin optimization, history matching of reservoir data, and distribution logistics.
     Optimization algorithms for solving optimization problems have been a research focus of mathematics for centuries. Since Newton proposed Newton's method, a kind of gradient descent method, in the 17th century, many great mathematicians have dedicated themselves to devising newer and more efficient optimization algorithms, and a number of excellent methods were invented, for example the conjugate gradient method, quasi-Newton methods, trust region methods, and the least squares method. Because these classic optimization algorithms are fast and efficient, they have been widely used.
     Recently, optimization algorithms have attracted more and more attention. In 2011, CCW Research published a market report titled "2010-2011 applied market research report of information optimization algorithms". According to the report, over the preceding three years optimization algorithms had been used and studied in universities, research institutes, enterprises, and government. Procurement of optimization algorithms increased every year, the market grew by more than 35% annually, and it was expected to reach 230 million Yuan in 2013. Optimization algorithms are therefore not only important for research but also bring substantial economic benefits.
     However, the global search ability of classic optimization algorithms is poor: they easily fall into local minima and converge prematurely, and the initial state of the optimization problem can determine whether they find the optimal solution at all. As a result, users of classic optimization algorithms need a good background in mathematics and a priori knowledge of the problem, which greatly limits the application of these algorithms. Since the 1950s, with the rapid development of computer technology and bionics, swarm intelligence optimization theory has become a research focus, and many swarm intelligence optimization algorithms have been proposed, such as genetic algorithms (GA), particle swarm optimization (PSO), and ant colony optimization (ACO).
     Swarm intelligence optimization algorithms mimic the group behaviors of natural organisms and have the following advantages over classic optimization algorithms: they perform a global search and do not depend on the initial state; they can be applied widely without prior knowledge of the optimization problem; their ideas and implementations are simple and their steps standardized, so they are easy to combine with other algorithms; and, being based on swarm intelligence theory, they have very good potential for parallelism.
     Swarm intelligence optimization algorithms share a common feature: fitness values are used to exchange information within the population and to guide the population toward the optimal solution. A large number of fitness values must therefore be calculated in order to find the optimal solution or an approximation of it. When computing the fitness is complex, however, the time cost of these algorithms becomes too large; what is more, the fitness of real-world optimization problems is often difficult to calculate at all. To address this problem, a series of solutions is proposed in this thesis.
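The cost structure described above can be made concrete with a minimal sketch: in a classic GA every individual in every generation triggers one fitness call, so an expensive fitness function dominates the runtime. The operators and parameters below are generic illustrations, not the thesis's implementation.

```python
import random

def classic_ga(fitness, n_bits=16, pop_size=20, generations=30, seed=0):
    """Minimal bit-string GA (maximization).  Returns (best_fitness,
    fitness_call_count): every individual is evaluated once per
    generation, so the expensive calls total pop_size * generations."""
    rng = random.Random(seed)
    calls = 0
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best_f = float("-inf")
    for _ in range(generations):
        scored = []
        for ind in pop:
            calls += 1                       # one expensive evaluation each
            scored.append((fitness(ind), ind))
        scored.sort(key=lambda t: t[0], reverse=True)
        best_f = max(best_f, scored[0][0])
        parents = [ind for _, ind in scored[: pop_size // 2]]
        nxt = []
        while len(nxt) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.05:          # bit-flip mutation
                child[rng.randrange(n_bits)] ^= 1
            nxt.append(child)
        pop = nxt
    return best_f, calls

best, n_calls = classic_ga(sum)              # OneMax: count the 1-bits
```

With a cheap fitness such as OneMax the 600 calls above are free; when each call launches a reservoir simulator or a finite element analysis, this product of population size and generations is exactly the cost the thesis attacks.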
     Inspired by the k-means clustering based genetic algorithm (KMGA) and the fitness-inheritance based genetic algorithm (IGA), a genetic algorithm based on affinity propagation clustering (APGA) is first proposed. The basic idea of APGA is as follows: before the fitness calculation in the GA, AP clustering groups similar chromosomes, and the fitness values are then estimated from the clustering information. A fitness estimation strategy is also proposed: the fitness of a chromosome is estimated from the fitness of its cluster center and its distance to that center, with a regulation factor introduced to prevent oscillation in the estimates. Inheriting the good features of AP clustering, namely stability, efficiency, and automatic determination of the number of clusters, APGA overcomes the dependence on the initial state and the large fitness estimation errors of KMGA and IGA. Experiments show that, given the same maximum number of iterations, APGA calculated the fewest fitness values and its results were close to those of the classic GA (CGA); given the same number of fitness evaluations, its optimization accuracy and stability were better than those of the other algorithms, and its fitness estimation error was lower as well.
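A rough sketch of the APGA idea, under stated assumptions: a tiny pure-Python affinity propagation (negative squared Euclidean similarity, median preference) picks exemplars, only the exemplars pay for a true fitness evaluation, and the rest are estimated from their exemplar's fitness and their distance to it. The linear distance penalty and the regulation factor `eta` are illustrative guesses at the strategy's shape, not the thesis's exact formula.

```python
import math

def affinity_propagation(points, damping=0.5, iters=100):
    """Tiny affinity propagation (Frey and Dueck, 2007): similarity is
    negative squared Euclidean distance, preference is the median
    similarity.  Returns a dict mapping each point index to its exemplar."""
    n = len(points)
    s = [[-sum((a - b) ** 2 for a, b in zip(points[i], points[k]))
          for k in range(n)] for i in range(n)]
    flat = sorted(s[i][k] for i in range(n) for k in range(n) if i != k)
    for i in range(n):
        s[i][i] = flat[len(flat) // 2]       # median preference
    r = [[0.0] * n for _ in range(n)]
    a = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        for i in range(n):                   # responsibility updates
            for k in range(n):
                m = max(a[i][kk] + s[i][kk] for kk in range(n) if kk != k)
                r[i][k] = damping * r[i][k] + (1 - damping) * (s[i][k] - m)
        for i in range(n):                   # availability updates
            for k in range(n):
                pos = sum(max(0.0, r[ii][k])
                          for ii in range(n) if ii not in (i, k))
                new = pos if i == k else min(0.0, r[k][k] + pos)
                a[i][k] = damping * a[i][k] + (1 - damping) * new
    exemplars = [k for k in range(n) if r[k][k] + a[k][k] > 0] or [0]
    return {i: i if i in exemplars else max(exemplars, key=lambda k: s[i][k])
            for i in range(n)}

def estimate_fitness(pop, fitness, eta=0.5):
    """APGA-style estimation: only cluster centers pay for a true fitness
    evaluation; members are estimated from the center's fitness minus an
    assumed linear distance penalty scaled by the regulation factor eta."""
    assign = affinity_propagation(pop)
    centers = set(assign.values())
    true_f = {k: fitness(pop[k]) for k in centers}   # the only expensive calls
    est = {i: true_f[k] if i == k
           else true_f[k] - eta * math.dist(pop[i], pop[k])
           for i, k in assign.items()}
    return est, centers
```

The number of expensive fitness calls per generation drops from the population size to the number of exemplars, which AP determines automatically rather than requiring a preset k as in KMGA.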
     After studying the pattern (schema) theory of GA, a novel fitness estimation strategy based on pattern discovery is proposed, together with an efficient GA (EGA) built on this strategy and APGA. The strategy originates from the schema theorem and the building block hypothesis: short, low-order, above-average schemas (building blocks) receive exponentially increasing trials in subsequent generations, and under the genetic operators they combine into longer, higher-order, higher-fitness schemas, eventually forming the optimal solution. EGA maintains local best and worst chromosome sets and uses them to discover the best and worst schemas. These schemas are then used to adjust the fitness values of chromosomes: the fitness of chromosomes containing a best schema is increased, and that of chromosomes containing a worst schema is decreased. This correction promotes the reproduction of good chromosomes while accelerating the elimination of bad ones, and it compensates for the fact that fitness values estimated by APGA are always lower than those of the cluster centers. Experiments proved that, for the same number of iterations, EGA came closer to the optimal solution than APGA, and the error curves show much lower fitness estimation errors. These results also support the correctness of schema theory.
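For bit strings, the schema discovery and fitness correction steps can be sketched as follows: positions on which all members of the local best (or worst) set agree form the discovered schema, wildcarded elsewhere, and estimated fitness is nudged up or down for chromosomes matching it. The correction step `delta` is an assumed constant; the thesis's actual correction rule is not reproduced here.

```python
def discover_schema(chromosomes):
    """Schema shared by a set of equal-length bit strings: positions
    where all agree keep their bit, all others become the wildcard '*'."""
    return ["*" if len(set(col)) > 1 else col[0] for col in zip(*chromosomes)]

def matches(ind, schema):
    """True if the individual is an instance of the schema."""
    return all(s == "*" or s == g for g, s in zip(ind, schema))

def correct_fitness(est, pop, best_set, worst_set, delta=0.1):
    """EGA-style correction: raise estimates of individuals matching the
    best schema, lower those matching the worst schema.  delta is an
    illustrative assumption, not the thesis's exact formula."""
    best_schema = discover_schema(best_set)
    worst_schema = discover_schema(worst_set)
    out = dict(est)
    for i, ind in enumerate(pop):
        if matches(ind, best_schema):
            out[i] += delta
        if matches(ind, worst_schema):
            out[i] -= delta
    return out
```

For example, the best set {1101, 1110} yields the schema 11**, so any chromosome starting with 11 gets its estimate raised; this is precisely the building-block bias the schema theorem describes.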
     For some engineering optimization problems the fitness function cannot be defined explicitly: computing the fitness requires calling external software to run a simulation and obtain a value for evaluating the chromosomes, and the simulation is usually time-consuming. For this situation, regression analysis is introduced to help estimate fitness values in the GA, and a support vector regression based GA (SVRGA) is proposed. Unlike earlier work, we did not use the GA to optimize the parameters of SVR or SVM; instead, SVR estimates the fitness of part of the chromosomes in order to reduce the number of fitness evaluations. SVR is a machine learning method that performs well even with a small training set. SVRGA maintains a training set of modest size; once the set has grown to a fixed size, an SVR model is trained in each iteration. The fitness of the majority of chromosomes is then estimated by the SVR model, while the remaining chromosomes are evaluated exactly and added to the training set. Experiments show that, with the maintained training set, the training cost of SVR is quite low and the estimates are accurate, so SVRGA's fitness estimation accuracy is higher than that of KMGA and IGA, and SVRGA is also very fast.
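The surrogate loop can be sketched as follows. Since a dependency-free SVR implementation is long, kernel ridge regression, a closely related kernel regressor, stands in for SVR here; the train-on-exact, predict-for-the-rest structure is the point. The fraction of exact evaluations (`exact_frac`) is an illustrative parameter, not the thesis's setting.

```python
import math
import random

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def fit_krr(X, y, lam=1e-3):
    """Kernel ridge regression as a lightweight stand-in for SVR (the
    thesis trains a true SVR model).  Solves (K + lam*I) alpha = y by
    Gauss-Jordan elimination; fine for the small training sets SVRGA keeps."""
    n = len(X)
    Xs = [list(p) for p in X]        # snapshot: later appends to X don't leak in
    A = [[rbf(Xs[i], Xs[j]) + (lam if i == j else 0.0) for j in range(n)] + [y[i]]
         for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda row: abs(A[row][c]))   # partial pivoting
        A[c], A[p] = A[p], A[c]
        for row in range(n):
            if row != c and A[row][c] != 0.0:
                f = A[row][c] / A[c][c]
                A[row] = [u - f * v for u, v in zip(A[row], A[c])]
    alpha = [A[i][n] / A[i][i] for i in range(n)]
    return lambda x: sum(a * rbf(x, xi) for a, xi in zip(alpha, Xs))

def surrogate_step(pop, fitness, X, y, exact_frac=0.2, rng=random):
    """One SVRGA-style iteration: a small fraction of chromosomes is
    evaluated exactly and appended to the training set (X, y); the rest
    receive cheap surrogate estimates."""
    model = fit_krr(X, y)
    n_exact = max(1, int(exact_frac * len(pop)))
    exact = set(rng.sample(range(len(pop)), n_exact))
    est = []
    for i, ind in enumerate(pop):
        if i in exact:
            f = fitness(ind)          # the expensive simulator call
            X.append(ind)
            y.append(f)
            est.append(f)
        else:
            est.append(model(ind))    # cheap prediction
    return est
```

Because the surrogate needs only chromosome vectors and observed fitness values, this pattern fits exactly the case the thesis targets, where fitness comes from a black-box simulator rather than a closed-form function.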
     This thesis also extends the above fitness estimation strategies to PSO, proposing a PSO based on affinity propagation clustering (APSO), an efficient PSO (EPSO), and a PSO based on SVR (SVRPSO). The idea of EPSO is borrowed from the schema theory of GA; although PSO has no comparable convergence theory, experiments proved EPSO's performance superior to APSO's. The experiments also show that these PSO variants avoid most fitness calculations and thus run faster, and that in terms of optimization results EPSO is similar to SVRPSO and better than APSO. Analysis of the fitness estimation error curves shows that SVRPSO performs well on single-peak, smooth functions, while EPSO does well on multi-peak functions. The successful application of the fitness estimation strategies to PSO offers guidance for designing other efficient swarm intelligence algorithms.
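A minimal PSO with a pluggable fitness-estimation hook illustrates where APSO, EPSO, and SVRPSO differ from classic PSO: only the step that produces the per-particle fitness values changes. The `estimator` interface below is an assumption for illustration, not the thesis's API.

```python
import random

def pso(fitness, dim=2, n_particles=10, iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0, estimator=None):
    """Minimal PSO (standard inertia-weight update), minimizing `fitness`.
    If `estimator` is given it is called as estimator(positions, fitness)
    and returns per-particle fitness values; this is the hook where the
    clustering- or SVR-based estimation strategies would plug in."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    gbest_f = min(pbest_f)
    gbest = pbest[pbest_f.index(gbest_f)][:]
    for _ in range(iters):
        # Classic PSO evaluates everyone; APSO/EPSO/SVRPSO estimate most.
        f = estimator(pos, fitness) if estimator else [fitness(p) for p in pos]
        for i in range(n_particles):
            if f[i] < pbest_f[i]:
                pbest_f[i], pbest[i] = f[i], pos[i][:]
                if f[i] < gbest_f:
                    gbest_f, gbest = f[i], pos[i][:]
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
    return gbest_f

def sphere(p):
    """Single-peak benchmark; global minimum 0 at the origin."""
    return sum(x * x for x in p)
```

Because the velocity and position updates never change, any of the three estimation strategies can be dropped into the marked line; this is what makes the extension from GA to PSO straightforward.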
     Finally, to verify the effect of the proposed algorithms in real production and research settings, we applied EPSO, EGA, and SVRGA to practical problems in three different areas: history matching of reservoir data, dome structure optimization under static stress, and prediction of supercoiling sites on the E. coli chromosome. For history matching of reservoir data, AP clustering effectively grouped similar chromosomes representing reservoir geological parameters, and the clustering information together with the reservoir data obtained from numerical simulation was used to estimate the fitness of a large number of chromosomes; EPSO thereby reduced the number of calls to the reservoir numerical simulator and accelerated history matching. For dome structure optimization under static stress, we fully considered the characteristics of the dome structure and the bar material and designed a continuous fitness function; EGA then reduced the number of finite element static analyses and significantly increased the efficiency of the optimization. For the prediction of E. coli chromosome supercoiling sites, we designed a fitness function according to the characteristics of supercoiling sites described in the existing literature; because the long encodings in supercoiling site prediction match the dimension-independence of SVR, the problem is well suited to SVRGA, and the results show that SVRGA solves it faster than CGA. These three typical practical problems verified the effectiveness of the algorithms proposed in this thesis and proved their practical potential.
     In summary, we have studied in depth how to estimate fitness in genetic algorithms and particle swarm optimization, and have proposed a variety of fitness estimation strategies. The experimental results show that these algorithms keep the optimization accuracy close to the optimal solution while significantly reducing the number of fitness calculations. The successful application of these algorithms in three different areas verified their efficiency and stability.
引文
[1] Wikipedia. Mathematical optimization.[EB/OL].[2012-03-06]. http://en.wikipedia.org/wiki/Mathematical_optimization#Optimization_problems.
    [2] CNET科技资讯网.国内首份管理信息化优化算法应用市场研究报告出炉.[EB/OL].
    [2011-08-15]. http://www.cnetnews.com.cn/2011/0815/2052674.shtml.
    [3] CNET科技资讯网.2010-2011年信息化优化算法应用市场研究报告发布.[EB/OL].
    [2011-08-16]. http://www.cnetnews.com.cn/2011/0816/2052989.shtml.
    [4]袁亚湘,孙文瑜.最优化理论与方法[M].北京:科学技术出版社,2001.
    [5]王子若,陈永昌.优化计算方法[M].北京:机械工业出版社,1989.
    [6]陈宝林.最优化理论与算法[M].北京:清华大学出版社,2005.
    [7] Fletcher R, Reeves CM. Function minimization by conjugate gradients [J]. The ComputerJournal.1964,7(2):149-154.
    [8] Magnus RH, Eduard S. Methods of conjugate gradients for solving linear systems [J]. J.Research Nat. Bur. Standards.1952,49,409-436.
    [9] Davidon WC. Variable Metric Method for Minimization [J]. A.E.C. Res. and Devel.1959,Rep. ANL-5990(Rev.).
    [10] Davidon WC, Variance algorithm for minimization [J]. Comput. J.1967/68,10(36):406-410.
    [11] Fletcher R, Powell MJD. A rapidly convergent descent method for minimization [J]. Comput.J.1963/64,6(27):163-168.
    [12] Powell MJD. A new algorithm for unconstrained optimization [C]. In: Rosen,J. B.,Mangasarian O. L., Ritter, K., eds. Nonlinear Programming. New York: Academic Press, pp.31-66,1970.
    [13] Powell MJD. Convergence properties of a class of minimization algorithms [C]. In:Mangasarian, O. L., Meyer, R., Robinson, S., eds. Nonlinear Programming2. New York:Academic Press, pp.1-27,1975.
    [14] Powell MJD, Yuan Y. A trust region algorithm for equality constrained optimization [J]. Math.Programming.1991,49:189-211.
    [15]马昌凤.最优化方法及其Matlab程序设计[M].北京:科学出版社,2010.
    [16] Hackwood S, Beni G. Self-organization of sensors for Swarm Intelligence [C]. IN: IEEEInternational conference on Robotics and Automation. Piscataway, NJ: IEEE Press,1992:819-829.
    [17] Holland J. Adaptation in Natural and Artificial Systems [M]. New York: MIT Press,1992.
    [18]梁艳春.群智能优化算法理论与应用[M].北京:科学出版社,2009.
    [19]周春光,梁艳春.计算智能优化算法理论与应用[M].长春:吉林大学出版社,2009.
    [20] Kennedy J., Eberhart R. Particle Swarm Optimization [C]. Proceedings of IEEE InternationalConference on Neural Networks. IV. pp.1942-1948,1995.
    [21] Shi Y, Eberhart RC. A modified particle swarm optimizer [C]. Proceedings of IEEEInternational Conference on Evolutionary Computation. pp.69-73,1998.
    [22] Kennedy J. The particle swarm: social adaptation of knowledge [C]. Proceedings of IEEEInternational Conference on Evolutionary Computation. pp.303-308,1997.
    [23] Dorigo M. Optimization, Learning and Natural Algorithms [D]. Politecnico di Milano, Italie,1992.
    [24] Dorigo M, Gambardella LM. Ant Colony System: A Cooperative Learning Approach to theTraveling Salesman Problem [J]. IEEE Transactions on Evolutionary Computation.1997,1(1):53-66.
    [25] Jerne NK. Towards a Network Theory of the Immune System [J]. Annual Immunology.1974,125C:373-389.
    [26] Farmer JD, Kauffman SA, Packard NH, Perelson AS. Adaptive dynamic networks as modelsfor the immune system and autocatalytic sets [J]. Perspectives in Biological Dynamics andTheoretical Medicine, Ann. NY Acad. Sci.1987,504:118-131.
    [27] Bersini H, Varela FJ. Hints for adaptive problem solving gleaned from immune networks [C].Parallel Problem Solving from Nature, First Workshop PPSW1, Dortmund, FRG, October,1990.
    [28] Jerne NK. The immune system [J]. Scientific American.1973,229(l):52-60.
    [29] Ishiguro A, Watanabe Y, Kondo T, Shirai Y, Uchikawa H. Emergent construction of artificialimmune networks for autonomous mobile robots [C]. IEEE International Conference onSystem, Man and Cybernetics, Orlando, USA, IEEE press,1997,1222-1228.
    [30] Forrest S, Perelson AS, Allen L, Cherukuri R. Self-nonself discrimination in a computer [C].IEEE Symposium on Research in Security and Privacy, Oakland, CA, IEEE ComputerSociety Press,1994,202-212.
    [31] Forrest S, Hofmeyr S A, Somayaji A, Longstaff TA. A sense of self for unix processes [C].IEEE Symposium on Research in Security and Privacy, Los Alamitos, CA, IEEE ComputerSociety Press,1996,120-128.
    [32] De Castro LN, Von Zuben FJ. The clonal selection algorithm with engineering applications
    [C]. Proceedings of Genetic and Evolutionary Computation Conference, Las Vegas, USA,2000,36-37.
    [33] De Castro LN, Von Zuben FJ. Learning and optimization using the clonal selection principle[J].IEEE Transactions on Evolutionary Computation, Special Issue on Artificial ImmuneSystems.2002,6(3):239-251.
    [34] Chun JS, Lim JP, Jung HK, Yoon JS. Optimal design of synchronous motor with parametercorrection using immune algorithm [J]. IEEE Transactions on Energy Conversion.1999,14(3):610-615.
    [35] Chun JS, Kim MK, Jung HK, Hong SK. Shape optimization of electromagnetic devices usingImmune Algorithm [J]. IEEE Transactions on Magnetics.1997,33(2):1876-1879.
    [36]孙吉贵,刘杰,赵连宇.聚类算法研究[J].软件学报.2008,19(1):48-61.
    [37] Jain AK, Murty MN. Data clustering: a review [J]. ACM Computing Surveys.1999,31(3):264-323.
    [38] Alpaydin E著.范明,昝红英等译.机器学习导论[M].北京:机械工业出版社,2009.
    [39] Jain AK, Flynn PJ. Image segmentation using clustering [C]. In: Ahuja N, Bowyer K, eds.Advances in Image Understanding: A Festchrift for Azriel Rosenfeld. Piscataway: IEEE Press,1996.65-83.
    [40] Cades I, Smyth P, Mannila H. Probabilistic modeling of transactional data with applicationsto profiling, visualization and prediction, sigmod [C]. In: Proc. of the7th AC M SIGKDD.San Francisco: ACM Press,2001,37-46.
    [41] Jain AK, Dubes RC. Algorithms for Clustering Data [J]. Prentice-Hall Advanced ReferenceSeries.1988.1-334.
    [42] Jain AK, Duin RPW, Mao JC. Statistical pattern recognition: A review [J]. IEEE Trans. onPattern Analysis and Machine Intelligence.2000,22(1):4-37.
    [43] Sambasivam S, Theodosopoulos N. Advanced data clustering methods of mining Webdocuments [J]. Issues in Informing Science and Information Technology.2006,3:563-579.
    [44] Mahalanobis PC.On the generalised distance in statistics [J]. Proceedings of the NationalInstitute of Sciences of India.1936,2(1):49-55.
    [45] Krause EF. Taxicab Geometry: An Adventure in Non-Euclidean Geometry [M].New York:Dover,1986.
    [46] Richard H. Error-detecting and error-correcting codes [J].Bell System Technical Journal.1950,29(2):147-160.
    [47] Marques JP, Written, Wu YF, Trans. Pattern Recognition Concepts, Methods and Applications[C].2nd ed., Beijing: Tsinghua University Press,2002.5174(in Chinese).
    [48] Fred ALN, Leit o JMN. Partitional vs hierarchical clustering using a minimum grammarcomplexity approach [C]. In: Proc. of the SSPR&SPR2000. LNCS1876,2000.193202.
    [49] Gelbard R, Goldman O, Spiegler I. Investigating diversity of clustering methods: Anempirical comparison [J]. Data&Knowledge Engineering.2007,63(1):155-166.
    [50] Kumar P, Krishna PR, Bapi RS, De SK. Rough clustering of sequential data [J]. Data&Knowledge Engineering.2007,3(2):183-199.
    [51] Guha S, Rajeev R, Kyuseok S. CURE: An efficient clustering algorithm for large databases[C]. SIGMOD Record (ACM Special Interest Group on Management of Data). Croatian SocChem Eng, Zagreb, Croatia,1998,73-84.
    [52] Guha S, Rajeev R, Kyuseok S. ROCK: a robust clustering algorithm for categorical attributes[C]. Proceedings-International Conference on Data Engineering,1999,512-521.
    [53] Guha S, Rajeev R, Kyuseok S. Rock: a robust clustering algorithm for categorical attributes[J]. Information Systems.2000,25(5):345-366.
    [54] Zhang T, Ramakrishnan R, Livny M. BIRCH: An efficient data clustering method for verylarge databases [C]. SIGMOD Record (ACM Special Interest Group on Management ofData). ACM,1996,103-114.
    [55] Karypis G, Han EH, Kumar V. Chameleon: Hierarchical clustering using dynamic modeling[J]. Computer.1999,32(8):68-75.
    [56] MacQueen J. B. Some Methods for classification and Analysis of Multivariate Observations[C]. Proceedings of5-th Berkeley Symposium on Mathematical Statistics and Probability,Berkeley, University of California Press,1:281-297,1967.
    [57] Hartigan J. A. Clustering Algorithms [M]. New York: Wiley Press,1975.
    [58] Hartigan J.A, Wong M. A. A K-Means Clustering Algorithm, Applied Statistics.1979,28(1):100-108.
    [59] Huang Z. Extensions to the k-means algorithm for clustering large data sets with categoricalvalues [J]. Data Mining and Knowledge, Discovery II.1998,(2):283-304.
    [60] Huang Z, Ng MA. Fuzzy k-modes algorithm for clustering ca tegorical data [J]. IEEE Trans.Fuzzy Systems.1999,7(4):446-452.
    [61] Chaturvedi AD, Green PE, Carroll JD. K-modes clustering [J]. Journal of Classification.2001,18(1):35-56.
    [62] Lyer NS, Kandel A, Schneider M. Feature-Based fuzzy classification for interpretation ofmammograms [J]. Fuzzy Sets System.2000,114(2):271-280.
    [63] Yang MS, Hu YJ, Lin KCR, Lin CCL. Segmenttation techniques for tissue differentiation inMRI of ophthalmology using fuzzy clustering algorithm [J]. Journal of Magnetic ResonanceImaging.2002,(20):173-179.
    [64] Brendan JF, Delbert D. Clustering by passing messages between data points [J]. Science.2007,315(5814):972-976.
    [65] Ester M, Kriegel HP, Sander J. A density-based algorithm for discovering clusters in largespatial databases [C]. Knowledge Discovery and Data Mining (KDD'96). AAAI Press,1996,226-231.
    [66] Sander J, Este Mr, Kriegel HP, Xu XW. Density-based clustering in spatial databases: Thealgorithm gdbscan and its applications [J]. Data mining and knowledge discovery.1998,2(2):169-194.
    [67] Ankerst M, Markus MB, Hans-Peter K, Jorg S. OPTICS: ordering points to identify theclustering structure [C]. Proceedings of the ACM SIGMOD International Conference onManagement of Data. ACM Press,1999,49-60.
    [68] Hinneburg A, Keim DA. An efficient approach to clustering in large multimedia databaseswith noise [C]. Proceedings of the4th International Conference on Knowledge Discoveryand Data Mining KDD'98. AAAI Press,1998,58-65.
    [69] Agrawal R, Gehrke JE, Gunopulos D, Prabhakar R. Automatic subspace clustering of highdimensional data for data mining applications [C]. Proceedings of the ACM-SIGMOD’98Int.Conf. Management of Data. ACM Press,1998,94-105.
    [70] Wang W, Yang J, Muntz R. STING: A statistical information grid approach to spatial datamining [C]. Proc.23rd Int. Conf. on Very Large Data Bases. IEEE Press,1997,186-195.
    [71] Zhao YC, Song J. GDILC: A grid-based density isoline clustering algorithm [C]. In: ZhongYX, Cui S, Yang Y, eds. Proc. of the Internet Conf. on Info-Net. Beijing: IEEE Press,2001.140145.
    [72] Ma WM, Chow E, Tommy WS. A new shifting grid clustering algorithm [J]. PatternRecognition.2004,37(3):503514.
    [73] Pilevar AH, Sukumar M. GCHL: A grid-clustering algorithm for high-dimensional very largespatial data bases [J]. Pattern Recognition Letters.2005,26(7):9991010.
    [74] Liu RJ, Wang YH, Baba T, et al. SVM-based active feedback in image retrieval usingclustering and unlabeled data [J]. Pattern Recognition.2008,41(8):2645-2655.
    [75] Winters-Hilt S., Merat S. SVM clustering [J]. BMC Bioinformatics.2007,8(Suppl7): S18.
    [76] Ma JM, Minh NN, Jagath C. Rajapakse. Gene Classification Using Codon Usage andSupport Vector Machines [J]. Computational Biology and Bioinformatics, IEEE/ACMTransactions.2009,6(1):134-143.
    [77] Bicego M, Mario ATF. Soft clustering using weighted one-class support vector machines [J].Pattern Recognition.2009,42(1):27-32.
    [78] Francesco C, Verri A. A novel Kernel method for clustering [J]. IEEE Transactions on PatternAnalysis and Machine Intelligence.2005,27(5):801-805.
    [79] Filippone M, Francesco C, Francesco M, Stefano R. A survey of kernel and spectral methodsfor clustering [J]. Pattern Recognition.2008,41(1):176-190.
    [80] Tushir M, Smriti S. A new Kernelized hybrid c-mean clustering model with optimizedparameters [J]. Applied Soft Computing.2010,10(2):381-389.
    [81] Liang L, Lin TS, Li B. MRI brain image segmentation and bias field correction based on fastspatially constrained kernel clustering approach [J]. Pattern Recognition Letters,2008,29(10):1580-1588.
    [82] Das Swagatam, Ajith Abraham, Amit Konar. Automatic kernel clustering with a Multi-ElitistParticle Swarm Optimization Algorithm [J]. Pattern Recognition Letters.2008,29(5):688-699.
    [83] Amadou BH, Stéphane L, Salah M. SAKM: Self-adaptive kernel machine A kernel-basedalgorithm for online clustering [J]. Neural Networks.2008,21(9):1287-1301.
    [84] Kim DW, Lee KY, Lee D, Lee KH. A kernel-based subtractive clustering method [J]. PatternRecognition Letters,2005,26(7):879-891
    [85] Lai Jim ZC., Liaw YC, Liu J. Fast k-nearest-neighbor search based on projection andtriangular inequality [J]. Pattern Recognition.2007,40(2):351-359.
    [86] Luxburg Ulrike. A tutorial on spectral clustering [J]. Statistics and Computing.2007,17(4):395-416.
    [87] Dhillon IS., Guan YQ, Kulis B. Weighted graph cuts without eigenvectors a multilevelapproach [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence.2007,29(11):1944-1957.
    [88] Luxburg U Von, Bousquet O, Belkin M. Limits of spectral clustering [C]. Advances in neuralinformation processing systems17. MIT Press,2005,1020-1027.
    [89] Zelnik-Manor L, Perona P. Self-tuning spectral clustering [C]. Advances in NeuralInformation Processing Systems16. MIT Press,2004,1601-1608.
    [90] Yu SX, Shi JB. Multiclass spectral clustering [C]. Proceedings of the IEEE InternationalConference on Computer Vision. IEEE Press,2003,313-319.
    [91] Ng A, Jordan M, Weiss Y. On Spectral Clustering: Analysis and an algorithm. Advances inneural information processing systems [M]. MIT Press,2002,849-856.
    [92] Ahmed A, Xing EP. Recovering time-varying networks of dependencies in social andbiological studies. Proc Natl Acad Sci USA.2009,106(29):11878-11883.
    [93] Fortunato S, Barthelemy M. Resolution limit in community detection [J]. Proc Natl Acad SciUSA.2007,104(1):36-41.
    [94] Girvan M, Newman ME. Community structure in social and biological networks [J]. ProcNatl Acad Sci USA.2002,99(12):7821-7826.
    [95] Ostling A, John H, Jessica G. Self-Similarity and Clustering in the Spatial Distribution ofSpecies [J]. Science.2000,290(5492):671a.
    [96] Jeong H, Tombor B, Albert R, Oltvai ZN, Barabasi AL. The large-scale organization ofmetabolic networks [J].Nature.2000,407(6804):651-654.
    [97] Maslov S, Sneppen K. Specificity and stability in topology of protein networks [J]. Science.2002,296(5569):910-913.
    [98] Mitchell TM. Machine Learning [J]. New York: McGraw-Hill,1997.
    [99] Vapnik V. The Nature of statistical learning theory. Springer, NY,1995.张学工译,统计学习理论的本质[M].北京:清华大学出版社,2000.
    [100]陈希儒,王松桂.近代回归分析[M].合肥:安徽教育出版社,1987.
    [101] Holland JH. Outline for a logical theory of adaptive systems [J]. Journal of the Associationfor Computing Machinery.1962,9(3):297-31.
    [102] Goldberg DE. Genetic Algorithms: in Search Optimization&Machine Learning [M]. MA:Addison Wesley.1989.
    [103] Davis L. Handbook of Genetic Algorithms [M]. New York: Van Nostrand Reinhold.1991.
    [104] Koza JR. Genetic Programming, on the Programming of Computers by Means of NaturalSelection [M]. MIT Press.1992.
    [105] DeJong KA, Spears WM, Gordon DF. Using genetic algorithms for concept learning [J].Machine Learning.1993,13(2/3):161-188.
    [106] Janikow CZ. A knowledge-intensive genetic algorithm for supervised learning [J]. MachineLearning.1993,13(2/3):189-228.
    [107] Fogarty TC. Varying the probability of mutation in the genetic algorithm [C]. Proceedingsof the Third International Conference on Genetic Algorithms, Morgan Kaufmann Publishers,San Mateo, CA,1989,104-109.
    [108] Blum C. Beam-ACO--hybridizing ant colony optimization with beam search: anapplication to open shop scheduling [J]. Computers&Operations Research.2005,32(6):1565-1591.
    [109] Bell JE., McMullen PR. Ant colony optimization techniques for the vehicle routingproblem. Advanced Engineering Informatics.2004,18(1):41-48.
    [110] Dorigo M, Gambardella LM. Ant colonies for the travelling salesman problem [J].Biosystems.1997,43(2):73-81.
    [111]陈国良,王煦法,庄镇泉,王东生.遗传算法及其应用[M].人民邮电出版社,北京,1996.
    [112]刘勇,康立山,陈毓屏.非数值并行算法(第二册)—遗传算法(第二版)[M].科学出版社,北京,1997.
    [113]焦李成,保铮.进化计算与遗传算法—计算智能的新方向[J].系统工程与电子技术.1995,(6):20-32.
    [114]王强,邵惠鹤.遗传算法在甲醛生产过程优化中的应用[J].控制理论与应用.1996,13(4):477-481.
    [115]王黎,马光文.桁架结构优化设计的遗传算法[J].成都科技大学学报.1996,5:25-30.
    [116] Bhanu B, Ming J. Adaptive Image Segmentation Using a Genetic algorithm [J]. IEEE Trans.On Systems, Man and Cybernetic.1995,25(12):1543-1567.
    [117] Hirafuji M, Hagan S. A global optimization algorithm based on the process of evolution incomplex biological systems [J]. Computers and Electronics in Agriculture.2000,29(1-2):125-134.
    [118] Paul T.K., Iba H., Gene selection for classification of cancers using probabilistic modelbuilding genetic algorithm [J]. Biosystems.2005,82(3):208-225.
    [119] Terfloth L., Gasteiger J., Neural networks and genetic algorithms in drug design [J]. DrugDiscovery Today.2001,6(Supplement2):102-108.
    [120] Wiese K.C., Glen E., A permutation-based genetic algorithm for the RNA folding problem:a critical look at selection strategies, crossover operators, and representation issues [J].Biosystems.2003,72(1-2):29-41.
    [121] John H.M., Carter T.B., David R. Communication and cooperation [J]. Journal of EconomicBehavior&Organization.2002,47(2):179-195.
    [122] Reynold CW. Flock, herds and schools: a distributed behavioral model [J]. ComputerGraphics.1987,21(4):25-34.
    [123] Hepper F, Grenander U. A stochastic nonlinear model for coordinated bird flocks [M]. In SKrasner Ed. The Ubiquity of Chaos, AAAS Publications, Washington DC,1990.
    [124] Hirotaka Y, Kenichi K, Yoshikazu F, Yosuke N. A particle Swarm Optimization for ReactivePower and Voltage Control Considering Voltage Stability [C]. IEEE International Conferenceon Intelligent System Applications to Power Systems, Rio de Janeiro,1999
    [125] Voss MS, Feng X. Arma Model Selection Using Particle Swarm Optimization and AicCriteria [C].15th Triennial World Congress, Barcelona Spain,2002
    [126] Parsopoulos KE, Vrahatis MN. Particle Swarm Optimization Method in MultiobjectiveProblems [C]. Proceedings of the2002ACM Symposium on Applied Computing,2002.603-607
    [127] Van den Bergh F, Engelbrecht AP. Cooperative Learning in Neural Networks using ParticleSwarm Optimizers [J]. South Africam Computer Journal.2000,26,84-90.
    [128] Arabshahi P, Gray A, Kassabalidis I, Das A. Adaptive routing in wireless communicationnetworks using swarm intelligence [C]. AIAA19th Annual Satellite Communications SystemConference Toulouse, France.2001.
    [129] Clerc M., Kennedy J. The particle swarm-explosion, stability and convergence in amultidimensional complex space [J]. IEEE Trans. on Evolutionary Computation,2002,6(1):58-73.
    [130] Van den Bergh F, Engelbrecht AP. A cooperative approach to particle swarm optimisation[J]. IEEE Transactions on Evolutionary Computation,2004,3(6):225-239.
    [131] Shi YH, Eberhart RC. Empirical study of particle swarm optimization [C]. Proceedings ofthe IEEE Congress on Evolutionary Computation,1999,1945-1950
    [132] Kennedy J, Mendes R. Neighborhood topologies in fully informed and best-of-neighborhood particle swarms [J]. IEEE Trans. on Systems, Man and Cybernetics-Part C:Applications and Reviews.2006,36(4):515-519.
    [133] Ding C, He X. K-nearest-neighbor consistency in data clustering: Incorporating local information into global optimization [C]. In: Proc. of the ACM Symposium on Applied Computing. Nicosia: ACM Press, 2004, 584-589.
    [134] Wu XD, Kumar V, Quinlan JR, et al. Top 10 algorithms in data mining [J]. Knowledge and Information Systems. 2008, 14(1):1-37.
    [135] Maña-López MJ, De Buenaga M, Gómez-Hidalgo JM. Multidocument summarization: An added value to clustering in interactive retrieval [J]. ACM Transactions on Information Systems. 2004, 22(2):215-241.
    [136] Jing LP, Ng MK, Huang JZ. An entropy weighting k-means algorithm for subspace clustering of high-dimensional sparse data [J]. IEEE Transactions on Knowledge and Data Engineering. 2007, 19(8):1026-1041.
    [137] Huang S, Chen Z, Yu Y, Ma WY. Multi-type features co-selection for web document clustering [J]. IEEE Transactions on Knowledge and Data Engineering. 2006, 18(4):448-458.
    [138] Kanungo T, Mount DM, Netanyahu NS, Piatko CD, Silverman R, Wu AY. An efficient k-means clustering algorithm: Analysis and implementation [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2002, 24(7):881-892.
    [139] Brusco MJ, Köhn HF. Comment on "Clustering by Passing Messages Between Data Points" [J]. Science. 2008, 319(5864):726c.
    [140] Frey BJ, Dueck D. Response to Comment on "Clustering by Passing Messages Between Data Points" [J]. Science. 2008, 319(5864):726d.
    [141] Frey BJ, Dueck D. Non-metric affinity propagation for unsupervised image categorization [C]. In: Proceedings of IEEE 11th International Conference on Computer Vision, Rio de Janeiro, Brazil, 2007.
    [142] Guan RC, Shi XH, Marchese M, Yang C, Liang YC. Text Clustering with Seeds Affinity Propagation [J]. IEEE Transactions on Knowledge and Data Engineering. 2011, 23(4):627-637.
    [143] Vapnik VN. Statistical Learning Theory [M]. New York: Wiley, 1998.
    [144] Vapnik V, Chervonenkis A. On the uniform convergence of relative frequencies of events to their probabilities [J]. Doklady Akademii Nauk USSR. 1968, 181(4).
    [145] Vapnik V, Chervonenkis A. On the uniform convergence of relative frequencies of events to their probabilities [J]. Theory of Probability and its Applications. 1971, 16(2):264-280.
    [146] Vapnik V. Estimation of Dependences Based on Empirical Data [M]. New York: Springer-Verlag, 1982.
    [147] Cortes C, Vapnik V. Support-Vector Networks [J]. Machine Learning. 1995, 20(3):273-297.
    [148] Pontil M, Rifkin R, Evgeniou T. From regression to classification in support vector machines [C]. In: Verleysen M (Ed.) Proceedings of ESANN, Brussels: D Facto, 1999, 225-230.
    [149] Smola AJ, Schölkopf B. On a kernel-based method for pattern recognition, regression, approximation and operator inversion [J]. Algorithmica. 1998, 22:211-231.
    [150] Smola AJ, Schölkopf B, Müller KR. The connection between regularization operators and support vector kernels [J]. Neural Networks. 1998, 11(4):637-649.
    [151] Schölkopf B, Smola A, Williamson R, Bartlett PL. New support vector algorithms [J]. Neural Computation. 2000, 12:1207-1245.
    [152] Smola AJ, Murata N, Schölkopf B, Müller KR. Asymptotically Optimal Choice of ε-Loss for Support Vector Machines [C]. Proceedings of the 8th International Conference on Artificial Neural Networks, Perspectives in Neural Computing, 1998, 105-110.
    [153] Suykens JA, Gestel TV, Brabanter JD, Moor BD, Vandewalle J. Least Squares Support Vector Machines [M]. New Jersey: World Scientific, 2002.
    [154] Joachims T. Text Categorization with Support Vector Machines: Learning with Many Relevant Features [C]. Proceedings of ECML-98, 10th European Conference on Machine Learning, 1998.
    [155] Guo GD, Li SZ, Chan K. Face Recognition by Support Vector Machines [C]. Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France, 2000, 26-30.
    [156] Hou FL, Wang BX. Research on speaker identification based on support vector machines [J]. Journal on Communications. 2002, 23(6):61-67. (in Chinese)
    [157] Hu SS, Wang Y. Fault diagnosis of nonlinear systems based on support vector machines [J]. Control and Decision. 2001, 16(5):617-620. (in Chinese)
    [158] Müller KR, Smola AJ, Rätsch G, Schölkopf B, Kohlmorgen J, Vapnik V. Predicting Time Series with Support Vector Machines [C]. Proceedings of the 7th International Conference on Artificial Neural Networks (ICANN'97), 1997, 999-1005.
    [159] Jin Y. A comprehensive survey of fitness approximation in evolutionary computation [J]. Soft Computing. 2005, 9(1):3-12.
    [160] Branke J, Schmidt C. Faster convergence by means of fitness estimation [J]. Soft Computing. 2005, 9(1):13-20.
    [161] Grefenstette JJ, Fitzpatrick JM. Genetic search with approximate function evaluations [C]. In: Proceedings of an International Conference on Genetic Algorithms and Their Applications, 1985, 112-120.
    [162] Smith RE, Dike BA, Stegmann SA. Fitness inheritance in genetic algorithms [C]. In: Proceedings of the ACM Symposium on Applied Computing, 1995, 345-350.
    [163] Sastry K, Goldberg DE, Pelikan M. Don't evaluate, inherit [C]. In: Spector L, Goodman ED, Wu A, Langdon WB, Voigt HM (Eds.) Genetic and Evolutionary Computation Conference. Morgan Kaufmann, 2001, 551-558.
    [164] Salami M, Hendtlass T. A Fitness Estimation Strategy for Genetic Algorithms [C]. In: Hendtlass T, Ali M (Eds.) Proceedings of the 15th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems (IEA/AIE 2002), LNAI 2358, 2002, 502-513.
    [165] Kim HS, Cho SB. An efficient genetic algorithm with less fitness evaluation by clustering [C]. In: Proceedings of IEEE Congress on Evolutionary Computation, 2001, 887-894.
    [166] Vanderplaats GN. Numerical Optimization Techniques for Engineering Design: with Applications [M]. New York: McGraw-Hill, 1984.
    [167] Ducheyne E, De Baets B, De Wulf R. Is Fitness Inheritance Useful for Real-World Applications? [C]. In: Proceedings of EMO 2003, 2003, 31-42.
    [168] Wu CG, Gao H, Yu LJ, Liang YC, Xiang RW. Genetic Algorithm with Affinity Propagation [C]. Proceedings of the 2010 IEEE International Conference on Information and Automation, June 20-23, Harbin, China, 2010, 159-162.
    [169] De Jong K. An analysis of the behaviour of a class of genetic adaptive systems [D]. University of Michigan, 1975.
    [170] Yao X, Liu Y, Lin G. Evolutionary Programming Made Faster [J]. IEEE Transactions on Evolutionary Computation. 1999, 3(2):82-102.
    [171] Chellapilla K. Combining mutation operators in evolutionary programming [J]. IEEE Transactions on Evolutionary Computation. 1998, 2(3):91-96.
    [172] Fogel DB. System Identification Through Simulated Evolution: A Machine Learning Approach to Modeling [M]. Boston: Ginn Press, 1991.
    [173] He S, Wu QH, Wen JY, Saunders JR, Paton RC. A particle swarm optimizer with passive congregation [J]. BioSystems. 2004, 78:135-147.
    [174] Liang JJ, Qin AK, Suganthan PN, Baskar S. Comprehensive Learning Particle Swarm Optimizer for Global Optimization of Multimodal Functions [J]. IEEE Transactions on Evolutionary Computation. 2006, 10(3):281-295.
    [175] Kang F, Li JJ, Ma ZY. Rosenbrock artificial bee colony algorithm for accurate global optimization of numerical functions [J]. Information Sciences. 2011, 181:3508-3531.
    [176] Meffert K. Java Genetic Algorithms and Genetic Programming (JGAP) [CP/OL]. [2012-01-11]. http://jgap.sourceforge.net/
    [177] Chang CC, Lin CJ. LIBSVM: a library for support vector machines [J]. ACM Transactions on Intelligent Systems and Technology. 2011, 2(27):1-27. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
    [178] Hendtlass T. Fitness estimation and the particle swarm optimisation algorithm [C]. IEEE Congress on Evolutionary Computation (CEC 2007), 25-28 Sept. 2007, 4266-4272.
    [179] Cui ZH, Zeng JC, Sun GJ. A fast particle swarm optimization [J]. International Journal of Innovative Computing, Information and Control. 2006, 2:1365-1380.
    [180] Cai X, Zeng J, Tan Y. Forecasted Particle Swarm Optimization [C]. Third International Conference on Natural Computation (ICNC 2007), 2007, vol. 4, 713-717.
    [181] Yang T, Shao LS. Particle swarm optimization algorithm with improved K-means clustering analysis strategy [J]. Computer Engineering and Applications. 2009, 45(12):52-54. (in Chinese)
    [182] Gao Y, Xie SL, Xu RN, et al. Multi-subswarm particle swarm optimization algorithm based on clustering [J]. Application Research of Computers. 2006, (4):40-41. (in Chinese)
    [183] Liu YM, Sui CL, Zhao QZ. Dynamic multi-swarm particle swarm optimization algorithm based on K-means clustering and its application [J]. Control and Decision. 2011, 26(7):1019-1025. (in Chinese)
    [184] Yeh WC. An efficient memetic algorithm for the multi-stage supply chain network problem [J]. The International Journal of Advanced Manufacturing Technology. 2006, 29:803-813.
    [185] Yan X, Zhang K, Yao J, Li Y. Research status and prospects of automatic history matching methods for reservoirs [J]. Petroleum Geology and Recovery Efficiency. 2010, 17(4):69-75. (in Chinese)
    [186] Tan TB, Kalogerakis N. A Fully Implicit, Three-Dimensional, Three-Phase Simulator with Automatic History-Matching Capability [R]. SPE 21205, 1991.
    [187] Kalogerakis N, Tomas C. Reliability of Horizontal Well Performance on a Field Scale through Automatic History Matching [J]. Journal of Canadian Petroleum Technology. 1995, 33(2):100-105.
    [188] Zhang F, Reynolds AC. Optimization Algorithms for Automatic History Matching of Production Data [C]. Proceedings of the European Conference on the Mathematics of Oil Recovery, 2002.
    [189] Gao G, Reynolds AC. An Improved Implementation of the LBFGS Algorithm for Automatic History Matching [J]. SPE Journal. 2006, 11(1):5-17.
    [190] Tavakoli R, Reynolds AC. History Matching with Parameterization Based on the SVD of a Dimensionless Sensitivity Matrix [J]. SPE Journal. 2010, 15(2):495-508.
    [191] Tokuda N, Takahashi S, Watanabe M, Kurose T. Application of genetic algorithm to history matching for core flooding [R]. SPE 88621, 2004.
    [192] Cullick AS, Johnson W, Shi G. Improved and more rapid history matching with a nonlinear proxy and global optimization [R]. SPE 101933, 2006.
    [193] Courant R. Variational methods for the solution of problems of equilibrium and vibrations [J]. Bulletin of the American Mathematical Society. 1943, 49:1-23.
    [194] Turner MJ, Clough RW, Martin HC, Topp LJ. Stiffness and deflection analysis of complex structures [J]. Journal of the Aeronautical Sciences. 1956, 23:805-823.
    [195] Clough RW. The finite element method in plane stress analysis [C]. Proc. ASCE Conference on Electronic Computation, Pittsburgh, PA, 1960.
    [196] Clough RW, Wilson EL. Stress analysis of a gravity dam by the finite element method [C]. Proc. Symposium on the Use of Computers in Civil Engineering, Lisbon, Portugal, 1962.
    [197] Clough RW. The finite element method after 25 years: a personal view [C]. International Conference on Applications of the Finite Element Method. Hovik, Norway: Veritas Center, Computas, 1979.
    [198] Feng K. Difference schemes based on the variational principle [J]. Applied Mathematics and Computational Mathematics. 1965, 2(4):238-262. (in Chinese)
    [199] Zhang X, Wang TS. Computational Dynamics [M]. Beijing: Tsinghua University Press, 2007. (in Chinese)
    [200] Griswold A. Genome packaging in prokaryotes: The circular chromosome of E. coli [J/OL]. Nature Education. 2008, 1(1). http://www.nature.com/scitable/topicpage/genome-packaging-in-prokaryotes-the-circular-chromosome-9113?ref=nf
    [201] Holmes VF, Cozzarelli NR. Closing the ring: Links between SMC proteins and chromosome partitioning, condensation, and supercoiling [J]. Proc. Natl. Acad. Sci. 2000, 97:1322-1324.
    [202] Deng S, Stein RA, Higgins NP. Organization of supercoil domains and their reorganization by transcription [J]. Mol. Microbiol. 2005, 57:1511-1521.
    [203] Dorman CJ. DNA supercoiling and bacterial gene expression [J]. Sci Prog. 2006, 89:151-166.
    [204] Xu Y. Computational challenges in deciphering genomic structures of bacteria [J]. Journal of Computer Science and Technology. 2010, 25(1):53-70.
    [205] Postow L, Hardy CD, Arsuaga J, Cozzarelli NR. Topological domain structure of the Escherichia coli chromosome [J]. Genes Dev. 2004, 18:1766-1779.
    [206] Zechiedrich EL, Khodursky AB, Cozzarelli NR. Topoisomerase IV, not gyrase, decatenates products of site-specific recombination in Escherichia coli [J]. Genes & Dev. 1997, 11:2580-2592.
    [207] Yin YB, Zhang H, Xu Y. Genomic arrangement of bacterial operons is constrained by biological pathways encoded in the genome [J]. Proc Natl Acad Sci USA. 2010, 107(14):6310-6315.
    [208] National Center for Biotechnology Information, U.S. National Library of Medicine. GenBank [DB/OL]. [2011-04-01]. http://www.ncbi.nlm.nih.gov/Genbank/
    [209] Computational Systems Biology Lab. Database of prokaryotic OpeRons (DOOR) [DB/OL]. [2009-02-06]. http://csbl1.bmb.uga.edu/OperonDB_10142009/DOOR.php
    [210] Kanehisa Laboratories. Kyoto Encyclopedia of Genes and Genomes (KEGG) [DB/OL]. [2011-06-01]. http://www.genome.jp/kegg/.
    [211] National Center for Biotechnology Information, U.S. National Library of Medicine. Gene Expression Omnibus (GEO) [DB/OL]. [2011-01-31]. http://www.ncbi.nlm.nih.gov/geo/.
    [212] Agrawal R, Srikant R. Fast algorithms for mining association rules in large databases [C]. Proceedings of the 20th International Conference on Very Large Data Bases (VLDB), Santiago, Chile, September 1994, 487-499.
