Analysis and Optimization of Several Classes of Neural Networks and Their Applications
Abstract
Artificial neural networks are an important branch of artificial intelligence research, with wide applications in control, prediction, optimization, system identification, signal processing, and pattern recognition. This dissertation analyzes and studies several major classes of neural network models: evolutionary neural networks, Integrate-and-Fire neural networks, and cellular neural networks, and applies them to function approximation, pattern recognition, data classification, and image processing. The main research work of the dissertation can be summarized as follows:
     1. For unconstrained global optimization problems, Gaussian mutation and orthogonal crossover are incorporated into the mutation and crossover operators of the differential evolution (DE) algorithm, respectively, an interpolation-based local search operator is given, and a DE algorithm embedding orthogonal crossover and local search is proposed. Numerical experiments are carried out on 20 standard benchmark functions, and comparisons with other DE algorithms in the literature demonstrate the algorithm's effectiveness.
     With the structure of the feedforward neural network determined by a predictor-corrector method, a hybrid training algorithm combining the improved DE algorithm with the Levenberg-Marquardt (LM) algorithm is proposed to optimize the weights and biases of the network. The resulting evolutionary neural network is applied to function approximation, pattern classification, and recognition problems.
     2. For nonlinear optimization problems containing both binary and real variables, a cooperative binary-real differential evolution algorithm is proposed: a mixed binary-real encoding is adopted, the XOR logic operation is introduced into the DE mutation operator to handle the binary variables, and orthogonal crossover is introduced into the crossover operator. Extensive numerical tests and comparisons with existing evolutionary algorithms in the literature show that the algorithm performs well.
     The improved cooperative binary-real DE is combined with the scaled conjugate gradient backpropagation algorithm to form a two-stage training algorithm that optimizes the structure and weights of feedforward neural networks simultaneously. The resulting evolutionary neural network is applied to function approximation and pattern classification problems.
     3. For nonlinear discrete optimization problems containing binary and integer variables, a cooperative binary-integer differential evolution algorithm is proposed: a mixed binary-integer encoding is adopted, and the XOR logic operation and orthogonal crossover are introduced into the mutation and crossover operators of the discrete DE, respectively. Extensive numerical tests and comparisons with existing algorithms in the literature demonstrate the algorithm's effectiveness.
     The cooperative binary-integer DE is then used to evolve the structure and integer weights of feedforward neural networks simultaneously. The resulting evolutionary neural network is applied to function approximation and pattern classification problems.
     4. Based on the Integrate-and-Fire (IF) model proposed by Lapicque, a new IF network model with lateral inhibitory connections is given, and its input-output relation is discussed. Compared with previous IF models, the activity equation of this model is greatly simplified, yet its behavior fits the physiological properties of nerve cells well; in particular, the model matches the nonlinear characteristics of synaptic connections. The firing mechanism is improved by adopting asynchronous firing, which greatly enhances the adaptability of the network.
     The effect of an exponentially decaying threshold on an IF neuron driven by Gaussian white noise is discussed, focusing on the mean and standard deviation of the interspike intervals. The results show that for slow threshold decay, the coefficient of variation of the interspike intervals reaches a minimum whenever the firing rate of the neuron matches the decay rate of the threshold; the same effect can be produced by changing the firing rate through the noise intensity or the input current. The error introduced by resetting the membrane potential after the neuron fires is also analyzed.
     5. A cognitive model with classical conditioning behavior is proposed. The model takes IF neurons as basic elements and interconnects them into a neural network with a reflex-arc structure, which fully exhibits the timing dependence of classical conditioning. Computer simulations show that the IF model successfully reproduces acquisition, extinction, inter-stimulus interval effects, blocking, and second-order conditioning.
     6. A cellular neural network with transient chaos is proposed, formed by converting the state equation of the model into discrete form via the Euler method and introducing a negative self-feedback term. Simulation of a single neuron shows that the model exhibits bifurcation and chaos. In function optimization, the network first performs a chaotic search through a reversed period-doubling bifurcation process and then carries out a Hopfield-like gradient search. Because it exploits the inherent randomness and orbital ergodicity of chaotic search, it has strong global optimization ability. Two function optimization examples verify the effectiveness of the network.
Artificial neural networks (ANNs) are one of the important branches of the artificial intelligence field. ANNs are nature-inspired computing techniques that have been applied widely to tasks such as control, prediction, optimization, system identification, signal processing, and pattern recognition. This dissertation focuses on the analysis and optimization of several important ANN models, including evolutionary ANNs, Integrate-and-Fire ANNs, and cellular ANNs, as well as on their applications in function approximation, pattern recognition, data classification, and image processing. The main work can be summarized as follows:
     1. For solving unconstrained global optimization problems, a differential evolution (DE) algorithm with orthogonal crossover and local search is proposed. In the proposed algorithm, Gaussian mutation and orthogonal crossover are combined with the DE mutation and crossover operators, respectively, and a simplified quadratic interpolation serves as the local search operator. Simulation experiments on 20 benchmark functions are carried out, and comparisons with other DE variants show the algorithm's effectiveness and superiority.
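As a rough illustration of the framework described above, the sketch below combines DE/rand/1 mutation with a small Gaussian perturbation and a simplified quadratic interpolation local-search step on a sphere benchmark. It is a minimal sketch, not the dissertation's algorithm: the orthogonal crossover is replaced by plain binomial crossover for brevity, and all parameter values are illustrative.

```python
import random

def sphere(x):
    """Benchmark objective: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def de_with_local_search(f, dim=5, pop_size=20, f_w=0.5, cr=0.9, gens=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == jr:
                    # DE/rand/1 mutation plus a small Gaussian perturbation
                    trial.append(pop[a][j] + f_w * (pop[b][j] - pop[c][j])
                                 + rng.gauss(0.0, 0.01))
                else:
                    trial.append(pop[i][j])
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
        # simplified quadratic interpolation over the three best individuals:
        # coordinate-wise minimum of the parabola through three points
        order = sorted(range(pop_size), key=fit.__getitem__)
        (x1, f1), (x2, f2), (x3, f3) = [(pop[k], fit[k]) for k in order[:3]]
        cand = []
        for j in range(dim):
            num = ((x2[j] ** 2 - x3[j] ** 2) * f1 + (x3[j] ** 2 - x1[j] ** 2) * f2
                   + (x1[j] ** 2 - x2[j] ** 2) * f3)
            den = 2.0 * ((x2[j] - x3[j]) * f1 + (x3[j] - x1[j]) * f2
                         + (x1[j] - x2[j]) * f3)
            cand.append(num / den if abs(den) > 1e-12 else x1[j])
        fc = f(cand)
        worst = order[-1]
        if fc < fit[worst]:  # interpolated point replaces the worst individual
            pop[worst], fit[worst] = cand, fc
    return min(fit)
```

The greedy selection and the interpolation step make the best fitness monotone non-increasing, which is the property the local-search operator is meant to strengthen.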
     With the structures of feedforward ANNs determined by trial and error, a hybrid training algorithm, which combines the modified DE algorithm with the Levenberg-Marquardt method, is proposed to optimize the network weights and biases. The resulting evolutionary ANNs are applied to function approximation, pattern classification, and recognition problems.
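A hybrid scheme of this kind can be sketched in two stages: a DE global search over the weights, followed by local gradient refinement. This is only an illustrative sketch: the refinement uses finite-difference gradient descent with simple backtracking as a stand-in for Levenberg-Marquardt, the 1-2-1 tanh network and the target function x^2 are assumptions, and no parameter here comes from the dissertation.

```python
import math
import random

def net(p, x):
    """Tiny 1-2-1 feedforward network: two tanh hidden units, linear output.
    p packs [w1, b1, w2, b2, v1, v2, c] (7 parameters)."""
    h1 = math.tanh(p[0] * x + p[1])
    h2 = math.tanh(p[2] * x + p[3])
    return p[4] * h1 + p[5] * h2 + p[6]

def mse(p, data):
    return sum((net(p, x) - y) ** 2 for x, y in data) / len(data)

def hybrid_train(data, pop_size=20, gens=150, steps=300, lr=0.05, seed=3):
    rng = random.Random(seed)
    # Stage 1: DE/rand/1/bin global search over the 7 network parameters.
    pop = [[rng.uniform(-2.0, 2.0) for _ in range(7)] for _ in range(pop_size)]
    fit = [mse(p, data) for p in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(7)
            trial = [pop[a][j] + 0.5 * (pop[b][j] - pop[c][j])
                     if (rng.random() < 0.9 or j == jr) else pop[i][j]
                     for j in range(7)]
            ft = mse(trial, data)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = pop[min(range(pop_size), key=fit.__getitem__)]
    cur = mse(best, data)
    # Stage 2: finite-difference gradient descent as a stand-in for the
    # Levenberg-Marquardt refinement; a rejected step halves the learning rate.
    h = 1e-5
    for _ in range(steps):
        grad = []
        for j in range(7):
            q = best[:]
            q[j] += h
            grad.append((mse(q, data) - cur) / h)
        cand = [best[j] - lr * grad[j] for j in range(7)]
        fc = mse(cand, data)
        if fc < cur:
            best, cur = cand, fc
        else:
            lr *= 0.5  # backtracking keeps the refinement monotone
    return best, cur

# Illustrative training data: approximate y = x^2 on [-1, 1].
data = [(i / 10.0, (i / 10.0) ** 2) for i in range(-10, 11)]
```

The division of labor mirrors the hybrid idea: the population search escapes poor basins, and the gradient stage sharpens the final fit cheaply.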
     2. For solving nonlinear optimization problems involving binary and real variables, a cooperative binary-real differential evolution algorithm is proposed. In this algorithm, a mixed binary-real encoding is used, the XOR logic operation is introduced into the DE mutation to handle the binary variables, and orthogonal crossover is incorporated into the DE crossover. Well-known benchmark problems are used to validate its efficiency; the proposed algorithm performs well, and its results are highly competitive with those of other existing algorithms.
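The mixed encoding can be sketched as follows: the "difference" of two binary vectors is taken as their XOR, and each differing bit flips the base bit with probability F, while the real part uses standard DE/rand/1 arithmetic. Everything here is an assumption for illustration: the objective, the target bit pattern, the small diversity-preserving bit flip, and the use of binomial crossover in place of the dissertation's orthogonal crossover.

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical optimal bit pattern

def mixed_obj(bits, reals):
    """Illustrative mixed objective: bit mismatches plus a sphere term.
    Minimum 0 at bits == TARGET and reals == 0."""
    return sum(b != t for b, t in zip(bits, TARGET)) + sum(v * v for v in reals)

def binary_real_de(pop_size=20, gens=200, f_w=0.5, cr=0.9, seed=5):
    rng = random.Random(seed)
    nb, nr = len(TARGET), 3
    pop = [([rng.randint(0, 1) for _ in range(nb)],
            [rng.uniform(-3.0, 3.0) for _ in range(nr)])
           for _ in range(pop_size)]
    fit = [mixed_obj(b, r) for b, r in pop]
    for _ in range(gens):
        for i in range(pop_size):
            ia, ib, ic = rng.sample([j for j in range(pop_size) if j != i], 3)
            (ba, ra), (bb, rb), (bc, rc) = pop[ia], pop[ib], pop[ic]
            # Binary part: XOR difference; a tiny extra flip rate keeps diversity.
            mb = []
            for a_bit, b_bit, c_bit in zip(ba, bb, bc):
                flip = (b_bit != c_bit and rng.random() < f_w) or rng.random() < 0.02
                mb.append(1 - a_bit if flip else a_bit)
            # Real part: standard DE/rand/1 mutation.
            mr = [a + f_w * (b - c) for a, b, c in zip(ra, rb, rc)]
            # Binomial crossover on both parts (simplified stand-in).
            tb = [m if rng.random() < cr else x for m, x in zip(mb, pop[i][0])]
            tr = [m if rng.random() < cr else x for m, x in zip(mr, pop[i][1])]
            ft = mixed_obj(tb, tr)
            if ft <= fit[i]:
                pop[i], fit[i] = (tb, tr), ft
    return min(fit)
```

Because each bit mismatch costs at least one unit while the real part contributes fractions, selection fixes the binary variables first and then polishes the real ones, which is the cooperative behavior the encoding is meant to exploit.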
     To optimize the structure and weights of feedforward ANNs simultaneously, a two-stage training algorithm is formed by combining the modified cooperative binary-real DE with the scaled conjugate gradient method. The resulting evolutionary ANNs are applied to function approximation and pattern classification.
     3. For solving nonlinear discrete optimization problems with binary and/or integer variables, a cooperative binary-integer differential evolution algorithm is proposed. In this algorithm, a mixed binary-integer encoding is used, the XOR logic operation is introduced into the DE mutation to handle the binary variables, and orthogonal crossover is incorporated into the DE crossover. Numerical examples are used to validate its effectiveness, and the results are compared against those of other existing algorithms.
     To optimize the structure and integer weights of feedforward ANNs simultaneously, a training algorithm based on the cooperative binary-integer DE is proposed. The resulting evolutionary ANNs are applied to function approximation and pattern classification.
     4. Starting from the Integrate-and-Fire (IF) model, a new IF network model with lateral inhibition, in which each nerve cell restrains its neighbors, is presented, and its input-output relation is derived. Although the model is greatly simplified, it captures many properties of real neurons; in particular, it matches the nonlinear behavior of synaptic connections well. The firing mechanism is further developed by adopting asynchronous firing, which makes the network more flexible.
     The effect of an exponentially decaying threshold on a white-noise-driven IF neuron is studied, particularly its effect on the mean and standard deviation of the interspike intervals. It is shown that for slow threshold decay the IF model exhibits a minimum in the coefficient of variation whenever the firing rate of the neuron matches the decay rate of the threshold. The same effect appears whether the firing rate is changed by varying the noise intensity or the input current. The errors associated with resetting the membrane potential after a spike in simulations of IF neural networks are also analyzed.
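The setup above can be sketched with an Euler-Maruyama simulation of a leaky IF neuron whose threshold relaxes exponentially toward its baseline after each spike. This is a minimal sketch under assumed, illustrative parameters (drive, noise level, time constants), not the dissertation's model; it returns the mean interspike interval and its coefficient of variation.

```python
import math
import random

def simulate_if(mu=1.5, sigma=0.4, tau_m=1.0, theta_inf=1.0, theta_amp=1.0,
                tau_theta=2.0, dt=0.001, n_spikes=200, seed=7):
    """Leaky integrate-and-fire neuron driven by Gaussian white noise.
    After each spike the threshold jumps to theta_inf + theta_amp and then
    decays exponentially toward theta_inf with time constant tau_theta.
    Returns the mean interspike interval (ISI) and its coefficient of
    variation CV = std(ISI) / mean(ISI)."""
    rng = random.Random(seed)
    v, t, t_last = 0.0, 0.0, 0.0
    isis = []
    while len(isis) < n_spikes:
        theta = theta_inf + theta_amp * math.exp(-(t - t_last) / tau_theta)
        # Euler-Maruyama step for dv/dt = (mu - v)/tau_m + sigma * xi(t)
        v += dt * (mu - v) / tau_m + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if v >= theta:
            isis.append(t - t_last)
            t_last, v = t, 0.0  # record the ISI and reset the membrane potential
    mean = sum(isis) / len(isis)
    var = sum((s - mean) ** 2 for s in isis) / len(isis)
    return mean, math.sqrt(var) / mean
```

Sweeping `tau_theta` (or `mu` and `sigma`, which change the firing rate) and plotting the returned CV is the kind of experiment the paragraph describes; the discretization step `dt` also controls the reset error mentioned at the end.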
     5. A cognitive model with classical conditioning behaviors is presented. The model comprises IF neurons connected to form a neural network with a reflex-arc structure, which lets it fully exhibit the dependence of classical conditioning on timing. Simulation results show that the model successfully reproduces typical phenomena such as acquisition, extinction, inter-stimulus interval effects, blocking, and second-order conditioning.
     6. A new cellular neural network model with transient chaos is proposed, in which a negative self-feedback term is introduced into a cellular neural network after the dynamic equation is discretized in time via Euler's method. Simulation of a single neuron shows the characteristic bifurcation and chaos of the model. In function optimization, the model first performs a chaotic search through a course of reversed period-doubling bifurcations and then gradually approaches a dynamical structure similar to the Hopfield neural network, which converges to a stable equilibrium point. Because the model exploits the inherent randomness and orbital ergodicity of chaotic search, it can be expected to have robust global search ability. Simulation results on two function optimization examples show that the network is efficient.
    [168] R. A. Rescorla, A. R. Wagner. A theory of Pavlovian conditioning: variations inthe effectiveness of reinforcement and nonreinforecement [R]. ClassicalConditioning II: current research and theory. New York:Appleton-Century-Crofts,1972,64-99.
    [169] R. S. Sutton, A. G. Barto. Toward a modern theory of adaptive networks:expectation and prediction [J]. Psychological Review,1981,88(2):135-170.
    [170] R. S. Sutton, A. G. Barto. Time-derivative models of pavlovian reinforcement [R].Learning and computational neuroscience: foundations of adaptive networks.Cambridge, MA: MIT Press,1990.
    [171] N. A. Schmajuk, J. J. DiCarlo. Stimulus configuration, classical conditioning andhippocampal function [J]. Psychological Review,1992,99(3):268-305.
    [172] C. Balkenius. Natural intelligence in artificial creatures [M]. Lund UniversityCognitive Studies,1995.
    [173] J. Christopher, L. Anders. An associative neural network model of classicalconditioning [R]. TRITA-NA-P0217,2002.
    [174] Edgar H. Vogel, Maria E. Castro, Maria A. Saavedra. Quantitative models ofPavlovian conditioning [J]. Brain Research Bulletin,2004,63:173-202.
    [175] R. Gallistel, J. Gibbon. Time, rate and conditioning [J]. Psychological Review,2000,107:289-344.
    [176] I. P. L. McLaren, N. J. Mackintosh. Associative learning and elementalrepresentation: II. Generalization and discrimination [J]. Anim. Learn. Behav,2002,30:177-200.
    [177]杨贝贝,阮晓钢.基于尖峰神经元的条件反射模型及其认知行为的研究[J].系统仿真学报,2005,17(9):2134-2137.
    [178] R. Raizada, S. Grossberg. Towards a theory of the laminar architecture of cerebralcortex: computational clues from the visual system [J]. Cerebral Cortex,2003,13:100-113.
    [179] X. D. Gu, D. H. Yu, L. M. Zhang. Image shadow removal using pulse coupledneural network [J]. IEEE Trans. on Neural Networks,2005,16(3):692-698.
    [180] Guido Bugmann, Chris Christodoulou. Role of temporal integration andfluctuation detection in the highly irregular firing of a leaky integrator neuronmodel with partial reset [J]. Neural Computation,1997,10:985-1000.
    [181]胡方明,简琴,张秀君.基于BP神经网络的车型分类器[J].西安电子科技大学学报,2005,32(3):439-442.
    [182]于江波,陈后金. PCNN模型的改进及其在医学图像处理中的应用[J].电子与信息学报,2007,29(10):2316-2320.
    [183] A. C. Shih, H. M. Liao, C. S. Lu. A new iterated two band diffusion equation:theory and its application [J]. IEEE Trans. on Image Processing,2003,12(4):466-476.
    [184] Eckhorn Reinhard, Alexander M Gail, et al. Different types of signal coupling inthe visual cortex related to neural mechanisms of associative processing andperception [J]. IEEE Trans. on Neural Networks,2004,15(5):1039-1052.
    [185]彭真明,蒋彪,肖峻,孟凡斌.基于并行点火PCNN模型的图像分割新方法[J].自动化学报,2008,34(9):1169-1173.
    [186] C.W. Shih. Complete stability for a class of cellular neural networks [J]. Int. J.Bifurcation and chaos,2001,11(1):169-177.
    [187] Y. Zhang, J. Yu, Y. Wu. Global stability on a class of cellular neural networks[J]. Science in China. Ser. E,2001,44(1):1-11.
    [188] G. Grassi, E.D. Sciascio, P. Vecchio. New object oriented segmentation algorithmbased on the CNN paradigm [J]. IEEE Trans. Circuits Syst.-II: Exp. Briefs.2006,53(4):259-263.
    [189] S. Wang, M. Wang. A new detection algorithm based on fuzzy cellular neuralnetworks for white blood cell detection [J]. IEEE Trans. Information Technologyin Biomedicine,2006,10(1):5-10.
    [190]张强,许进.自相似过程的几种模型[J].通信学报,2001,22(2):106-112.
    [191] L. P. Wang, Wen Liu, Jacek M. Zurada. Cellular neural network with transientchaos [J]. IEEE Transactions on Circuits and Systems-II: Express Briefs,2007,54(5):440-444.
    [192] L. N. Chen, K. Aihara. Chaotic simulated annealing by a neural network modelwith transient chaos [J]. Neural Network,1995,8(6):915-930.
    [193] C. W. Wu, L. O. Chua. A more rigorous proof of complete stability of cellularneural networks [J]. IEEE Trans. CAS I,1997,44(4):370-371.
    [194] P. P. Civalleri, L. M. Gilli, L. Pabdolf. On stability of cellular neural networkswith delay [J]. IEEE Trans. Circuits Syst.,1993,40:157-164.
    [195] L. P. Wang, K. Smith. On chaotic simulated annealing [J]. IEEE Trans. NeuralNetwork,1998,9(4):716-718.
    [196] L. P. Wang. Oscillatory and chaotic dynamics in neural networks under varyingoperating conditions [J]. IEEE Trans. Neural Network,1996,7(6):1382-1388.
    [197] H. Nozawa. A neural network model as a globally coupled map and applicationsbased on chaos [J]. Chaos,1992,2(3):377-386.
    [198] Jiahai Wang, Wenbin Yi. Nonpositive hopfield neural network with self-feedbackand its application to maximum clique problems [J]. Neural InformationProcessing–Letters and Reviews.2006,10(10):243-248.
