Dynamical Behaviors of Group Neuron Computation
Abstract
Understanding the human brain and the biological basis of its mental capabilities, such as thought, perception, learning, and memory, is one of the most fascinating scientific challenges of our time. With their inherent capacity to imitate the intelligent behavior of the brain and their powerful parallel computation, artificial neural networks have attracted many distinguished scientists and leading research institutions worldwide and have become a research focus in both science and engineering. Since the rise of artificial neural networks in the 1980s, more than two decades of work have produced a wealth of inspiring results, and applications now reach into finance, the military, engineering, medicine, and many branches of science. Research on neural networks continues to appear in top international journals such as Science and Nature, and well-known corporations such as Intel and IBM have invested heavily in neural-network chips; all of this attests to the scientific importance of artificial neural networks.
     Dynamical analysis is an essential theoretical foundation for applying artificial neural networks, since many important applications require the network to be stable. A network typically operates in one of two stable computational modes: monostable or multistable computation. Multistability essentially reflects the collective computation performed by groups of neurons; it reveals the intrinsic nature of biological neural networks at a deeper level and supports more powerful computation. The study of the dynamical behavior of group neuron computation is therefore a natural direction for the development of artificial neural network research.
     This dissertation investigates this frontier topic in depth. Its main contributions are as follows:
     (1) Chapter 2 studies multiperiodicity and global attractivity in continuous-time and discrete-time recurrent neural networks with unsaturating piecewise linear threshold (LT) transfer functions. Periodic oscillation is an important dynamical behavior in neural networks, since many biological and cognitive activities involve repetition. Using local inhibition on the connection weights, an explicit expression for the globally attracting set of the network is obtained, and conditions under which the network computes multiple periodic orbits are established by means of local invariant sets.
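     For orientation, here is the LT network model in the form standard in this literature (a sketch in our notation, not quoted from the dissertation): each neuron applies the unsaturating transfer function $\sigma(s)=\max(0,s)$, and the continuous-time and discrete-time networks read

\[
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} w_{ij}\,\sigma\bigl(x_j(t)\bigr) + h_i,
\qquad
x_i(k+1) = \sum_{j=1}^{n} w_{ij}\,\sigma\bigl(x_j(k)\bigr) + h_i,
\]

for $i=1,\dots,n$, where $w_{ij}$ are connection weights and $h_i$ external inputs. Since $\sigma$ is unbounded above, boundedness of trajectories is not automatic, which is why explicit globally attracting sets are of interest.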
     (2) Chapter 3 develops the theory of permitted and forbidden sets of group neurons in discrete-time LT recurrent neural networks. These concepts offer a new perspective on memory in neural networks: a stored memory is retrieved by an external input, which is easier to control than an initial state. Using energy functions and related methods, necessary and sufficient conditions are established for complete convergence, for the existence of permitted and forbidden sets, and for conditional multiattractivity.
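     For reference, the notion of permitted and forbidden sets as introduced for symmetric threshold-linear networks by Hahnloser, Seung, and Slotine (2003), which the dissertation carries over to discrete time (a paraphrase under the symmetric-weight assumption, not the dissertation's own definition): a group of neurons $S \subseteq \{1,\dots,n\}$ is permitted if those neurons can be coactivated at a stable steady state for some external input, and forbidden otherwise; for symmetric $W$ this holds exactly when

\[
\lambda_{\max}\bigl(W_{SS}\bigr) < 1,
\]

i.e., when the largest eigenvalue of the principal submatrix of $W$ indexed by $S$ is less than one.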
     (3) Chapter 4 proposes the concepts of unsaturated and saturated sets of group neurons and establishes their basic theory for cellular neural networks, including necessary and sufficient conditions for the existence of such sets. On this basis, lateral inhibition is used to establish the correspondence between unsaturated sets and groups of neurons, implementing group selection and extending the well-known engineering method of Winner-Take-All (WTA). The practical value of these concepts is demonstrated by using such a network to retrieve memories stored in a ring network.
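     To make the WTA mechanism concrete, the following small simulation is a sketch under simplified assumptions (uniform self-excitation alpha, uniform lateral inhibition beta, plain LT dynamics rather than the dissertation's cellular-network construction):

    import numpy as np

    def simulate_wta(h, alpha=0.5, beta=1.0, dt=0.01, steps=5000):
        """Forward-Euler simulation of an LT network with lateral inhibition:
        dx_i/dt = -x_i + alpha*s(x_i) - beta*sum_{j != i} s(x_j) + h_i,
        where s(u) = max(u, 0). With alpha < 1 < alpha + beta, singleton
        active sets are stable while coactive pairs are not, so a single
        winner survives."""
        h = np.asarray(h, dtype=float)
        n = len(h)
        W = (alpha + beta) * np.eye(n) - beta * np.ones((n, n))  # alpha on diagonal, -beta off
        x = np.zeros(n)
        for _ in range(steps):
            x += dt * (-x + W @ np.maximum(x, 0.0) + h)
        return np.maximum(x, 0.0)  # approximate steady-state firing rates

    print(simulate_wta([0.9, 1.0, 0.8, 0.95]))  # ~[0, 2, 0, 0]: the largest input wins

In the chapter's group-selection setting, inhibition acts between groups of neurons rather than between individuals, so an entire group can win; the sketch shows only the classical single-winner case.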
     (4) Chapter 5 proposes the concept of activity invariant sets of group neurons for two classes of networks: discrete-time LT neural networks and Lotka-Volterra (LV) neural networks. Conditions are derived for locating the activity invariant sets, and it is proved that each such set contains an equilibrium point that exponentially attracts every trajectory starting in the set. Because these attractors lie inside activity invariant sets, each attractor carries a binary digital pattern while also encoding analog information. These results have potential applications in areas such as group winner-take-all and associative memory.
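     For reference, the standard LV recurrent network model (again our notation; the dissertation's precise form may differ in details):

\[
\dot{x}_i(t) = x_i(t)\Bigl(h_i - x_i(t) + \sum_{j=1}^{n} w_{ij}\,x_j(t)\Bigr),
\qquad x_i(0) > 0, \quad i=1,\dots,n.
\]

States remain nonnegative, so each neuron is either silent ($x_i = 0$) or active ($x_i > 0$); an activity invariant set fixes which neurons can be active along an entire trajectory, which is what gives each attractor a binary pattern on top of its analog coordinates.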
     (5) Chapter 6 studies multistability for several classes of networks, including bidirectional associative memory (BAM) neural networks with unsaturating piecewise linear threshold transfer functions and the background recurrent neural network model. Using local inhibition and energy functions, the three basic properties of multistable computation are treated: boundedness, global attractivity, and complete convergence; exact expressions for globally exponentially attracting sets are obtained. It is also shown how the bifurcation parameter (the background level) affects the number and location of equilibria and of attractors in the background network, so that the network can be monostable as well as multistable. Finally, a class of neural networks with time-varying delays and discontinuous (jump) transfer functions is studied; such networks are the ideal model of neurons whose amplifier gain is very high, and easily verifiable conditions guaranteeing global exponential convergence are derived.
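     As a point of reference, the high-gain limit mentioned above is commonly modeled as a delayed network with discontinuous activations (a sketch following the Forti-Nistri line of work, not necessarily the dissertation's exact system):

\[
\dot{x}(t) = -D\,x(t) + A\,f\bigl(x(t)\bigr) + B\,f\bigl(x(t-\tau(t))\bigr) + I,
\]

where $D$ is a positive diagonal matrix, $\tau(t)$ is a bounded time-varying delay, and each $f_i$ is monotone increasing with jump discontinuities, e.g. $f_i(s)=\operatorname{sign}(s)$ arising as the infinite-gain limit of $\tanh(\kappa s)$ as $\kappa\to\infty$; solutions are then understood in the Filippov sense.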
     The results obtained here should help advance the establishment of a computational theory of group neurons.
