Sparsely Connected Associative Memory and Its Complex-Network Implementation
Abstract
Associative memory networks model how the human brain stores and recalls information. Because they process noisy or incomplete inputs robustly, they have been widely studied and applied in artificial intelligence, pattern recognition, and related fields. Complex network theory, which focuses on the relationship between a system's structure and its function, has emerged in recent years as a new perspective and methodology for studying complex systems. An associative memory model that mimics the brain's neural learning mechanisms is, in essence, a complex nonlinear dynamical system; at the same time, biological neural systems ubiquitously exhibit the small-world effect and scale-free property typical of complex networks. Studying the realization of sparsely connected associative memory from a complex-network viewpoint is therefore a novel line of research.
Drawing on the complex-network idea of relating structure to function, this dissertation starts from the network architecture and studies sparsely connected associative memory deeply and systematically, alternating between theoretical analysis and application examples. It focuses on how the sparse interconnection pattern among neurons affects associative memory performance, and constructs the corresponding sparsely connected associative memory models on complex network architectures.
The main contributions of this dissertation are:
(1) Research on associative memory neural networks is surveyed, and the problems of existing models in biological plausibility and hardware implementation are pointed out. The feasibility of studying sparsely connected associative memory with complex-network ideas is analyzed, and a new research approach is proposed: starting from the network architecture and the neurons' sparse interconnection pattern, and drawing on complex-network methodology together with the complex-network properties ubiquitous in biological neural systems, sparsely connected associative memory models are investigated both theoretically and in applications.
(2) Motivated by complex-network research and the intrinsically sparse connectivity of biological neural systems, the realization of a class of generalized sparsely connected associative memory networks is studied. This class unifies the existing complex-network-based sparsely connected associative memory models in a single framework, and the dynamical evolution of sparsely connected associative memory networks with arbitrary connectivity is analyzed by probabilistic and statistical means.
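As a concrete illustration of the kind of diluted dynamics this unified framework covers, the following minimal NumPy sketch stores a few bipolar patterns with a Hebbian rule, removes synapses at random so that only a fraction c survive, and checks retrieval from a noisy cue. All parameter values are illustrative, not taken from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, c = 200, 3, 0.5          # neurons, stored patterns, connectivity fraction

# Hebbian weights for P random bipolar patterns
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

# Random symmetric dilution: each synapse kept with probability c
mask = np.triu(rng.random((N, N)) < c, k=1)
W = W * (mask | mask.T)

def recall(cue, steps=10):
    """Synchronous retrieval dynamics s <- sign(W s)."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1           # break ties toward +1
    return s

# Retrieve pattern 0 from a cue with 10% of its bits flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
overlap = (recall(cue) @ patterns[0]) / N
print(f"overlap with stored pattern: {overlap:.3f}")
```

At this low memory load the diluted network still converges to the stored pattern; the dissertation's analysis characterizes such transient dynamics for arbitrary connectivity.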
(3) In view of the limited metabolic energy available for synaptic connections in the cerebral cortex, signal-to-noise analysis is used to derive, on top of the traditional fully connected Hopfield network, the principle for determining the network's optimal sparse interconnection structure under a limited wiring-cost constraint, so that network performance is maintained as far as possible while the connection cost is reduced.
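The dissertation's exact optimality criterion comes from its signal-to-noise derivation, which the abstract does not spell out. A common heuristic in the same spirit, sketched below under that assumption, is to spend a fixed wiring budget on the synapses of largest magnitude rather than diluting at random:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, c = 200, 5, 0.3           # keep only roughly a fraction c of synapses

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def dilute_by_magnitude(W, c):
    """Keep (approximately) the c*100% largest-|w| synapses, symmetrically."""
    iu = np.triu_indices_from(W, k=1)
    w = np.abs(W[iu])
    keep = w >= np.quantile(w, 1 - c)
    M = np.zeros_like(W, dtype=bool)
    M[iu] = keep
    return W * (M | M.T)

def recall(W, cue, steps=10):
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

Wd = dilute_by_magnitude(W, c)
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
overlap = (recall(Wd, cue) @ patterns[0]) / N
print(f"retrieval overlap after magnitude-based dilution: {overlap:.3f}")
```

Even with most synapses removed, retrieval remains near-perfect at this load, which is the behaviour the optimal-sparseness principle aims to preserve while cutting wiring cost.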
(4) The realization of associative memory on small-world architectures is studied. The shortcuts of the original small-world model are generated at random, lacking deterministic, task-oriented construction. Borrowing the construction idea of the Harmonious Unifying Hybrid Preferential model for complex dynamical networks, and guiding shortcut generation by the optimal-sparseness principle under the limited wiring-cost constraint, a new adaptive small-world associative memory model is proposed. The new model selects shortcuts purposefully according to the actual demands of the learning task, builds a task-adaptive network structure, and realizes associative memory effectively.
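The abstract does not give the task-driven shortcut rule itself, so the sketch below only shows the standard Watts-Strogatz substrate such models start from: a ring lattice whose edges are rewired into shortcuts with probability p (here at random; the proposed model replaces this random choice with purposeful, task-guided selection):

```python
import numpy as np

rng = np.random.default_rng(2)

def watts_strogatz_mask(N, k, p, rng):
    """Symmetric adjacency mask: ring lattice with k/2 neighbours per side;
    each lattice edge is rewired to a random non-neighbour with probability p."""
    A = np.zeros((N, N), dtype=bool)
    for i in range(N):
        for d in range(1, k // 2 + 1):
            j = (i + d) % N
            if rng.random() < p:                  # turn this edge into a shortcut
                candidates = np.flatnonzero(~A[i])
                candidates = candidates[candidates != i]
                j = rng.choice(candidates)
            A[i, j] = A[j, i] = True
    return A

A = watts_strogatz_mask(N=200, k=10, p=0.1, rng=rng)
print("mean degree:", A.sum(axis=1).mean())
```

Such a mask would then gate the Hebbian weight matrix of the associative memory, so that only the small-world connections carry synapses.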
(5) Mimicking the scale-free property that fMRI has revealed in functional areas of the human brain, and incorporating the "Matthew effect" of scale-free network formation into the dynamic generation of synapses, a scale-free associative memory model with dynamic structural preferential attachment is proposed. Based on the optimal-sparseness principle for the interconnection structure under the limited wiring-cost constraint, the model defines a notion of affinity between nodes and adopts a preferential attachment mechanism driven jointly by node degree and node affinity, so that it achieves both high associative memory performance and neurobiological plausibility.
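The affinity measure is defined in the dissertation itself; the sketch below uses a hypothetical uniform per-node affinity simply to show the mechanism, growing a network where each new node attaches with probability proportional to degree times affinity (a Barabasi-Albert-style "Matthew effect" with an extra driving term):

```python
import numpy as np

rng = np.random.default_rng(3)

def grow_affinity_scale_free(N, m, affinity, rng):
    """Grow a network node by node; each new node adds m edges whose targets
    are drawn with probability proportional to degree * affinity."""
    deg = np.zeros(N)
    edges = []
    # seed: a small clique of m+1 nodes so every node has nonzero degree
    for i in range(m + 1):
        for j in range(i):
            edges.append((i, j)); deg[i] += 1; deg[j] += 1
    for new in range(m + 1, N):
        w = deg[:new] * affinity[:new]
        targets = rng.choice(new, size=m, replace=False, p=w / w.sum())
        for t in targets:
            edges.append((new, t)); deg[new] += 1; deg[t] += 1
    return deg, edges

N, m = 500, 2
affinity = rng.uniform(0.5, 1.5, size=N)   # hypothetical affinity values
deg, edges = grow_affinity_scale_free(N, m, affinity, rng)
print("max degree:", deg.max(), "mean degree:", deg.mean())
```

The rich-get-richer dynamics produce a few heavily connected hubs over a background of low-degree nodes, the heavy-tailed signature of a scale-free topology.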