Design of Sparse Span-lateral Inhibition Neural Network Based on Connection Self-organization Development
  • Original Title (Chinese): 基于连接自组织发育的稀疏跨越–侧抑制神经网络设计
  • English Title: Design of Sparse Span-lateral Inhibition Neural Network Based on Connection Self-organization Development
  • Authors: YANG Gang (杨刚); WANG Le (王乐); DAI Li-Zhen (戴丽珍); YANG Hui (杨辉)
  • Affiliations: School of Electrical and Automation Engineering, East China Jiaotong University; Key Laboratory of Advanced Control and Optimization of Jiangxi Province
  • Keywords: span-lateral inhibition neural network (S-LINN); sparse; small-world network; intelligence development
  • Journal: Acta Automatica Sinica (自动化学报)
  • Online Publication Date: 2018-04-18
  • Year: 2019
  • Volume: 45
  • Issue: 4
  • Pages: 174-184 (11 pages)
  • Record ID: MOTO201904015
  • CN: 11-2109/TP
  • Funding: National Natural Science Foundation of China (61663012, 61673172, 61733005); China Scholarship Council (201509795007); Natural Science Foundation of Jiangxi Province (20161BAB212054); Science and Technology Project of the Jiangxi Provincial Department of Education (GJJ150490); Science and Technology Project of the Jiangxi Provincial Department of Transportation (2014X0015)
  • Language: Chinese
Abstract
To address the structure adjustment and parameter learning problem of the span-lateral inhibition neural network (S-LINN), this work draws on the sparse connectivity of neurons in biological nervous systems and on the relationship between the intellectual development of children and adolescents and the development of their cerebral cortex, and proposes a design method for a connection self-organization development-based sparse S-LINN (sS-LINN) in which the initial sparse architecture follows a small-world connection pattern. Network connection sparseness and the neuron output contribution rate are defined, and a growing-pruning rule for network connections is designed; connection weights are adjusted and controlled according to the correspondence between cortical development and intelligence level observed in the superior-intelligence group, so that network connections are adjusted dynamically and the network's intelligence develops in a self-organized way. Simulations on nonlinear dynamic system identification and function approximation benchmark problems show that, at the same connection complexity, the sparsely connected S-LINN yields a compact structure with better generalization ability than the compared methods.
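To make the small-world initialization described in the abstract concrete, the following Python sketch shows one plausible construction: a Watts-Strogatz ring lattice with random rewiring used as a 0/1 connection mask, together with a simple connection-sparseness measure. The rewiring probability p, the neighbourhood size k, and the sparseness definition used here are illustrative assumptions, not the paper's actual settings.

```python
# A minimal, illustrative sketch (not the paper's exact formulation): build an
# initial sparse connection mask with a Watts-Strogatz small-world pattern and
# measure its connection sparseness. Parameters k, p and the sparseness
# definition are assumptions.
import numpy as np

def small_world_mask(n_neurons: int, k: int = 4, p: float = 0.1, seed: int = 0) -> np.ndarray:
    """Watts-Strogatz ring lattice with random rewiring, returned as a 0/1 mask."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((n_neurons, n_neurons), dtype=int)
    # Regular ring lattice: each neuron connects to its k nearest neighbours.
    for i in range(n_neurons):
        for j in range(1, k // 2 + 1):
            mask[i, (i + j) % n_neurons] = 1
            mask[(i + j) % n_neurons, i] = 1
    # Rewire each lattice edge with probability p to a random new target.
    for i in range(n_neurons):
        for j in range(1, k // 2 + 1):
            if rng.random() < p:
                old = (i + j) % n_neurons
                candidates = [t for t in range(n_neurons) if t != i and mask[i, t] == 0]
                if candidates:
                    new = rng.choice(candidates)
                    mask[i, old] = mask[old, i] = 0
                    mask[i, new] = mask[new, i] = 1
    return mask

def connection_sparseness(mask: np.ndarray) -> float:
    """Fraction of absent connections relative to a fully connected network (assumed definition)."""
    n = mask.shape[0]
    possible = n * (n - 1)                       # ordered pairs, excluding self-connections
    present = int(mask.sum()) - int(np.trace(mask))
    return 1.0 - present / possible

mask = small_world_mask(n_neurons=20, k=4, p=0.1)
print(f"connection sparseness: {connection_sparseness(mask):.3f}")
```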
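The abstract also mentions a growing-pruning rule driven by a neuron output contribution rate. The sketch below illustrates one possible reading: a hidden neuron's contribution is its normalized share of the summed absolute effect of its activation on the outputs, connections of low-contribution neurons are pruned, and a new connection is grown for high-contribution neurons. The contribution definition and the thresholds prune_below and grow_above are assumptions rather than the paper's formulation.

```python
# A hedged sketch of a contribution-rate-driven grow-prune step. The contribution
# definition (share of total |activation * outgoing weight|) and the thresholds
# prune_below / grow_above are illustrative assumptions.
import numpy as np

def output_contribution(hidden_act: np.ndarray, w_out: np.ndarray) -> np.ndarray:
    """Per-neuron contribution rate: mean |h_j * w_jk|, normalised to sum to 1."""
    raw = np.mean(np.abs(hidden_act[:, :, None] * w_out[None, :, :]), axis=(0, 2))
    return raw / (raw.sum() + 1e-12)

def grow_prune(mask, w_out, contrib, prune_below=0.02, grow_above=0.2, rng=None):
    """Prune outgoing connections of weak neurons; add one connection for strong ones."""
    rng = rng or np.random.default_rng(0)
    mask = mask.copy()
    for j, c in enumerate(contrib):
        if c < prune_below:
            mask[j, :] = 0                       # prune all outgoing connections of neuron j
        elif c > grow_above:
            zeros = np.flatnonzero(mask[j] == 0)
            if zeros.size:
                mask[j, rng.choice(zeros)] = 1   # grow one new outgoing connection
    return mask, w_out * mask                    # keep weights consistent with the mask

# Toy usage: 5 hidden neurons, 2 outputs, batch of 8 samples.
rng = np.random.default_rng(1)
h = rng.standard_normal((8, 5))
w = rng.standard_normal((5, 2))
m = np.ones((5, 2), dtype=int)
c = output_contribution(h, w)
m, w = grow_prune(m, w, c, rng=rng)
print("contribution rates:", np.round(c, 3))
```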
