Sine neural network (SNN) with double-stage weights and structure determination (DS-WASD)
  • Authors: Yunong Zhang; Lu Qu; Jinrong Liu; Dongsheng Guo; Mingming Li
  • Keywords: Sine neural network (SNN); Double-stage weights and structure determination (DS-WASD); Function approximation; Linear independence
  • Journal: Soft Computing - A Fusion of Foundations, Methodologies and Applications
  • Publication date: January 2016
  • Volume: 20
  • Issue: 1
  • Pages: 211-221
  • Full text size: 1,410 KB
  • Author affiliations: Yunong Zhang (1) (2) (3)
    Lu Qu (1)
    Jinrong Liu (1)
    Dongsheng Guo (1) (2) (3)
    Mingming Li (1)

    1. School of Information Science and Technology, Sun Yat-sen University, Guangzhou, 510006, China
    2. SYSU-CMU Shunde International Joint Research Institute, Shunde, 528300, China
    3. Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education, Guangzhou, 510640, China
  • Journal category: Engineering
  • Journal subjects: Numerical and Computational Methods in Engineering
    Theory of Computation
    Computing Methodologies
    Mathematical Logic and Foundations
    Control Engineering
  • Publisher: Springer Berlin / Heidelberg
  • ISSN:1433-7479
Abstract
To solve complex problems such as multi-input function approximation with neural networks, and to overcome the inherent defects of traditional back-propagation neural networks, a single hidden-layer feed-forward sine-activated neural network, the sine neural network (SNN), is proposed and investigated in this paper. A double-stage weights and structure determination (DS-WASD) method, based on the weights-direct-determination method and the approximation theory of linearly independent functions, is then developed to train the proposed SNN. The DS-WASD method can efficiently and automatically obtain a near-optimal SNN structure. Numerical results illustrate the validity and efficacy of the SNN model and the DS-WASD method; that is, the proposed SNN equipped with the DS-WASD method approximates multi-input function data well.

Keywords: Sine neural network (SNN); Double-stage weights and structure determination (DS-WASD); Function approximation; Linear independence
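To give a sense of the two ideas named in the abstract, the following is a minimal sketch in NumPy: output weights of a sine-activated hidden layer are obtained directly by least squares (no back-propagation iterations), and the hidden-layer size is chosen by a simple growth loop. The sine basis sin(kx), the single-input setting, the error criterion, and all function names here are illustrative assumptions, not the paper's exact DS-WASD formulation.

```python
import numpy as np

def snn_hidden(x, n_hidden):
    # Hidden-layer outputs: sine activations sin(k*x), k = 1..n_hidden.
    # (Assumed linearly independent basis; the paper's construction may differ.)
    return np.sin(np.outer(x, np.arange(1, n_hidden + 1)))

def snn_fit(x, y, n_hidden):
    # Weights direct determination: one least-squares solve via the
    # pseudoinverse, instead of iterative back-propagation training.
    H = snn_hidden(x, n_hidden)
    return np.linalg.pinv(H) @ y

def snn_predict(x, w):
    return snn_hidden(x, len(w)) @ w

def snn_wasd(x, y, max_hidden=40):
    # Structure determination sketch: grow the hidden layer and keep the
    # size with the smallest mean-squared training error (the paper's
    # double-stage criterion is more elaborate than this simplification).
    best_err, best_w = np.inf, None
    for n in range(1, max_hidden + 1):
        w = snn_fit(x, y, n)
        e = np.mean((snn_predict(x, w) - y) ** 2)
        if e < best_err:
            best_err, best_w = e, w
    return best_w

# Approximate a smooth single-input target on (0, pi)
x = np.linspace(0.1, 3.0, 200)
y = np.exp(-x) * np.cos(2 * x)
w = snn_fit(x, y, 20)
err = np.max(np.abs(snn_predict(x, w) - y))
```

With a rich enough sine basis the least-squares fit tracks the smooth target closely; the appeal of the direct-determination step is that the output weights come from a single linear solve.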
