On the approximation capability of recurrent neural networks
Abstract
The capability of recurrent neural networks to approximate functions from lists of real vectors to a real vector space is examined. Any measurable function can be approximated in probability, and in interesting cases bounds on the resources sufficient for such an approximation can be derived. In contrast, there exist computable mappings on symbolic data that cannot be approximated in the maximum norm. For restricted input length, some continuous functions on real-valued sequences require a number of neurons that grows at least linearly with the input length. On unary sequences, any mapping with bounded range can be approximated in the maximum norm. Consequently, standard sigmoidal networks, viewed as a computational model on offline inputs, can compute any mapping.
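For orientation, the following is a minimal sketch (not taken from the paper) of the kind of architecture under study: a sigmoidal recurrent network that reads a list of real vectors of arbitrary length and returns a single real vector. All names, weight shapes, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 3, 8, 2                   # illustrative dimensions

W_in = rng.standard_normal((d_hidden, d_in))      # input-to-state weights
W_rec = rng.standard_normal((d_hidden, d_hidden)) # state-to-state weights
b = np.zeros(d_hidden)                            # state bias
W_out = rng.standard_normal((d_out, d_hidden))    # state-to-output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def recurrent_map(sequence):
    """Map a list of real vectors (any length) to one real vector."""
    h = np.zeros(d_hidden)                        # initial state
    for x in sequence:                            # one sigmoidal update per item
        h = sigmoid(W_rec @ h + W_in @ x + b)
    return W_out @ h                              # linear read-out of final state

# Example: a list of five 3-dimensional inputs mapped to a 2-dimensional output.
seq = [rng.standard_normal(d_in) for _ in range(5)]
print(recurrent_map(seq))
```

The approximation results summarized above concern how well networks of this general form can realize arbitrary measurable or continuous functions of such sequences, and what resources (number of neurons) this requires.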