Spoken Language Understanding Method Based on Recurrent Neural Network with Persistent Memory
  • Original title (Chinese): 引入外部记忆的循环神经网络的口语理解
  • Authors: XU Yingying; HUANG Hao (School of Information Science and Engineering, Xinjiang University)
  • Keywords: Spoken Language Understanding (SLU); Recurrent Neural Network (RNN); Long Short-Term Memory (LSTM) network; Neural Turing Machine
  • Journal: Computer Engineering and Applications (计算机工程与应用; CNKI journal code JSGG)
  • Affiliation: School of Information Science and Engineering, Xinjiang University
  • Published online: 2018-12-25
  • Year: 2019
  • Volume/Issue: v.55, No.931 (Issue 12)
  • Funding: National Natural Science Foundation of China (No.61365005, No.61663044, No.61761041); Xinjiang University Doctoral Research Startup Fund (No.BS160239)
  • Language: Chinese
  • Pages: 150-153+166 (5 pages)
  • CNKI record ID: JSGG201912021
Abstract
Recurrent Neural Networks (RNNs) have increasingly shown their advantages in the Spoken Language Understanding (SLU) task. However, because of vanishing and exploding gradients, the storage capacity of a simple recurrent network is limited. This paper proposes a recurrent neural network that uses an external memory to improve its memorization ability. Experiments are carried out on the ATIS dataset and the model is compared with other publicly reported models. The results show that, on the SLU task, the proposed RNN with external memory clearly improves precision, recall, and F1-score, outperforming the traditional recurrent neural network and its variant architectures.
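The record carries no implementation details, but a minimal sketch may help make the abstract's idea concrete: an RNN cell that, at each time step, reads from an external memory matrix via content-based attention and writes its updated state back, in the spirit of a Neural Turing Machine. Everything below (the class name MemoryAugmentedRNNCell, the dimensions, the write rule) is an illustrative assumption, not the authors' model.

    # Sketch of an RNN controller coupled to an external memory matrix.
    # Read: attend over memory slots with a key derived from the hidden state.
    # Write: add the attention-weighted transformed state back into memory.
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    class MemoryAugmentedRNNCell:
        def __init__(self, input_dim, hidden_dim, mem_slots, mem_width, seed=0):
            rng = np.random.default_rng(seed)
            s = 0.1
            self.W_xh = rng.normal(0, s, (hidden_dim, input_dim))   # input -> hidden
            self.W_hh = rng.normal(0, s, (hidden_dim, hidden_dim))  # hidden -> hidden
            self.W_rh = rng.normal(0, s, (hidden_dim, mem_width))   # memory read -> hidden
            self.b_h = np.zeros(hidden_dim)
            self.W_k = rng.normal(0, s, (mem_width, hidden_dim))    # hidden -> read key
            self.W_w = rng.normal(0, s, (mem_width, hidden_dim))    # hidden -> write vector
            self.memory = np.zeros((mem_slots, mem_width))          # external memory M

        def step(self, x, h):
            # Content-based read over the memory slots.
            key = self.W_k @ h
            attn = softmax(self.memory @ key)        # (mem_slots,)
            read = attn @ self.memory                # (mem_width,)
            # Recurrent update uses the input, previous state, and memory read.
            h_new = np.tanh(self.W_xh @ x + self.W_hh @ h
                            + self.W_rh @ read + self.b_h)
            # Write back, weighted by the same attention distribution.
            self.memory += np.outer(attn, self.W_w @ h_new)
            return h_new

    # Toy usage: run one utterance (a sequence of word vectors) through the cell.
    cell = MemoryAugmentedRNNCell(input_dim=50, hidden_dim=64, mem_slots=32, mem_width=20)
    h = np.zeros(64)
    for x_t in np.random.default_rng(1).normal(size=(8, 50)):  # 8 fake word embeddings
        h = cell.step(x_t, h)
    print(h.shape)  # (64,) -- each per-token state would feed a slot classifier

In an SLU slot-filling setup, each per-token hidden state would feed a softmax classifier over the ATIS IOB slot labels; the external memory gives the controller a place to keep long-range cues (e.g., an earlier city name) that a plain RNN tends to lose to vanishing gradients.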
