A step-by-step syntactic analysis method based on tree-like probability and bidirectional long short term memory
  • Authors: CHEN Zhiqun; JU Ting; WANG Bing
  • Affiliation: Institute of Cognitive and Intelligent Computing, Hangzhou Dianzi University
  • Keywords: tree-like probability calculation method; bidirectional long short term memory; step-by-step; dependency parsing; syntactic label classification
  • Journal: Journal of Xiamen University (Natural Science) (厦门大学学报(自然科学版))
  • Published online: 2018-12-03
  • Year: 2019
  • Issue: Vol. 58, No. 269 (Issue 02)
  • Pages: 101-106 (6 pages)
  • CN: 35-1070/N
  • Language: Chinese
  • Record ID: XDZK201902016
Abstract
To mitigate data sparseness and exploit the inherent hierarchy of syntactic prediction, a step-by-step dependency parsing model based on a bidirectional long short-term memory (BLSTM) neural network is proposed. The model applies a tree-like probability calculation method to the classification of syntactic labels: exploiting the hierarchical relationship between syntactic structure and labels, it proceeds step by step from syntactic structure to syntactic labels, then uses the parse tree to generate feature representations of the labels, which are fed into the BLSTM network for label classification. Experiments on the Tsinghua University Semantic Dependency Corpus show that, compared with the chained probability calculation method and other dependency parsers, dependency accuracy improves by up to 1 percentage point, indicating that the new method is feasible and effective.
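The core contrast the abstract draws is between a "chained" (left-to-right) probability and a "tree-like" one that follows head-dependent arcs of the parse tree. A minimal illustrative sketch of the tree-like side is below; the function name, the toy sentence, and the arc probabilities are all hypothetical, not taken from the paper:

```python
import math

def tree_log_probability(heads, arc_probs):
    """Log-probability of a dependency tree, scored arc by arc.

    heads[i] is the head index of token i (-1 for the artificial root);
    arc_probs[(h, d)] is a model's probability for the arc h -> d.
    The tree score is the product over arcs, following the tree
    structure rather than the linear order of the sentence.
    """
    logp = 0.0
    for dep, head in enumerate(heads):
        logp += math.log(arc_probs[(head, dep)])
    return logp

# Toy 3-token sentence: token 1 attaches to the root; tokens 0 and 2 to token 1.
heads = [1, -1, 1]
arc_probs = {(1, 0): 0.9, (-1, 1): 0.8, (1, 2): 0.7}
print(round(math.exp(tree_log_probability(heads, arc_probs)), 4))  # 0.504
```

In the step-by-step scheme the abstract describes, such structure scores would be computed first, and the resulting tree would then supply the features for the BLSTM label classifier.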
