Deeper attention-based LSTM for aspect sentiment analysis
  • Chinese title: 基于深层注意力的LSTM的特定主题情感分析
  • Authors: Hu Chaoju; Liang Ning
  • Affiliation: School of Control & Computer Engineering, North China Electric Power University
  • Keywords: aspect sentiment analysis; deeper attention; LSTM; deep learning; natural language processing
  • Journal: Application Research of Computers (计算机应用研究)
  • Journal code: JSYJ
  • Online publication date: 2018-03-14
  • Year: 2019
  • Issue: v.36; No.330
  • Language: Chinese
  • Record ID: JSYJ201904027
  • Pages: 121-125 (5 pages)
  • CN: 51-1196/TP
Abstract
In aspect-level sentiment analysis, conventional attention-based deep learning models pay insufficient attention to aspect features and sentiment information. To address this problem, this paper proposes a deeper-attention LSTM model with aspect embedding (AE-DATT-LSTM). A weight-sharing bidirectional LSTM trains the aspect embeddings and the text embeddings jointly to obtain aspect features and text features, which are then fused; after processing by a deeper attention mechanism, a classifier produces the sentiment classification result for the given aspect. Experimental results on the SemEval-2014 task 4 and SemEval-2017 task 4 datasets show that, on aspect-level sentiment analysis, the method further improves accuracy and stability over previous attention-based sentiment analysis models. The introduction of aspect features and a deeper attention mechanism is significant for aspect-level sentiment classification, and provides methodological support for fields such as public-opinion analysis, question answering, and text reasoning.
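The pipeline the abstract describes (shared-weight encoding of aspect and text, feature fusion, stacked "deeper" attention, softmax classifier) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the plain tanh RNN standing in for the bidirectional LSTM, all layer sizes, mean-pooling the aspect states, and fusion by concatenation are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_emb, d_hid, n_classes, seq_len = 8, 16, 3, 5

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Shared encoder weights: the SAME W/U encode both text and aspect tokens,
# mirroring the paper's weight-sharing between the two encoders.
W = rng.normal(scale=0.1, size=(d_emb, d_hid))
U = rng.normal(scale=0.1, size=(d_hid, d_hid))

def encode(embeddings):
    """Simple tanh RNN standing in for the shared-weight bidirectional LSTM."""
    h = np.zeros(d_hid)
    states = []
    for x in embeddings:
        h = np.tanh(x @ W + h @ U)
        states.append(h)
    return np.stack(states)              # (seq_len, d_hid)

def attention(states, query, Wa):
    """One attention layer: weight each hidden state by its score vs. the query."""
    scores = states @ Wa @ query         # (seq_len,)
    weights = softmax(scores)
    return weights @ states              # context vector, (d_hid,)

text = rng.normal(size=(seq_len, d_emb))   # text word embeddings (hypothetical)
aspect = rng.normal(size=(2, d_emb))       # aspect word embeddings (hypothetical)

text_states = encode(text)
aspect_vec = encode(aspect).mean(axis=0)   # pooled aspect feature

# "Deeper" attention: the first context vector re-queries the text states.
Wa1 = rng.normal(scale=0.1, size=(d_hid, d_hid))
Wa2 = rng.normal(scale=0.1, size=(d_hid, d_hid))
ctx1 = attention(text_states, aspect_vec, Wa1)
ctx2 = attention(text_states, ctx1, Wa2)

# Fuse aspect and text features, then classify into sentiment polarities.
fused = np.concatenate([ctx2, aspect_vec])
Wc = rng.normal(scale=0.1, size=(2 * d_hid, n_classes))
probs = softmax(fused @ Wc)
print(probs)  # one probability per sentiment class
```

In a trained model the weights would be learned end-to-end and the classifier would pick the highest-probability polarity; the sketch only shows how the aspect vector steers attention over the text states twice before classification.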
References
[1]Wang Zhongyuan,Cheng Jianpeng,Wang Haixun,et al.Short text understanding:a survey[J].Journal of Computer Research and Development,2016,53(2):262-269.(in Chinese)
    [2]Liu Bing.Sentiment analysis and opinion mining[J].Synthesis Lectures on Human Language Technologies,2012,5(1):1-167.
    [3]Socher R,Perelygin A,Wu J,et al.Recursive deep models for semantic compositionality over a sentiment treebank[C]//Proc of Conference on Empirical Methods in Natural Language Processing.Stroudsburg,PA:Association for Computational Linguistics,2013.
    [4]Tang Duyu,Qin Bing,Liu Ting.Document modeling with gated recurrent neural network for sentiment classification[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2015:1422-1432.
    [5]Tai Kaisheng,Socher R,Manning C D.Improved semantic representations from tree-structured long short-term memory networks[C]//Proc of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing.2015:1556-1566.
    [6]Mnih V,Heess N,Graves A,Kavukcuoglu K.Recurrent models of visual attention[C]//Proc of the 27th International Conference on Neural Information Processing Systems.Cambridge,MA:MIT Press,2014:2204-2212.
    [7]Bahdanau D,Cho K,Bengio Y.Neural machine translation by jointly learning to align and translate[C]//Proc of International Conference on Learning Representations.2015.
    [8]Rush A M,Chopra S,Weston J.A neural attention model for sentence summarization[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2015:379-389.
    [9]Yang Z C,Yang D Y,Dyer C,et al.Hierarchical attention networks for document classification[C]//Proc of HLT-NAACL.Stroudsburg,PA:Association for Computational Linguistics,2016:1480-1489.
    [10]Pavlopoulos J,Malakasiotis P,Androutsopoulos I.Deeper attention to abusive user content moderation[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2017:1136-1146.
    [11]Pontiki M,Galanis D,Pavlopoulos J,et al.SemEval-2014 task 4:aspect based sentiment analysis[C]//Proc of the 8th International Workshop on Semantic Evaluation.2014:27-35.
    [12]Rosenthal S,Farra N,Nakov P.SemEval-2017 task 4:sentiment analysis in Twitter[C]//Proc of the 11th International Workshop on Semantic Evaluation.2017.
    [13]Kiritchenko S,Zhu Xiaodan,Cherry C,et al.NRC-Canada-2014:detecting aspects and sentiment in customer reviews[C]//Proc of the 8th International Workshop on Semantic Evaluation.2014:437-442.
    [14]Nguyen T H,Shirai K.PhraseRNN:phrase recursive neural network for aspect-based sentiment analysis[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2015:2509-2514.
    [15]Tang Duyu,Qin Bing,Feng Xiaocheng,et al.Effective LSTMs for target-dependent sentiment classification[C]//Proc of the 26th International Conference on Computational Linguistics.2016:3298-3307.
    [16]Wang Yequan,Huang Minlie,Zhu Xiaoyan,et al.Attention-based LSTM for aspect-level sentiment classification[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2016:606-615.
    [17]Chen Peng,Sun Zhongqian,Bing Lidong,et al.Recurrent attention network on memory for aspect sentiment analysis[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2017:463-472.
    [18]Tang Duyu,Qin Bing,Liu Ting.Aspect level sentiment classification with deep memory network[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2016:214-224.
    [19]Liang Bin,Liu Quan,Xu Jin,et al.Aspect-based sentiment analysis based on multi-attention CNN[J].Journal of Computer Research and Development,2017,54(8):1724-1735.(in Chinese)
    [20]Baziotis C,Pelekis N,Doulkeridis C.DataStories at SemEval-2017 task 4:deep LSTM with attention for message-level and topic-based sentiment analysis[C]//Proc of the 11th International Workshop on Semantic Evaluation.2017:747-754.
    [21]Hochreiter S,Schmidhuber J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
    [22]Pennington J,Socher R,Manning C D.GloVe:global vectors for word representation[C]//Proc of Conference on Empirical Methods in Natural Language Processing.2014:1532-1543.
