Sentiment Analysis of Micro-blog Based on CNN and Tree-LSTM
  • English title: Sentiment analysis of micro-blog based on CNN and Tree-LSTM
  • Authors: 王文凯 (Wang Wenkai); 王黎明 (Wang Liming); 柴玉梅 (Chai Yumei)
  • Affiliation: School of Information Engineering, Zhengzhou University (郑州大学信息工程学院)
  • Keywords: CNN; attention mechanism; LSTM; micro-blog sentiment analysis
  • Journal: 计算机应用研究 (Application Research of Computers); abbreviation: JSYJ
  • Online publication date: 2018-03-14
  • Year: 2019; Volume: v.36; Issue: No.331 (05)
  • Funding: research project on visual sentiment computing methods for social media text (U1636111)
  • Language: Chinese
  • Pages: 97-101 (5 pages)
  • CN: 51-1196/TP
  • Article ID: JSYJ201905021
Abstract
Micro-blog sentiment analysis studies users' emotional opinions on trending events, and prior research shows that deep learning is feasible for this task. Traditional convolutional neural networks (CNNs) ignore the correlation between non-contiguous words when applied to micro-blog sentiment analysis, so this paper applies an attention mechanism at the input of the CNN to mitigate that problem. Because Chinese micro-blog posts are short texts, the pooling layer's feature selection during the CNN's forward propagation may discard too many semantic features; the paper therefore fuses a tree-structured long short-term memory network (Tree-LSTM) at the CNN's output, adding sentence-structure features to strengthen deep semantic learning. Combining the two improvements yields a micro-blog sentiment analysis model, Att-CTL. Experiments show that the model performs well on micro-blog sentiment analysis and, in particular, maintains a high F1 score under polarity shifting.
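The three-stage pipeline the abstract describes — attention at the CNN input, convolution with max-over-time pooling, and a Tree-LSTM composing features at the output — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the layer sizes, the dot-product attention form, the query vector `q`, and the single-cell tree handling are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_filters, hidden = 4, 3, 3          # embedding / filter / hidden sizes (assumed)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_input(E, q):
    """(1) Re-weight each word embedding by its similarity to a query vector q."""
    a = softmax(E @ q)                    # one attention weight per word
    return E * a[:, None]

def conv_maxpool(X, W, width=2):
    """(2) 1-D convolution over word windows, then max-over-time pooling."""
    n = X.shape[0]
    windows = np.stack([X[i:i + width].ravel() for i in range(n - width + 1)])
    return np.tanh(windows @ W).max(axis=0)   # one feature per filter

def tree_lstm_cell(x, child_h, child_c, P):
    """(3) Child-sum Tree-LSTM cell in the style of Tai et al. (2015)."""
    z = np.concatenate([x, child_h.sum(axis=0)])
    i, o = sigmoid(P['Wi'] @ z), sigmoid(P['Wo'] @ z)
    u = np.tanh(P['Wu'] @ z)
    # one forget gate per child, so each subtree's memory is gated separately
    f = np.stack([sigmoid(P['Wf'] @ np.concatenate([x, hk])) for hk in child_h])
    c = i * u + (f * child_c).sum(axis=0)
    h = o * np.tanh(c)
    return h, c

# toy forward pass: a 5-word "sentence", one tree node with two empty children
E = rng.standard_normal((5, d))
q = rng.standard_normal(d)
W = rng.standard_normal((2 * d, n_filters))
P = {k: rng.standard_normal((hidden, n_filters + hidden))
     for k in ('Wi', 'Wo', 'Wu', 'Wf')}

feat = conv_maxpool(attention_input(E, q), W)
zeros = np.zeros((2, hidden))
h, c = tree_lstm_cell(feat, zeros, zeros, P)
print(feat.shape, h.shape)   # → (3,) (3,)
```

In the paper's full model the Tree-LSTM would be run bottom-up over a parse tree of the sentence, with the pooled CNN features feeding the nodes; the single-cell step above only illustrates the gating arithmetic.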
