Sentiment Analysis Model with the Combination of Self-attention and Tree-LSTM
  • Chinese title: 结合自注意力机制和Tree-LSTM的情感分析模型
  • Authors: SHI Lei; ZHANG Xin-qian; TAO Yong-cai; WEI Lin
  • Affiliations: School of Information Engineering, Zhengzhou University; School of Software, Zhengzhou University
  • Keywords: micro-blog sentiment analysis; self-attention mechanism; Tree-LSTM model; Maxout neuron
  • Journal: Journal of Chinese Computer Systems (小型微型计算机系统; CNKI code XXWX)
  • Publication date: 2019-07-15
  • Year: 2019; Volume: v.40; Issue: 07
  • Pages: 128-132 (5 pages)
  • CN: 21-1106/TP
  • Article ID: XXWX201907025
  • Funding: Key Scientific Research Project of Colleges and Universities of Henan Province (16A520027)
  • Language: Chinese
Abstract
Sentiment analysis has gained increasing attention with the development of artificial intelligence. Micro-blog sentiment analysis aims to study users' emotional tendencies toward trending social events, and research shows that deep learning is feasible for sentiment analysis. Traditional recurrent neural network models suffer from loss of remembered information, neglect of correlations between non-adjacent context words, and gradient vanishing. To address these problems, this paper combines the self-attention mechanism with the Tree-LSTM model and introduces Maxout neurons at the output of the Tree-LSTM model; on the basis of these two improvements, it constructs a Chinese micro-blog sentiment analysis model, SAtt-TLSTM-M. Experiments on the COAE2014 evaluation data set show that the accuracy of the proposed model improves on the traditional SVM, MNB, and LSTM models by 16.18%, 15.34%, and 12.05%, respectively; the RMNN model with Maxout neurons improves on the LSTM model by 4.10%, and the Self-Attention+Tree-LSTM model with the self-attention mechanism improves on the Tree-LSTM model by 1.85%. The proposed model also outperforms the other baselines in recall and F-measure. These results show that the SAtt-TLSTM-M model can improve the accuracy of sentiment analysis and has research value.
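The abstract names three building blocks: scaled dot-product self-attention, the Child-Sum Tree-LSTM cell of Tai et al. (2015), and Maxout output neurons (Goodfellow et al., 2013). A minimal numpy sketch of each standard component follows; all shapes, parameter names, and the gate layout are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d).
    Relates every word to every other word, including non-adjacent ones."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (n, n) pairwise relevance
    return softmax(scores, axis=-1) @ V      # context-weighted values

def child_sum_tree_lstm(x, child_h, child_c, W, U, b):
    """One Child-Sum Tree-LSTM step (Tai et al., 2015).
    x: (d_in,) node input; child_h, child_c: (k, d) states of k children.
    W: dict of (d, d_in), U: dict of (d, d), b: dict of (d,) per gate i,f,o,u."""
    h_tilde = child_h.sum(axis=0)                          # summed child states
    i = sigmoid(W['i'] @ x + U['i'] @ h_tilde + b['i'])    # input gate
    o = sigmoid(W['o'] @ x + U['o'] @ h_tilde + b['o'])    # output gate
    u = np.tanh(W['u'] @ x + U['u'] @ h_tilde + b['u'])    # candidate update
    f = sigmoid(W['f'] @ x + child_h @ U['f'].T + b['f'])  # (k, d): one forget gate per child
    c = i * u + (f * child_c).sum(axis=0)                  # new cell state
    h = o * np.tanh(c)                                     # new hidden state
    return h, c

def maxout(x, W, b):
    """Maxout unit (Goodfellow et al., 2013): element-wise max over k affine pieces.
    x: (d_in,), W: (k, d_out, d_in), b: (k, d_out)."""
    z = np.einsum('kod,d->ko', W, x) + b  # (k, d_out) candidate activations
    return z.max(axis=0)
```

In the paper's pipeline these would compose: self-attention re-weights word representations, the Tree-LSTM aggregates them along the parse tree, and a Maxout layer replaces the usual activation at the output; the wiring shown here is only a sketch of each piece in isolation.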
