Sentiment orientation analysis of review text based on sentiment word embedding and BLSTM
  • English title: Sentiment orientation analysis of review text based on sentiment word embedding and BLSTM
  • Authors: Deng Nan; Yu Bengong
  • Affiliations: School of Management, Hefei University of Technology; Key Laboratory of Process Optimization & Intelligent Decision-making of Ministry of Education, Hefei University of Technology
  • Keywords: long short-term memory model; sentiment orientation analysis; natural language processing; word embedding
  • English keywords: LSTM; sentiment orientation analysis; NLP; word embedding
  • Journal abbreviation: JSYJ
  • Journal: 计算机应用研究 (Application Research of Computers)
  • Online publication date: 2017-12-12 18:34
  • Year: 2018
  • Issue: v.35; No.326
  • Funding: National Natural Science Foundation of China (71671057)
  • Language: Chinese
  • Record ID: JSYJ201812006
  • Pages: 33-36
  • Page count: 4
  • CN: 51-1196/TP
Abstract
Traditional machine learning methods are mostly shallow learning algorithms and cannot adequately extract high-level sentiment information from text. To address this problem, this paper proposes an improved bidirectional long short-term memory (BLSTM) model whose input is a sentiment word embedding that combines semantic and sentiment information. The model builds a dual input matrix of semantic and sentiment features and adds a sentiment feature extraction module to the hidden layer to strengthen its ability to represent sentiment. Experimental results on the dataset show that, compared with the standard BLSTM model and traditional machine learning models, the proposed model effectively improves sentiment orientation analysis of text.
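The paper gives no implementation details, but the dual input it describes (semantic word vectors paired with sentiment information) can be illustrated with a minimal, purely hypothetical sketch. The toy embeddings and toy sentiment lexicon below are assumptions standing in for trained word2vec vectors and a real sentiment lexicon; each row of the resulting matrix concatenates a word's semantic vector with a sentiment polarity feature.

```python
# Minimal sketch of a "sentiment word embedding" input matrix.
# All vectors and lexicon entries below are toy values (assumptions),
# standing in for word2vec embeddings and a real sentiment lexicon.

EMB_DIM = 4

# Toy semantic embeddings (in practice: trained word2vec vectors).
semantic = {
    "food":  [0.1, 0.3, -0.2, 0.5],
    "was":   [0.0, 0.1, 0.0, 0.1],
    "great": [0.4, -0.1, 0.2, 0.3],
}

# Toy sentiment lexicon: a polarity score per word.
lexicon = {"great": 1.0, "terrible": -1.0}

def sentiment_word_matrix(tokens):
    """Build the dual input matrix: one row per token, each row the
    concatenation of the semantic vector and a sentiment feature."""
    matrix = []
    for tok in tokens:
        sem = semantic.get(tok, [0.0] * EMB_DIM)   # OOV word -> zero vector
        pol = lexicon.get(tok, 0.0)                # neutral if not in lexicon
        matrix.append(sem + [pol])
    return matrix  # shape: (len(tokens), EMB_DIM + 1)

X = sentiment_word_matrix(["food", "was", "great"])
print(len(X), len(X[0]))  # 3 5
```

In the paper's model a matrix like this would be fed to the BLSTM; the sketch only illustrates how a semantic channel and a sentiment channel can be combined per word.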
References
[1] Graves A,Jaitly N. Towards end-to-end speech recognition with recurrent neural networks[C]//Proc of International Conference on Machine Learning. 2014:1764-1772.
[2] Liu Shujie,Yang Nan,Li Mu,et al. A recursive recurrent neural network for statistical machine translation[C]//Proc of Meeting of the Association for Computational Linguistics. 2014:1491-1500.
[3] Sutskever I,Martens J,Hinton G E. Generating text with recurrent neural networks[C]//Proc of International Conference on Machine Learning. 2011:1017-1024.
[4] Socher R,Lin C Y,Ng A Y,et al. Parsing natural scenes and natural language with recursive neural networks[C]//Proc of International Conference on Machine Learning. 2011:129-136.
[5] Li Peng,Liu Yang,Sun Maosong. Recursive autoencoders for ITG-based translation[C]//Proc of EMNLP. 2013:567-577.
[6] Bengio Y,Simard P,Frasconi P. Learning long-term dependencies with gradient descent is difficult[J]. IEEE Trans on Neural Networks,1994,5(2):157-166.
[7] Hochreiter S,Schmidhuber J. Long short-term memory[J]. Neural Computation,1997,9(8):1735-1780.
[8] Schuster M,Paliwal K K. Bidirectional recurrent neural networks[J]. IEEE Trans on Signal Processing,1997,45(11):2673-2681.
[9] Graves A,Schmidhuber J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures[J]. Neural Networks,2005,18(5-6):602-610.
[10] Ma Xuezhe,Hovy E. End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF[C]//Proc of Meeting of the Association for Computational Linguistics. 2016:1064-1074.
[11] Shang Junbei. Online handwritten digit and formula character recognition based on bidirectional long short-term memory recurrent neural networks[D]. Guangzhou:South China University of Technology,2015. (in Chinese)
[12] Bengio Y,Vincent P,Janvin C. A neural probabilistic language model[J]. Journal of Machine Learning Research,2003,3(6):1137-1155.
[13] Mikolov T,Sutskever I,Chen Kai,et al. Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems. 2013:3111-3119.
[14] Tang Ming,Zhu Lei,Zou Xianchun. A document vector representation based on Word2Vec[J]. Computer Science,2016,43(6):214-217. (in Chinese)
[15] Cai Huiping,Wang Lidan,Duan Shukai. Sentiment classification model based on word embedding and CNN[J]. Application Research of Computers,2016,33(10):2902-2905,2909. (in Chinese)
[16] Tang Duyu,Wei Furu,Yang Nan,et al. Learning sentiment-specific word embedding for Twitter sentiment classification[C]//Proc of Meeting of the Association for Computational Linguistics. 2014:1555-1565.
[17] Zhang Zhihua,Lan Man. Learning sentiment-inherent word embedding for word-level and sentence-level sentiment analysis[C]//Proc of International Conference on Asian Language Processing. Piscataway,NJ:IEEE Press,2016.
[18] Zaremba W,Sutskever I,Vinyals O. Recurrent neural network regularization[EB/OL].(2014-09-08)[2015-02-19]. https://arxiv.org/abs/1409.2329.