Text Sentiment Classification Based on Deep Convolutional Neural Network Model
  • Chinese Title: 基于深度卷积神经网络模型的文本情感分类
  • Authors: ZHOU Jinfeng (周锦峰); YE Shiren (叶施仁); WANG Hui (王晖)
  • Affiliation: School of Information Science and Engineering, Changzhou University
  • Keywords: sentiment analysis; sentiment classification annotation; deep learning; Convolutional Neural Network (CNN); word vector
  • Journal: Computer Engineering (计算机工程); journal code: JSJC
  • Online Publication Date: 2018-03-01 16:16
  • Year: 2019
  • Volume/Issue: Vol. 45, No. 498, Issue 03
  • Pages: 306-314 (9 pages)
  • Funding: National Natural Science Foundation of China (61272367); Jiangsu Provincial Department of Science and Technology Project (BY2015027-12)
  • Language: Chinese
  • CN: 31-1289/TP
  • Record ID: JSJC201903050
Abstract
This paper proposes a deep Convolutional Neural Network (CNN) model to efficiently extract the local semantic features of text under the windows of different convolutional layers. By stacking multiple convolutional layers, the model avoids manually specifying several window sizes while still retaining the local semantic features of different windows. A classification module is built on a Global Max Pooling (GMP) layer to compute a sentiment category score from the local semantic features of each window, and these category scores are combined to complete the sentiment classification annotation. Experimental results show that the model classifies text sentiment faster than existing CNN models.
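The architecture described in the abstract lends itself to a short illustration. Below is a minimal PyTorch sketch of the idea, under assumed standard choices: stacked one-dimensional convolutions whose receptive fields cover increasingly wide text windows, a global max pooling step after each convolutional layer, a separate classifier producing category scores for each window, and the scores summed for the final sentiment label. The class name StackedCNNSentiment, all layer sizes, and all hyperparameters (embed_dim, num_filters, kernel_size, num_layers) are illustrative assumptions, not the authors' exact configuration.

    # Illustrative sketch only; not the authors' published implementation.
    import torch
    import torch.nn as nn

    class StackedCNNSentiment(nn.Module):
        """Stacked convolutions with one global-max-pooling classifier per layer."""
        def __init__(self, vocab_size, embed_dim=300, num_filters=100,
                     num_layers=3, kernel_size=3, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            convs, in_channels = [], embed_dim
            for _ in range(num_layers):
                convs.append(nn.Conv1d(in_channels, num_filters, kernel_size,
                                       padding=kernel_size // 2))
                in_channels = num_filters
            self.convs = nn.ModuleList(convs)
            # One linear classifier per convolutional layer, i.e. per window size.
            self.classifiers = nn.ModuleList(
                [nn.Linear(num_filters, num_classes) for _ in range(num_layers)])

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) integer word indices
            x = self.embedding(token_ids).transpose(1, 2)   # (batch, embed_dim, seq_len)
            scores = 0
            for conv, classifier in zip(self.convs, self.classifiers):
                x = torch.relu(conv(x))                     # wider effective window at each layer
                pooled = x.max(dim=2).values                # global max pooling over positions
                scores = scores + classifier(pooled)        # category scores for this window
            return scores                                   # combined category scores

    # Usage: random word indices for a batch of 4 sentences of length 50.
    model = StackedCNNSentiment(vocab_size=10000)
    logits = model(torch.randint(0, 10000, (4, 50)))
    print(logits.shape)  # torch.Size([4, 2])

Summing the per-window scores, rather than concatenating pooled features, is one way to read the abstract's "combined category scores"; the paper itself should be consulted for the exact combination rule.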
