Seq2Seq Short-Text Summarization with Keyword Correction
  • English title: Automatic summary of short text based on Seq2Seq and keywords correction
  • Authors: FANG Xu; GUO Yi; WANG Qi; FAN Zhen
  • Author affiliations (English byline): Department of Computer Science and Engineering, East China University of Science and Technology; School of Information Science and Technology, Shihezi University; Shanghai Data Exchange Corporation
  • Keywords: Chinese short text; automatic abstract; LSTM; attention-based sequence-to-sequence model; keywords correction
  • Journal: Computer Engineering and Design (计算机工程与设计); CNKI journal code: SJSJ
  • Institutions: School of Information Science and Engineering, East China University of Science and Technology; School of Information Science and Technology, Shihezi University; Shanghai Data Exchange Corporation
  • Publication date: 2018-12-16
  • Year: 2018
  • Volume/Issue: Vol. 39, No. 384 (2018, Issue 12)
  • Funding: National Natural Science Foundation of China (61462073); Science and Technology Commission of Shanghai Municipality research program (STCSM: 17DZ1101003, 16511101000)
  • Language: Chinese
  • Pages: 18-23 (6 pages)
  • Record ID: SJSJ201812003
  • CN: 11-1775/TP
Abstract
Automatic summarization techniques help users obtain the information they need more quickly from the massive volume of short texts on the Internet. A deep learning method combined with keyword correction is proposed to generate summaries of Chinese short texts automatically. Building on the long short-term memory (LSTM) network, an attention-based sequence-to-sequence (Seq2Seq) model is constructed and trained with joint character and word features as input. The keywords of the original text are then used to correct the generated summary and produce the final result. Experimental results on the LCSTS dataset verify the effectiveness of the proposed method.
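The abstract describes the model only at a high level. The following is a minimal sketch, in PyTorch, of an attention-based Seq2Seq summarizer with an LSTM encoder and decoder; it is not the authors' implementation. The dot-product attention variant, all hyperparameters, the toy vocabulary size, and the random input ids are illustrative assumptions, and the joint character/word features and the keyword-correction step are not shown.

# A minimal sketch (not the authors' code) of an attention-based Seq2Seq
# model with an LSTM encoder/decoder, as named in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttnSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.enc_proj = nn.Linear(2 * hid_dim, hid_dim)  # merge the two encoder directions
        self.out = nn.Linear(2 * hid_dim, vocab_size)    # [decoder state; context] -> vocab logits

    def forward(self, src_ids, tgt_ids):
        # Encode the source token ids with a bidirectional LSTM.
        enc_out, (h, c) = self.encoder(self.embed(src_ids))
        enc_out = self.enc_proj(enc_out)                            # (B, S, H)
        # Initialize the decoder from the summed directional states.
        h0 = (h[0] + h[1]).unsqueeze(0)
        c0 = (c[0] + c[1]).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tgt_ids), (h0, c0))   # (B, T, H)
        # Dot-product attention: each decoder step attends over encoder states.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))       # (B, T, S)
        context = torch.bmm(F.softmax(scores, dim=-1), enc_out)    # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1))     # (B, T, V)


# Toy usage: batch of 2 sources (length 7) and target prefixes (length 5).
model = AttnSeq2Seq(vocab_size=5000)
src = torch.randint(1, 5000, (2, 7))
tgt = torch.randint(1, 5000, (2, 5))
logits = model(src, tgt)
# Toy loss just to show the shapes; a real setup would shift the target
# by one position (teacher forcing) before computing cross-entropy.
loss = F.cross_entropy(logits.reshape(-1, 5000), tgt.reshape(-1))
print(logits.shape, loss.item())

Bahdanau-style additive attention [8] could be substituted for the dot-product scoring above; the sketch uses the simpler variant only to keep the example short.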
