Abstract
Fine-grained classification of question texts is difficult because textual features are sparse, the overall features of different texts are similar, and locally discriminative features are hard to extract. To address these problems, a classification method combining semantic expansion with an attention network is proposed. Semantic units are extracted from the dependency syntax parse tree, and semantically similar regions around each unit are computed in the vector space model and used to expand the text. A Long Short-Term Memory (LSTM) network encodes the words of the expanded text, an attention mechanism generates the vector representation of the question text, and a Softmax classifier assigns the question text to a category. Experimental results show that, compared with traditional text classification methods based on deep learning networks, the proposed method extracts more important classification features and achieves better classification performance.
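The attention-and-Softmax stage of the pipeline described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the LSTM word encoder is replaced by randomly generated stand-in hidden states, and all dimensions, variable names, and parameters (`H`, `w`, `W`, `b`) are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w):
    # H: (T, d) per-word hidden states (stand-ins for LSTM outputs).
    # w: (d,) learned attention query vector (assumed, for illustration).
    scores = H @ w                # one relevance score per word, shape (T,)
    alpha = softmax(scores)       # attention weights sum to 1 over the words
    return alpha @ H, alpha       # (d,) attention-weighted text vector

def classify(v, W, b):
    # Softmax classifier over the attended text representation.
    return softmax(W @ v + b)     # (C,) class probability distribution

rng = np.random.default_rng(0)
T, d, C = 6, 8, 4                 # words, hidden size, number of classes
H = rng.normal(size=(T, d))       # stand-in for LSTM encoder outputs
w = rng.normal(size=d)
W, b = rng.normal(size=(C, d)), np.zeros(C)

v, alpha = attention_pool(H, w)
p = classify(v, W, b)
```

The sketch shows only the forward pass: the attention weights `alpha` form a distribution over the (expanded) words, so more discriminative words contribute more to the text vector `v` before Softmax classification; in the actual method these parameters would be trained jointly with the LSTM.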