Abstract
For non-factoid question answering, this paper builds a bidirectional long short-term memory (BiLSTM) network with an attention mechanism. Experiments show that on the 2016 NLPCC QA task dataset, the model achieves an MRR of 75.12%, outperforming traditional machine learning methods.
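The MRR (Mean Reciprocal Rank) figure reported above averages, over all questions, the reciprocal of the rank at which the first correct answer appears in the model's scored candidate list. A minimal sketch of that metric, using made-up toy data (not the NLPCC evaluation script):

```python
def mean_reciprocal_rank(ranked_labels):
    """ranked_labels: one list per question, ordered by model score
    (highest first); 1 marks a correct answer, 0 an incorrect one."""
    total = 0.0
    for labels in ranked_labels:
        for rank, label in enumerate(labels, start=1):
            if label == 1:
                total += 1.0 / rank  # reciprocal rank of first hit
                break  # questions with no correct answer contribute 0
    return total / len(ranked_labels)

# Toy example: correct answer ranked 1st for Q1, 2nd for Q2.
print(mean_reciprocal_rank([[1, 0, 0], [0, 1, 0]]))  # (1 + 1/2) / 2 = 0.75
```

Scores themselves never enter the formula; only the rank ordering they induce over each question's candidate answers matters.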