Journal of Beijing University of Posts and Telecommunications

  • EI Core Journal

Journal of Beijing University of Posts and Telecommunications ›› 2018, Vol. 41 ›› Issue (3): 7-13. doi: 10.13190/j.jbupt.2017-219


Review Summarization Generation Based on Attention Mechanism

SU Fang1,2, WANG Xiao-yu1, ZHANG Zhi1,2   

  1. School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China;
    2. Institute of Sensing Technology and Business, Beijing University of Posts and Telecommunications, Wuxi 214135, Jiangsu, China
  • Received: 2017-11-02  Online: 2018-06-28  Published: 2018-06-04
  • About the author: SU Fang (1973-), male, associate professor, master's supervisor, E-mail: sufang@bupt.edu.cn.
  • Supported by: National Science and Technology Major Project of China (2017ZX03001022)

Abstract: To implement generative review summarization, the sequence-to-sequence neural network model was analyzed and an improved attention mechanism based on it was proposed for review summarization. An analysis of review-summary samples shows that the words appearing in a summary are concentrated near the beginning of the source review; accordingly, the local attention model is modified to place higher attention weights on the start of the source text, and every word of the summary is generated by the end-to-end model. Experiments show that this approach achieves higher accuracy on English reviews of the same category whose full length is less than 200.
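To make the described mechanism concrete, the following is a minimal sketch, assuming a Luong-style dot-product local attention whose weights are re-scaled by a Gaussian window centered at the beginning of the source sequence; the function and parameter names (start_biased_local_attention, p, D) are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def start_biased_local_attention(decoder_state, encoder_states, p=0.0, D=5.0):
        # Dot-product alignment scores between the decoder state and every
        # encoder state, turned into a distribution over source positions.
        scores = encoder_states @ decoder_state            # shape (S,)
        weights = softmax(scores)
        # Gaussian window centered at position p (p = 0 is the sentence start),
        # so earlier source positions keep larger attention weights.
        positions = np.arange(len(encoder_states), dtype=float)
        gauss = np.exp(-((positions - p) ** 2) / (2.0 * (D / 2.0) ** 2))
        weights = weights * gauss
        weights = weights / weights.sum()                  # renormalize
        context = weights @ encoder_states                 # shape (d,)
        return context, weights

    # Toy usage: 10 source positions, 8-dimensional hidden states.
    rng = np.random.default_rng(0)
    enc = rng.normal(size=(10, 8))
    dec = rng.normal(size=(8,))
    ctx, att = start_biased_local_attention(dec, enc)
    print(att.round(3))  # attention mass decays toward the end of the source

In the full sequence-to-sequence model, such a context vector would be combined with the decoder state to predict each summary word end to end.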

Key words: review summarization, attention mechanism, sequence to sequence, recurrent neural network

CLC Number: