Journal of Beijing University of Posts and Telecommunications

  • EI Core Journal

Journal of Beijing University of Posts and Telecommunications ›› 2018, Vol. 41 ›› Issue (3): 7-13. doi: 10.13190/j.jbupt.2017-219


Review Summarization Generation Based on Attention Mechanism

SU Fang1,2, WANG Xiao-yu1, ZHANG Zhi1,2   

  1. School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China;
  2. Institute of Sensing Technology and Business, Beijing University of Posts and Telecommunications, Wuxi, Jiangsu 214135, China
  • Received: 2017-11-02 Online: 2018-06-28 Published: 2018-06-04

Abstract: To implement generative review summarization, the sequence-to-sequence neural network model was studied, and an improved attention mechanism for review summarization was proposed on top of it. Motivated by the characteristics of review-summarization samples, the local attention mechanism is modified to place greater attention weight on the beginning of the source sentence. Each word of the summary is then generated by the end-to-end model. Experiments show that this approach achieves superior performance on same-category English review summarization when the review length is under 200 words.
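The abstract does not give the exact form of the improved local attention; the sketch below is one plausible illustration of the core idea, assuming a Luong-style local attention where raw alignment scores are reweighted by a Gaussian window whose center (`peak`) is fixed at the start of the source sentence. The function name, the Gaussian form, and the `sigma` parameter are all illustrative assumptions, not the paper's actual formulation.

```python
import math

def start_biased_attention(scores, peak=0, sigma=2.0):
    """Reweight raw alignment scores by a Gaussian centered at `peak`
    (here the sentence start), then renormalize, so that early source
    positions receive larger attention weights.

    scores : list of raw (non-negative) alignment scores, one per
             source position.
    """
    weighted = [s * math.exp(-((i - peak) ** 2) / (2 * sigma ** 2))
                for i, s in enumerate(scores)]
    total = sum(weighted)
    return [w / total for w in weighted]

# With uniform raw scores, the positional bias alone makes the
# sentence start dominate the attention distribution.
weights = start_biased_attention([1.0] * 6)
```

With uniform raw scores, `weights` decreases monotonically from position 0, reflecting the claim that summary-relevant content in reviews tends to appear near the start of the source sentence.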

Key words: review summarization, attention mechanism, sequence to sequence, recurrent neural network
