Journal of Beijing University of Posts and Telecommunications

  • EI Core Journal

Journal of Beijing University of Posts and Telecommunications ›› 2024, Vol. 47 ›› Issue (1): 127-132.

Aspect-Level Sentiment Analysis Based on Self-Attention and Graph Convolutional Network

CHEN Kejia1, HUANG Chunxiang1, LIN Hongxi2   

  • Received: 2023-01-06  Revised: 2023-05-22  Online: 2024-02-26  Published: 2024-02-26

Abstract: Most aspect-level sentiment analysis studies lack interaction information between aspect words and their context and fail to make full use of semantic information. To address these problems, a model based on self-attention and a graph convolutional network is proposed. To improve the semantic representation ability of the model, a multi-head self-attention mechanism is used to capture long-distance dependencies in the text, combined with a dependency-type matrix. A weight matrix that fuses position information and relation-type information is then computed and fed into the graph convolutional network to obtain the text feature representation. In addition, a text-aspect attention layer is employed to extract context-sensitive aspect features, which are fed into a graph convolutional network to obtain the aspect feature representation. Finally, the two representations are concatenated to complete the sentiment analysis task. Simulation results on two open datasets show that the overall performance of the proposed model is better than that of the comparison models.
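
The pipeline summarized in the abstract (multi-head self-attention over contextual embeddings, a GCN driven by a weight matrix that fuses position and dependency relation-type information, an aspect attention branch, and concatenation of the two feature vectors for classification) can be illustrated with a minimal PyTorch sketch. The code below is an illustrative reconstruction under assumed shapes and hypothetical module names (GraphConv, SelfAttentionGCN, dep_type_embed, pos_weight), not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphConv(nn.Module):
    """Single graph convolution layer: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        # adj: (B, L, L) weighted adjacency built from position and relation-type weights
        return F.relu(adj @ self.linear(h))

class SelfAttentionGCN(nn.Module):
    """Illustrative sketch (not the paper's exact model): a context branch with
    multi-head self-attention plus a dependency-type-aware GCN, an aspect-attention
    branch with its own GCN, and concatenation of both features for classification."""
    def __init__(self, hidden_dim=300, num_heads=6, num_dep_types=40, num_classes=3):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # hypothetical scalar embedding of dependency relation types used to weight edges
        self.dep_type_embed = nn.Embedding(num_dep_types, 1)
        self.context_gcn = GraphConv(hidden_dim, hidden_dim)
        self.aspect_gcn = GraphConv(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, ctx_emb, aspect_emb, dep_type_ids, pos_weight):
        # ctx_emb: (B, L, H) contextual word embeddings; aspect_emb: (B, La, H)
        # dep_type_ids: (B, L, L) dependency relation-type ids between word pairs
        # pos_weight: (B, L, L) position-based weights
        attn_out, _ = self.self_attn(ctx_emb, ctx_emb, ctx_emb)  # long-distance dependencies

        # weight matrix combining position information and relation-type information
        rel_weight = self.dep_type_embed(dep_type_ids).squeeze(-1)   # (B, L, L)
        adj = torch.softmax(pos_weight + rel_weight, dim=-1)

        ctx_feat = self.context_gcn(attn_out, adj).mean(dim=1)       # text feature representation

        # aspect attention: score context words against the averaged aspect vector
        aspect_q = aspect_emb.mean(dim=1, keepdim=True)              # (B, 1, H)
        scores = torch.softmax(aspect_q @ ctx_emb.transpose(1, 2), dim=-1)  # (B, 1, L)
        aspect_ctx = scores @ ctx_emb                                # context-sensitive aspect features
        asp_feat = self.aspect_gcn(
            aspect_ctx.expand(-1, ctx_emb.size(1), -1), adj).mean(dim=1)

        # concatenate the two representations for sentiment classification
        return self.classifier(torch.cat([ctx_feat, asp_feat], dim=-1))

# Example shapes: batch of 2 sentences, 10 tokens, 4 aspect tokens.
model = SelfAttentionGCN()
out = model(torch.randn(2, 10, 300), torch.randn(2, 4, 300),
            torch.randint(0, 40, (2, 10, 10)), torch.randn(2, 10, 10))
print(out.shape)  # torch.Size([2, 3])
```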

Key words: aspect-level sentiment analysis, graph convolution network, self-attention mechanism, semantic perception

CLC Number: