Journal of Beijing University of Posts and Telecommunications

  • EI Core Journal

Journal of Beijing University of Posts and Telecommunications ›› 2024, Vol. 47 ›› Issue (1): 127-132.


Aspect-Level Sentiment Analysis Based on Self-Attention and Graph Convolutional Network

CHEN Kejia¹, HUANG Chunxiang¹, LIN Hongxi²

  1. Fuzhou University
  2. Putian University

  • Received: 2023-01-06  Revised: 2023-05-22  Online: 2024-02-26  Published: 2024-02-26
  • Corresponding author: LIN Hongxi  E-mail: ptulhx@163.com
  • Funding: National Natural Science Foundation of China

Abstract: Most aspect-level sentiment analysis studies lack interaction information between aspect words and their context and fail to make full use of semantic information. To address these problems, a model that combines self-attention with a graph convolutional network is proposed. To improve the semantic representation ability of the model, a multi-head self-attention mechanism is first used to capture long-distance dependencies in the text; its output is combined with a dependency-type matrix to compute a weight matrix that fuses position information and relation-type information, which is fed into a graph convolutional network to obtain the text feature representation. In addition, a text-aspect attention layer is designed to strengthen the interaction between aspects and their context, and its output is fed into a graph convolutional network to obtain the aspect feature representation. Finally, the two vectors are concatenated to complete the sentiment analysis task. Experimental results on two open datasets show that the overall performance of the proposed model is better than that of the comparison models.
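
The pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' released implementation: the module names (SelfAttnGCN, GCNLayer), the dimensions, the three-class polarity output, and the exact way attention weights are fused with the dependency-type scores are all assumptions made for clarity.

# Minimal PyTorch sketch of the model described in the abstract.
# Module names, dimensions, and the fusion of attention weights with the
# dependency-type matrix are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_norm @ H @ W)."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # Row-normalize the weighted adjacency so each node averages its neighbors.
        adj = adj / adj.sum(dim=-1, keepdim=True).clamp(min=1e-9)
        return F.relu(self.linear(torch.bmm(adj, h)))


class SelfAttnGCN(nn.Module):
    def __init__(self, dim=300, heads=6, num_rel_types=40):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Embedding for dependency relation types (0 = no arc); provides the
        # relation-type part of the weighted adjacency matrix.
        self.rel_embed = nn.Embedding(num_rel_types, 1)
        self.text_gcn = GCNLayer(dim)
        self.aspect_gcn = GCNLayer(dim)
        self.classifier = nn.Linear(2 * dim, 3)  # negative / neutral / positive

    def forward(self, text, aspect, rel_type_ids):
        # text: (B, n, dim) context token vectors, aspect: (B, m, dim) aspect tokens,
        # rel_type_ids: (B, n, n) dependency relation type between token pairs.
        # 1) Multi-head self-attention captures long-distance dependencies;
        #    the returned weights are averaged over heads.
        _, attn_weights = self.self_attn(text, text, text)
        # 2) Fuse attention weights (position information) with relation-type
        #    scores to form the weighted adjacency matrix of the graph.
        rel_scores = self.rel_embed(rel_type_ids).squeeze(-1)      # (B, n, n)
        adj = torch.sigmoid(attn_weights + rel_scores)
        # 3) GCN over the fused graph yields the text feature representation.
        text_feat = self.text_gcn(text, adj).mean(dim=1)           # (B, dim)
        # 4) Text-aspect attention: aspect tokens attend over context tokens,
        #    strengthening the aspect-context interaction before the aspect GCN.
        scores = torch.bmm(aspect, text.transpose(1, 2))           # (B, m, n)
        aspect_ctx = torch.bmm(F.softmax(scores, dim=-1), text)    # (B, m, dim)
        aspect_adj = torch.ones(aspect.size(0), aspect.size(1), aspect.size(1),
                                device=aspect.device)
        aspect_feat = self.aspect_gcn(aspect_ctx, aspect_adj).mean(dim=1)
        # 5) Concatenate the two representations and classify sentiment polarity.
        return self.classifier(torch.cat([text_feat, aspect_feat], dim=-1))

Under these assumptions each graph edge carries both positional salience (from self-attention) and syntactic relation-type information, which is the fusion the abstract describes; a forward pass such as model(torch.randn(2, 10, 300), torch.randn(2, 3, 300), torch.randint(0, 40, (2, 10, 10))) returns a (2, 3) tensor of polarity logits.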

Key words: aspect-level sentiment analysis, graph convolution network, self-attention mechanism, semantic perception
