Journal of Beijing University of Posts and Telecommunications ›› 2023, Vol. 46 ›› Issue (4): 123-128.

• Research Report •

Aspect Category Classification Integrating Syntactic Dependency and BERT-Att-BiLSTM

BAO Qianhui, WEN Juan, SHI Shuzhen, DONG Mengping, LIU Xue

  1. College of Information and Electrical Engineering, China Agricultural University
  • Received: 2022-06-22   Revised: 2022-08-18   Online: 2023-08-28   Published: 2023-08-24
  • Corresponding author: LIU Xue   E-mail: liusnow@cau.edu.cn
  • Supported by:
    Construction Project of the Beijing Innovation Team of the Modern Agricultural Industry Technology System (2021)


Abstract: Aspect category classification in fine-grained sentiment analysis suffers from low accuracy. To address this problem, an aspect category classification model is proposed that integrates syntactic dependency with a bidirectional encoder representations from Transformers-attention mechanism-bidirectional long short-term memory network (BERT-Att-BiLSTM). First, a target information extraction layer built on syntactic dependency relations extracts aspect-opinion pairs from review texts. Next, in the word embedding layer, a BERT module produces pre-trained word vectors that incorporate the dynamic features of context. Then, in the feature extraction layer, a BiLSTM module combined with an attention mechanism reduces the dimensionality of the feature space. Finally, the classification layer outputs the aspect category of each aspect-opinion pair through an activation function. Experimental results show that the precision, recall, and F1 score of the proposed model reach 85.25%, 72.38%, and 77.06%, respectively, surpassing the other models compared and demonstrating its effectiveness.
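To make the pipeline concrete, below is a minimal Python sketch of the stages the abstract describes. It is an illustration under stated assumptions, not the authors' released code: the paper names no parser, library, hyperparameters, or category set. The first part uses spaCy with its zh_core_web_sm pipeline (an assumed stand-in for the dependency parser) to pull candidate aspect-opinion pairs from dependency arcs; the second stacks pre-trained BERT, a BiLSTM, additive attention, and a linear classification layer, mirroring the BERT-Att-BiLSTM structure. PyTorch and HuggingFace Transformers are assumed.

```python
# Stage 1 (sketch): aspect-opinion pair extraction via syntactic
# dependency. The "zh_core_web_sm" pipeline and the two dependency
# patterns below are illustrative assumptions, not the paper's rules.
import spacy

nlp = spacy.load("zh_core_web_sm")

def extract_pairs(sentence: str):
    """Pair nouns (candidate aspects) with adjectives (candidate
    opinions) linked to them by a dependency arc."""
    pairs = []
    for tok in nlp(sentence):
        if tok.dep_ == "amod" and tok.head.pos_ == "NOUN":
            # e.g. "清爽的口感": adjective directly modifies the noun
            pairs.append((tok.head.text, tok.text))
        elif tok.dep_ == "nsubj" and tok.head.pos_ == "ADJ":
            # e.g. "口感很清爽": noun is subject of a predicate adjective
            pairs.append((tok.text, tok.head.text))
    return pairs

# Stages 2-4 (sketch): BERT embedding -> BiLSTM + attention -> classifier.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertAttBiLSTM(nn.Module):
    def __init__(self, num_categories, hidden_size=128,
                 bert_name="bert-base-chinese"):
        super().__init__()
        # Word embedding layer: contextual token vectors from BERT.
        self.bert = BertModel.from_pretrained(bert_name)
        # Feature extraction layer: BiLSTM over the token sequence.
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, hidden_size,
                              batch_first=True, bidirectional=True)
        # Additive attention scores each time step, then the sequence is
        # pooled into one vector (the feature-space reduction).
        self.att = nn.Linear(2 * hidden_size, 1)
        # Classification layer: pooled features -> category logits.
        self.fc = nn.Linear(2 * hidden_size, num_categories)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        h, _ = self.bilstm(h)                      # (batch, seq, 2*hidden)
        scores = self.att(h).squeeze(-1)           # (batch, seq)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)      # attention weights
        pooled = (alpha.unsqueeze(-1) * h).sum(1)  # weighted pooling
        return self.fc(pooled)                     # softmax applied in loss

# Hypothetical usage: classify one extracted aspect-opinion pair into
# one of five assumed categories.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertAttBiLSTM(num_categories=5)
enc = tokenizer("口感 清爽", return_tensors="pt")
category = model(enc["input_ids"], enc["attention_mask"]).argmax(-1)
```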

Key words: aspect category extraction, syntactic dependency, aspect category classification, bidirectional encoder representations from Transformers, attention mechanism
