Journal of Beijing University of Posts and Telecommunications

  • EI core journal

Journal of Beijing University of Posts and Telecommunications ›› 2023, Vol. 46 ›› Issue (5): 87-92.


Emotion recognition based on meta bi-modal learning model


  • Received: 2022-09-02  Revised: 2022-12-17  Online: 2023-10-28  Published: 2023-11-03

Abstract: Existing emotion recognition models suffer from several problems: ambiguity in single-modal representation, variation in how different people express emotion, and the neglect of the relationship between discrete and continuous emotion. To address these problems, the authors propose the meta bi-modal learning (MBL) model, in which single-modal continuous emotion recognition, namely valence-activation-dominance (V-A-D) prediction, assists bi-modal discrete emotion recognition. Bi-modal feature fusion uses cross-modal self-attention, which effectively solves the problem of aligning sequence data across modalities. Meanwhile, during auxiliary-task training, interaction among the three V-A-D dimensions is realized through hard parameter sharing in multi-task learning. The meta-learning scheme treats each speaker's utterances as a small sample set, which improves the model's ability to adapt to different speakers and makes it more generalizable. Experiments show that the MBL model achieves emotion recognition rates of 71.24% and 69.12% on the scripted and dialogue subsets of the IEMOCAP corpus, respectively, demonstrating good performance.
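The two mechanisms named in the abstract can be illustrated briefly. The sketch below is not the authors' implementation; it is a minimal NumPy illustration, with all shapes, dimensions, and weight names chosen hypothetically: text queries attend over audio keys/values (so the two sequences need not be frame-aligned), and a single shared projection feeds both the discrete-emotion head and the continuous V-A-D head (hard parameter sharing).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(text, audio):
    """Text tokens attend over audio frames; output is aligned to the
    text sequence even though the modalities differ in length."""
    d = text.shape[-1]
    scores = text @ audio.T / np.sqrt(d)        # (T_text, T_audio)
    return softmax(scores, axis=-1) @ audio     # fused, (T_text, d)

rng = np.random.default_rng(0)
text  = rng.standard_normal((5, 16))    # hypothetical: 5 text tokens, dim 16
audio = rng.standard_normal((9, 16))    # hypothetical: 9 audio frames, dim 16

fused = cross_modal_attention(text, audio)      # (5, 16)

# Hard parameter sharing: one shared layer, two task-specific heads.
W_shared  = rng.standard_normal((16, 16))
W_emotion = rng.standard_normal((16, 4))        # e.g. 4 discrete classes
W_vad     = rng.standard_normal((16, 3))        # valence, activation, dominance

h = np.tanh(fused @ W_shared)                   # shared representation
emotion_logits = h.mean(axis=0) @ W_emotion     # discrete-emotion head
vad_pred       = h.mean(axis=0) @ W_vad         # continuous V-A-D head
```

Gradients from both heads would update `W_shared`, which is how the auxiliary V-A-D task regularizes the discrete-emotion representation under this scheme.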

Key words: bi-modal, meta learning, discrete emotion, continuous emotion
