Multimodal individual emotion recognition with joint labeling based on integrated learning and clustering

Bibliographic Details
Main Authors: KE Shanjun, NIE Chengyang, WANG Yumiao, HE Bangsheng
Format: Article
Language: Chinese (zho)
Published: POSTS&TELECOM PRESS Co., LTD 2024-03-01
Series: 智能科学与技术学报
Subjects:
Online Access:http://www.cjist.com.cn/thesisDetails#10.11959/j.issn.2096-6652.202401
Description
Summary: To address the low recognition accuracy of generic emotion recognition models when applied to different individuals, a multimodal individual emotion recognition technique based on joint labelling with integrated learning and clustering was proposed. The method first trained a generic emotion recognition model on a public dataset, then analysed the distributional differences between the public dataset and an individual's unlabelled data, and established a cross-domain model to predict pseudo-labels for the individual data. At the same time, weighted clustering was applied to the individual data to assign cluster labels, the cluster labels and the pseudo-labels were combined for joint labelling, and high-confidence samples were screened to further train the generic model, yielding a personalized emotion recognition model. When this method was applied to experimentally collected data covering 3 emotions from 3 subjects, the final optimized personalized model achieved an average recognition accuracy of more than 80% for the 3 emotions, an improvement of at least 35% over the original generic model.
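The abstract outlines a pipeline of pseudo-labelling, weighted clustering, joint labelling, and high-confidence sample screening. The following is a minimal sketch of how such a joint-labelling step could look, using scikit-learn stand-ins; the paper's actual models, features, and selection rule are not given here, so `generic_model`, `sample_weights`, the majority-vote mapping from clusters to labels, and the 0.8 confidence threshold are all assumptions for illustration.

```python
# Sketch: combine pseudo-labels from a generic (cross-domain) model with
# cluster labels from weighted k-means, and keep only high-confidence samples
# on which the two labellings agree, for later fine-tuning.
import numpy as np
from sklearn.cluster import KMeans

def joint_label_selection(generic_model, X_individual, sample_weights,
                          n_emotions=3, conf_threshold=0.8):
    # 1. Pseudo-labels and confidences from the generic model
    #    (assumed to expose predict_proba).
    proba = generic_model.predict_proba(X_individual)
    pseudo = proba.argmax(axis=1)
    confidence = proba.max(axis=1)

    # 2. Weighted clustering of the unlabelled individual data.
    km = KMeans(n_clusters=n_emotions, n_init=10, random_state=0)
    clusters = km.fit_predict(X_individual, sample_weight=sample_weights)

    # 3. Map each cluster to the pseudo-label that dominates it, so the
    #    cluster labels live in the same label space as the pseudo-labels.
    cluster_to_label = {c: np.bincount(pseudo[clusters == c]).argmax()
                        for c in range(n_emotions)}
    cluster_labels = np.array([cluster_to_label[c] for c in clusters])

    # 4. Screen samples: keep those where cluster label and pseudo-label
    #    agree and the pseudo-label confidence is high.
    keep = (cluster_labels == pseudo) & (confidence >= conf_threshold)
    return X_individual[keep], pseudo[keep]

# Hypothetical usage: the screened samples would then be used to further
# train the generic model into a personalized one, e.g.
#   X_sel, y_sel = joint_label_selection(generic_model, X_ind, weights)
#   generic_model.partial_fit(X_sel, y_sel)
```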
ISSN:2096-6652