Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition

Bibliographic Details
Main Authors: Wei Lu, Xiaobo Zhang, Lingnan Xia, Hua Ma, Tien-Ping Tan
Format: Article
Language: English
Published: Frontiers Media S.A. 2024-12-01
Series: Frontiers in Human Neuroscience
Subjects: affective computing, electroencephalography, emotion recognition, convolutional neural network, graph attention network, domain adaptation
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2024.1471634/full
_version_ 1846119269653282816
author Wei Lu
Xiaobo Zhang
Lingnan Xia
Hua Ma
Tien-Ping Tan
author_facet Wei Lu
Xiaobo Zhang
Lingnan Xia
Hua Ma
Tien-Ping Tan
author_sort Wei Lu
collection DOAJ
description Emotion recognition is a critical research topic within affective computing, with potential applications across various domains. EEG-based emotion recognition using deep learning frameworks has been applied effectively and achieves commendable performance. However, existing deep learning-based models struggle to capture the spatial activity features and the spatial topology features of EEG signals simultaneously. To address this challenge, a domain adaptation spatial feature perception network, named DSP-EmotionNet, is proposed for cross-subject EEG emotion recognition. First, a spatial activity topological feature extractor module, named SATFEM, is designed to capture both the spatial activity features and the spatial topology features of EEG signals. Then, with SATFEM as its feature extractor, DSP-EmotionNet is constructed, significantly improving accuracy in cross-subject EEG emotion recognition. The proposed model surpasses state-of-the-art methods on this task, achieving an average recognition accuracy of 82.5% on the SEED dataset and 65.9% on the SEED-IV dataset.
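
The abstract describes an architecture that pairs a spatial-activity extractor with a spatial-topology extractor and trains it with domain adaptation across subjects. The record does not give the paper's actual layers, so the following is only a minimal Python/PyTorch sketch of that kind of design, assuming SEED-style inputs (62 electrodes, 5 frequency bands, 3 emotion classes), a convolutional branch for spatial activity, a single graph-attention layer for spatial topology, fusion by concatenation, and DANN-style gradient reversal for the domain-adaptation objective. All names (DSPEmotionNetSketch, GraphAttentionLayer) and layer sizes are hypothetical, not the authors' implementation.

# Hedged sketch (not the authors' code): CNN branch for spatial activity features,
# a single-head graph-attention branch for spatial topology features, and
# DANN-style gradient reversal for cross-subject domain adaptation. Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; scales and negates the gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over a fixed electrode adjacency matrix."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (batch, electrodes, in_dim); adj: (electrodes, electrodes), 1 where connected
        h = self.W(x)                                              # (B, N, D)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)                  # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)                  # (B, N, N, D)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        e = e.masked_fill(adj.unsqueeze(0) == 0, float("-inf"))    # attend only along graph edges
        attn = torch.softmax(e, dim=-1)                            # (B, N, N)
        return F.elu(torch.matmul(attn, h))                        # (B, N, D)


class DSPEmotionNetSketch(nn.Module):
    """Hypothetical stand-in for DSP-EmotionNet; layer sizes and fusion scheme are assumptions."""

    def __init__(self, n_bands=5, n_emotions=3, hidden=64):
        super().__init__()
        # Spatial-activity branch: 1-D convolution over the electrode axis of band-power features.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_bands, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Spatial-topology branch: graph attention over the electrode graph.
        self.gat = GraphAttentionLayer(n_bands, hidden)
        self.emotion_head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, n_emotions)
        )
        # Domain head distinguishes source vs. target subjects via the gradient-reversal layer.
        self.domain_head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 2)
        )

    def forward(self, x, adj, lambd=1.0):
        # x: (batch, electrodes, bands), e.g. differential-entropy features per frequency band
        act = self.cnn(x.transpose(1, 2)).squeeze(-1)              # (B, hidden)
        topo = self.gat(x, adj).mean(dim=1)                        # (B, hidden)
        feat = torch.cat([act, topo], dim=-1)
        emotion_logits = self.emotion_head(feat)
        domain_logits = self.domain_head(GradientReversal.apply(feat, lambd))
        return emotion_logits, domain_logits


if __name__ == "__main__":
    model = DSPEmotionNetSketch()
    x = torch.randn(8, 62, 5)        # 8 samples, 62 electrodes, 5 frequency bands (SEED-style)
    adj = torch.ones(62, 62)         # placeholder fully connected electrode graph
    emo, dom = model(x, adj)
    print(emo.shape, dom.shape)      # torch.Size([8, 3]) torch.Size([8, 2])

In a cross-subject setup of this kind, source-subject batches would supply the emotion loss while source and target batches both feed the domain head; the gradient-reversal layer turns the domain loss into a feature-alignment signal, which is the usual DANN-style way to realize the domain adaptation the abstract refers to.
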
format Article
id doaj-art-6751c7935a3c4a15bb0c042402c81a56
institution Kabale University
issn 1662-5161
language English
publishDate 2024-12-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Human Neuroscience
spelling doaj-art-6751c7935a3c4a15bb0c042402c81a56 (indexed 2024-12-17T06:23:06Z, eng)
Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
Frontiers in Human Neuroscience (Frontiers Media S.A.), ISSN 1662-5161, vol. 18, 2024-12-01, article 1471634, doi:10.3389/fnhum.2024.1471634
Wei Lu (Henan High-speed Railway Operation and Maintenance Engineering Research Center, Zhengzhou Railway Vocational and Technical College, Zhengzhou, China; School of Computer Sciences, Universiti Sains Malaysia, Penang, Malaysia)
Xiaobo Zhang (School of Computer Sciences, Universiti Sains Malaysia, Penang, Malaysia; Jiangxi Vocational College of Finance and Economics, Jiujiang, China)
Lingnan Xia (Henan High-speed Railway Operation and Maintenance Engineering Research Center, Zhengzhou Railway Vocational and Technical College, Zhengzhou, China)
Hua Ma (Henan High-speed Railway Operation and Maintenance Engineering Research Center, Zhengzhou Railway Vocational and Technical College, Zhengzhou, China)
Tien-Ping Tan (School of Computer Sciences, Universiti Sains Malaysia, Penang, Malaysia)
Abstract: as given in the description field above.
URL: https://www.frontiersin.org/articles/10.3389/fnhum.2024.1471634/full
Keywords: affective computing; electroencephalography; emotion recognition; convolutional neural network; graph attention network; domain adaptation
spellingShingle Wei Lu
Xiaobo Zhang
Lingnan Xia
Hua Ma
Tien-Ping Tan
Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
Frontiers in Human Neuroscience
affective computing
electroencephalography
emotion recognition
convolutional neural network
graph attention network
domain adaptation
title Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
title_full Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
title_fullStr Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
title_full_unstemmed Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
title_short Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition
title_sort domain adaptation spatial feature perception neural network for cross subject eeg emotion recognition
topic affective computing
electroencephalography
emotion recognition
convolutional neural network
graph attention network
domain adaptation
url https://www.frontiersin.org/articles/10.3389/fnhum.2024.1471634/full
work_keys_str_mv AT weilu domainadaptationspatialfeatureperceptionneuralnetworkforcrosssubjecteegemotionrecognition
AT xiaobozhang domainadaptationspatialfeatureperceptionneuralnetworkforcrosssubjecteegemotionrecognition
AT lingnanxia domainadaptationspatialfeatureperceptionneuralnetworkforcrosssubjecteegemotionrecognition
AT huama domainadaptationspatialfeatureperceptionneuralnetworkforcrosssubjecteegemotionrecognition
AT tienpingtan domainadaptationspatialfeatureperceptionneuralnetworkforcrosssubjecteegemotionrecognition