Directional Spatial and Spectral Attention Network (DSSA Net) for EEG-based emotion recognition
Significant strides have been made in emotion recognition from Electroencephalography (EEG) signals. However, effectively modeling the diverse spatial, spectral, and temporal features of multi-channel brain signals remains a challenge. This paper proposes a novel framework, the Directional Spatial and Spectral Attention Network (DSSA Net), which enhances emotion recognition accuracy by capturing critical spatial-spectral-temporal features from EEG signals. The framework consists of three modules: Positional Attention (PA), Spectral Attention (SA), and Temporal Attention (TA). The PA module includes Vertical Attention (VA) and Horizontal Attention (HA) branches, designed to detect active brain regions from different orientations. Experimental results on three benchmark EEG datasets demonstrate that DSSA Net outperforms most competitive methods. On the SEED and SEED-IV datasets, it achieves accuracies of 96.61% and 85.07% for subject-dependent emotion recognition, respectively, and 87.03% and 75.86% for subject-independent recognition. On the DEAP dataset, it attains accuracies of 94.97% for valence and 94.73% for arousal. These results showcase the framework's ability to leverage both spatial and spectral differences across brain hemispheres and regions, enhancing classification accuracy for emotion recognition.
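The abstract gives no implementation details; the following is a minimal, hypothetical PyTorch sketch of the kind of directional spatial and spectral attention it describes. The tensor layout (band-wise features arranged on a 2D electrode grid), the squeeze-and-excitation-style spectral gate, the 1D-convolution row/column branches, and all layer sizes are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of directional spatial and spectral attention, loosely following
# the abstract's PA (Vertical + Horizontal branches) and SA modules. Shapes, layers,
# and pooling choices are assumptions for illustration only.
import torch
import torch.nn as nn


class DirectionalSpatialSpectralAttention(nn.Module):
    """Attends over a (batch, bands, rows, cols) EEG feature map laid out as an
    electrode grid, re-weighting frequency bands (spectral), electrode rows
    (vertical), and electrode columns (horizontal)."""

    def __init__(self, num_bands: int, reduction: int = 2):
        super().__init__()
        hidden = max(num_bands // reduction, 1)
        # Spectral attention: squeeze-and-excitation style gating over frequency bands.
        self.spectral = nn.Sequential(
            nn.Linear(num_bands, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, num_bands),
            nn.Sigmoid(),
        )
        # Directional spatial attention: 1D convolutions over row / column profiles.
        self.vertical = nn.Conv1d(num_bands, 1, kernel_size=3, padding=1)
        self.horizontal = nn.Conv1d(num_bands, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Spectral gate: global-average-pool each band, then re-weight bands.
        band_weights = self.spectral(x.mean(dim=(2, 3)))           # (b, c)
        x = x * band_weights.view(b, c, 1, 1)
        # Vertical branch: average over columns -> weight each electrode row.
        row_profile = x.mean(dim=3)                                # (b, c, h)
        row_weights = torch.sigmoid(self.vertical(row_profile))    # (b, 1, h)
        # Horizontal branch: average over rows -> weight each electrode column.
        col_profile = x.mean(dim=2)                                # (b, c, w)
        col_weights = torch.sigmoid(self.horizontal(col_profile))  # (b, 1, w)
        return x * row_weights.unsqueeze(3) * col_weights.unsqueeze(2)


if __name__ == "__main__":
    # Example: 5 frequency bands (delta..gamma) on a hypothetical 9x9 electrode grid.
    features = torch.randn(8, 5, 9, 9)
    attn = DirectionalSpatialSpectralAttention(num_bands=5)
    print(attn(features).shape)  # torch.Size([8, 5, 9, 9])
```

In this sketch the vertical and horizontal branches weight whole electrode rows and columns, which is one plausible reading of "detecting active brain regions from different orientations"; the paper itself should be consulted for the actual module design.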
Main Authors: | Jiyao Liu, Lang He, Haifeng Chen, Dongmei Jiang |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2025-01-01 |
Series: | Frontiers in Neurorobotics |
Subjects: | EEG; emotion recognition; spectral attention; position attention; temporal attention; directional spatial attention |
Online Access: | https://www.frontiersin.org/articles/10.3389/fnbot.2024.1481746/full |
_version_ | 1841556632355471360 |
---|---|
author | Jiyao Liu; Lang He; Haifeng Chen; Dongmei Jiang |
author_facet | Jiyao Liu; Lang He; Haifeng Chen; Dongmei Jiang |
author_sort | Jiyao Liu |
collection | DOAJ |
description | Significant strides have been made in emotion recognition from Electroencephalography (EEG) signals. However, effectively modeling the diverse spatial, spectral, and temporal features of multi-channel brain signals remains a challenge. This paper proposes a novel framework, the Directional Spatial and Spectral Attention Network (DSSA Net), which enhances emotion recognition accuracy by capturing critical spatial-spectral-temporal features from EEG signals. The framework consists of three modules: Positional Attention (PA), Spectral Attention (SA), and Temporal Attention (TA). The PA module includes Vertical Attention (VA) and Horizontal Attention (HA) branches, designed to detect active brain regions from different orientations. Experimental results on three benchmark EEG datasets demonstrate that DSSA Net outperforms most competitive methods. On the SEED and SEED-IV datasets, it achieves accuracies of 96.61% and 85.07% for subject-dependent emotion recognition, respectively, and 87.03% and 75.86% for subject-independent recognition. On the DEAP dataset, it attains accuracies of 94.97% for valence and 94.73% for arousal. These results showcase the framework's ability to leverage both spatial and spectral differences across brain hemispheres and regions, enhancing classification accuracy for emotion recognition. |
format | Article |
id | doaj-art-3aeff9b8721f49cab2d4212b1063fb80 |
institution | Kabale University |
issn | 1662-5218 |
language | English |
publishDate | 2025-01-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Neurorobotics |
spelling | doaj-art-3aeff9b8721f49cab2d4212b1063fb80 (indexed 2025-01-07T06:50:55Z). Jiyao Liu (School of Computer Science, Northwestern Polytechnical University, Xi'an, China); Lang He (School of Computer Science and Technology, Xi'an University of Posts and Telecommunications, Xi'an, Shaanxi, China); Haifeng Chen (School of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, Xi'an, China); Dongmei Jiang (School of Computer Science, Northwestern Polytechnical University, Xi'an, China). Directional Spatial and Spectral Attention Network (DSSA Net) for EEG-based emotion recognition. Frontiers in Neurorobotics, vol. 18, 2025-01-01, article 1481746. Frontiers Media S.A. ISSN 1662-5218. doi:10.3389/fnbot.2024.1481746. https://www.frontiersin.org/articles/10.3389/fnbot.2024.1481746/full |
title | Directional Spatial and Spectral Attention Network (DSSA Net) for EEG-based emotion recognition |
topic | EEG; emotion recognition; spectral attention; position attention; temporal attention; directional spatial attention |
url | https://www.frontiersin.org/articles/10.3389/fnbot.2024.1481746/full |