FacialNet: facial emotion recognition for mental health analysis using UNet segmentation with transfer learning model
Facial emotion recognition (FER) can serve as a valuable tool for assessing emotional states, which are often linked to mental health. However, mental health encompasses a broad range of factors that go beyond facial expressions. While FER provides insights into certain aspects of emotional well-being, it can be used in conjunction with other assessments to form a more comprehensive understanding of an individual's mental health. This research proposes a framework for human FER, called FacialNet, that combines UNet image segmentation with transfer learning based on the EfficientNetB4 model. The proposed model demonstrates promising results, achieving an accuracy of 90% for six emotion classes (happy, sad, fear, pain, anger, and disgust) and 96.39% for binary classification (happy and sad). The significance of FacialNet is assessed through extensive experiments against various machine learning and deep learning models, as well as state-of-the-art prior work in FER, and is further validated using cross-validation, ensuring reliable performance across different data splits. The findings highlight the effectiveness of leveraging UNet image segmentation and EfficientNetB4 transfer learning for accurate and efficient facial emotion recognition, offering promising avenues for real-world applications in emotion-aware systems and affective computing platforms. Experimental findings reveal that the proposed approach performs substantially better than existing works, improving accuracy to 96.39% compared with the existing 94.26%.
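The abstract describes a two-stage pipeline: a UNet segments the face region, and an EfficientNetB4 backbone is then fine-tuned via transfer learning to classify the emotion. The sketch below illustrates one way such a pipeline could be wired up; it is not the authors' released code, and the layer sizes, input resolution, soft-mask multiplication step, and training settings are assumptions made purely for illustration, assuming TensorFlow/Keras.

```python
# Illustrative sketch only (not the authors' implementation):
# UNet-style face segmentation followed by EfficientNetB4 transfer learning.
import tensorflow as tf
from tensorflow.keras import layers, Model

IMG_SIZE = 380     # assumption: EfficientNetB4's default input resolution
NUM_CLASSES = 6    # happy, sad, fear, pain, anger, disgust

def build_unet(input_shape=(IMG_SIZE, IMG_SIZE, 3)):
    """Small UNet-style encoder-decoder predicting a 1-channel soft face mask."""
    inputs = layers.Input(shape=input_shape)
    # Encoder
    c1 = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(64, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D()(c2)
    # Bottleneck
    b = layers.Conv2D(128, 3, activation="relu", padding="same")(p2)
    # Decoder with skip connections
    u2 = layers.UpSampling2D()(b)
    u2 = layers.Concatenate()([u2, c2])
    c3 = layers.Conv2D(64, 3, activation="relu", padding="same")(u2)
    u1 = layers.UpSampling2D()(c3)
    u1 = layers.Concatenate()([u1, c1])
    c4 = layers.Conv2D(32, 3, activation="relu", padding="same")(u1)
    mask = layers.Conv2D(1, 1, activation="sigmoid")(c4)   # soft face mask in [0, 1]
    return Model(inputs, mask, name="unet_segmenter")

def build_facialnet(num_classes=NUM_CLASSES):
    """Mask the input with the UNet output, then classify with EfficientNetB4."""
    # Expects raw pixel values in [0, 255]; Keras EfficientNet includes its own rescaling.
    inputs = layers.Input(shape=(IMG_SIZE, IMG_SIZE, 3))
    mask = build_unet()(inputs)
    mask3 = layers.Concatenate()([mask, mask, mask])   # replicate mask across RGB channels
    masked = layers.Multiply()([inputs, mask3])        # keep face pixels, suppress background
    backbone = tf.keras.applications.EfficientNetB4(
        include_top=False, weights="imagenet",
        input_shape=(IMG_SIZE, IMG_SIZE, 3), pooling="avg")
    backbone.trainable = False                         # transfer learning: freeze pretrained base
    features = backbone(masked)                        # pooled feature vector
    x = layers.Dropout(0.3)(features)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return Model(inputs, outputs, name="facialnet_sketch")

model = build_facialnet()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # integer emotion labels
              metrics=["accuracy"])
model.summary()
```

In this sketch the segmenter and classifier are trained jointly end to end; the paper may instead train the UNet separately on face masks before fine-tuning the classifier, which the record does not specify.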
| Main Authors: | In-seop Na, Asma Aldrees, Abeer Hakeem, Linda Mohaisen, Muhammad Umer, Dina Abdulaziz AlHammadi, Shtwai Alsubai, Nisreen Innab, Imran Ashraf |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2024-12-01 |
| Series: | Frontiers in Computational Neuroscience |
| Subjects: | facial emotion recognition; UNET; EfficientNet; transfer learning; image processing |
| Online Access: | https://www.frontiersin.org/articles/10.3389/fncom.2024.1485121/full |
| collection | DOAJ |
|---|---|
| id | doaj-art-d1880bb29cd94e93bace04c34157eeca |
| institution | Kabale University |
| issn | 1662-5188 |
| doi | 10.3389/fncom.2024.1485121 |
| volume / article | Frontiers in Computational Neuroscience, Vol. 18 (2024), Article 1485121 |
| affiliations | In-seop Na: Division of Culture Contents, Chonnam National University, Yeosu, Republic of Korea; Asma Aldrees: Department of Informatics and Computer Systems, College of Computer Science, King Khalid University, Abha, Saudi Arabia; Abeer Hakeem, Linda Mohaisen: Department of Information Technology, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Saudi Arabia; Muhammad Umer: Department of Computer Science & Information Technology, The Islamia University of Bahawalpur, Bahawalpur, Pakistan; Dina Abdulaziz AlHammadi: Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia; Shtwai Alsubai: Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam Bin Abdulaziz University, Al-Kharj, Saudi Arabia; Nisreen Innab: Department of Computer Science and Information Systems, College of Applied Sciences, AlMaarefa University, Diriyah, Saudi Arabia; Imran Ashraf: Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Republic of Korea |