Type 2 diabetes prediction method based on dual-teacher knowledge distillation and feature enhancement
Abstract Diabetes prediction is an important topic in medical and health research. Accurate prediction supports early intervention and reduces patients’ health risks and medical costs. This paper proposes a data preprocessing pipeline that removes outliers, fills missing values, and applies sparse autoencoder (SAE) feature enhancement. Building on this, the study introduces a new method for type 2 diabetes classification based on a dual Convolutional Neural Network (CNN) teacher-student distillation model (DCTSD-Model), aiming to improve the accuracy and reliability of diabetes prediction. The variables of the original data are expanded by the SAE to strengthen the expressive power of the features, and the proposed CNN and DCTSD-Model are evaluated on the feature-enhanced dataset using 10-fold cross-validation. The experimental results show that, after data preprocessing, the dual-teacher knowledge distillation scheme helps the student model learn rich category information through soft labels, while a weighted random sampler draws samples from the different classes to address class imbalance, yielding excellent classification performance. The accuracy of DCTSD-Model on the classification task reached 98.57%, significantly higher than the other models, demonstrating stronger classification ability and reliability. This method provides an effective solution for diabetes prediction and lays a solid foundation for further research and application.
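The abstract does not include code, so the following is a minimal sketch of the general SAE feature-enhancement idea in PyTorch. The layer widths, the L1 sparsity penalty, and the choice to concatenate the learned code with the original variables are illustrative assumptions, not details taken from the paper.

```python
# Sparse autoencoder (SAE) sketch for feature enhancement.
# All hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, n_features: int, code_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, code_dim), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(code_dim, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        code = self.encoder(x)
        recon = self.decoder(code)
        return code, recon

def sae_loss(x, recon, code, sparsity_weight: float = 1e-3):
    # Reconstruction error plus an L1 penalty that keeps the code sparse.
    return nn.functional.mse_loss(recon, x) + sparsity_weight * code.abs().mean()

# After training, the sparse code can be concatenated with the original
# variables to form the enhanced feature set fed to the CNN classifiers:
#   enhanced = torch.cat([x, code], dim=1)
```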
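The dual-teacher soft-label distillation described in the abstract can likewise be sketched. The temperature T, the weighting alpha, and the simple averaging of the two teachers’ softened outputs are assumptions for illustration; the paper’s exact formulation may differ.

```python
# Dual-teacher soft-label distillation loss sketch (PyTorch).
import torch
import torch.nn.functional as F

def dual_teacher_kd_loss(student_logits, teacher1_logits, teacher2_logits,
                         labels, T: float = 3.0, alpha: float = 0.7):
    # Soft labels: average the two teachers' temperature-scaled distributions.
    soft_targets = 0.5 * (F.softmax(teacher1_logits / T, dim=1) +
                          F.softmax(teacher2_logits / T, dim=1))
    log_student = F.log_softmax(student_logits / T, dim=1)
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Hard-label cross-entropy keeps the student anchored to the true classes.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard
```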
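Weighted random sampling is a standard PyTorch facility; the inverse-class-frequency weighting below is a common way to handle the class imbalance the abstract mentions and is only assumed here, not confirmed as the paper’s scheme.

```python
# Weighted random sampling sketch for class-imbalanced training data.
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

def make_balanced_loader(X: torch.Tensor, y: torch.Tensor, batch_size: int = 64):
    class_counts = torch.bincount(y)
    # Each sample's weight is the inverse frequency of its class, so the
    # minority class is drawn more often during training.
    sample_weights = (1.0 / class_counts.float())[y]
    sampler = WeightedRandomSampler(sample_weights, num_samples=len(y),
                                    replacement=True)
    return DataLoader(TensorDataset(X, y), batch_size=batch_size, sampler=sampler)
```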
Main Authors: | Jian Zhao, Hanlin Gao, Lei Sun, Lijuan Shi, Zhejun Kuang, Haiyan Wang |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2025-01-01 |
Series: | Scientific Reports |
Online Access: | https://doi.org/10.1038/s41598-024-83902-6 |
collection | DOAJ |
id | doaj-art-6ec3ce72341f4a55a95c5a93fe02b24b |
institution | Kabale University |
issn | 2045-2322 |
affiliations | Jian Zhao, Hanlin Gao, Lei Sun, Zhejun Kuang, Haiyan Wang (College of Computer Science and Technology, Changchun University); Lijuan Shi (College of Electronic Information Engineering, Changchun University) |