Diabetic retinopathy grading using curvelet CNN with optimized SSO activations and wavelet-based image enhancement
Main Authors:
Format: Article
Language: English
Published: Elsevier, 2025-01-01
Series: Ain Shams Engineering Journal
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2090447924006208
Summary: Background: Diabetic retinopathy (DR) is a leading cause of blindness among diabetic patients, necessitating early and accurate detection. Existing methods often fall short in identifying key markers such as hard exudates (HE), making disease severity difficult to assess. Issues: Many conventional methods fail to detect hard exudates in retinopathy images, which are used to determine diabetes severity. Method: This paper proposes a novel Curvelet convolutional neural network (CCNN) framework for DR detection. First, the input retinal fundus images (RFI) are denoised with a Wavelet Integrated Retinex (WIR) algorithm to reduce noise artifacts. The CCNN then classifies each image as normal or abnormal, and the Salp Swarm Optimization (SSO) algorithm is employed to enhance the CCNN's classification performance. Results: The proposed method achieves 99.46% accuracy, surpassing leading CNN-based models: it improves overall accuracy by 2.17%, 7.42%, and 20.46% over DenseNet-121, Triple-DRNet, and EfficientNet-B4, respectively.
ISSN: 2090-4479
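The abstract does not detail how the Salp Swarm Optimization step is wired into the CCNN. As a rough illustration only, the standard SSO update rules (a leader salp moving around the best-known "food" position, with followers chaining behind it) can be sketched on a toy objective; the function names, bounds, and parameters below are illustrative assumptions, not taken from the article.

```python
import math
import random

def sphere(x):
    # Toy objective standing in for a CCNN validation-loss evaluation:
    # minimize the sum of squares (global minimum 0 at the origin).
    return sum(v * v for v in x)

def salp_swarm_optimize(objective, dim=2, n_salps=30, iters=200,
                        lb=-5.0, ub=5.0, seed=0):
    """Minimal sketch of Salp Swarm Optimization (illustrative parameters)."""
    rng = random.Random(seed)
    # Initialize salp positions uniformly inside the search bounds.
    salps = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_salps)]
    food = min(salps, key=objective)[:]   # best solution found so far
    food_fit = objective(food)
    for l in range(1, iters + 1):
        # c1 decays over iterations, shifting exploration toward exploitation.
        c1 = 2 * math.exp(-((4 * l / iters) ** 2))
        for i in range(n_salps):
            if i == 0:
                # Leader: move around the food source in each dimension.
                for d in range(dim):
                    c2, c3 = rng.random(), rng.random()
                    step = c1 * ((ub - lb) * c2 + lb)
                    salps[i][d] = food[d] + step if c3 >= 0.5 else food[d] - step
            else:
                # Followers: average with the salp ahead in the chain.
                for d in range(dim):
                    salps[i][d] = (salps[i][d] + salps[i - 1][d]) / 2
            # Clamp to bounds and update the best-known solution.
            salps[i] = [min(ub, max(lb, v)) for v in salps[i]]
            fit = objective(salps[i])
            if fit < food_fit:
                food, food_fit = salps[i][:], fit
    return food, food_fit

best, best_fit = salp_swarm_optimize(sphere)
print(best_fit)
```

In the paper's setting, `objective` would presumably evaluate a CCNN configuration (e.g. hyperparameters or activation parameters) rather than the sphere function used here.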