Cyclic Learning Rate-Based Co-Training for Image Classification With Noisy Labels

Deep learning has excelled in image classification, but noisy labels in large datasets pose a significant challenge, impacting performance and generalization. To tackle this, we propose a novel co-training method using cyclic learning rates. This method trains two networks simultaneously, each selecting clean samples based on loss values to optimize the other’s parameters, reducing overfitting and confirmation bias. The cyclic learning rate allows networks to oscillate between underfitting and overfitting, enhancing the distinction between clean and noisy samples. Our approach improves noise detection accuracy and robustness against label noise on datasets like CIFAR-10, CIFAR-100, and Clothing1M. Especially on CIFAR-10 and CIFAR-100 with 40% symmetric noise ratio, and Clothing1M, it outperforms the most relevant O2U-Net by 2.59%, 6.11%, and 0.57% in test accuracy, respectively, demonstrating superior noise resistance and classification accuracy under various noise conditions. Comprehensive experiments confirm the effectiveness of our method, advancing image classification in the presence of noisy labels.
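The two building blocks the abstract describes, a cyclic learning-rate schedule and small-loss "clean sample" selection for co-training, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function names, the triangular schedule shape, and all parameter values are assumptions for illustration.

```python
def cyclic_lr(step, base_lr=0.001, max_lr=0.1, cycle_len=10):
    # Triangular cyclic schedule: the learning rate ramps linearly from
    # base_lr up to max_lr and back down over each cycle, so the network
    # repeatedly moves between under- and overfitting regimes.
    pos = (step % cycle_len) / cycle_len   # position within the cycle, in [0, 1)
    tri = 1.0 - abs(2.0 * pos - 1.0)       # triangle wave: 0 -> 1 -> 0
    return base_lr + (max_lr - base_lr) * tri

def select_small_loss(losses, keep_ratio):
    # Small-loss heuristic: treat the keep_ratio fraction of samples with
    # the lowest loss as "clean" and return their indices. In co-training,
    # the samples network A selects are used to update network B, and
    # vice versa, which limits confirmation bias.
    k = max(1, int(len(losses) * keep_ratio))
    return sorted(range(len(losses)), key=lambda i: losses[i])[:k]
```

In a training loop, each step would evaluate per-sample losses under both networks at the current cyclic rate, then cross-feed each network's small-loss selection to its peer's parameter update.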

Bibliographic Details
Main Authors: Ying Zheng, Yu Gu, Pingping Bai, Dong Yuan, Siqi Zhou, Xin Lyu, Ang Chen
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: Deep learning; cyclic learning rate; co-training; image classification; noisy label
Online Access:https://ieeexplore.ieee.org/document/10829578/
collection DOAJ
id doaj-art-939dc708791f48d8b28af6c13b47012c
institution Kabale University
issn 2169-3536
spelling IEEE Access, vol. 13, pp. 6292-6305, published 2025-01-01. DOI: 10.1109/ACCESS.2025.3526332. IEEE document 10829578.
Author affiliations:
Ying Zheng (ORCID 0009-0007-9993-7643): State Key Laboratory of Hydraulic Engineering Intelligent Construction and Operation, Tianjin University, Tianjin, China
Yu Gu (ORCID 0009-0005-1941-6668): Shandong Province Water Transfer Project Operation and Maintenance Center, Jinan, China
Pingping Bai (ORCID 0009-0001-7665-4911): Shandong Province Water Transfer Project Operation and Maintenance Center, Jinan, China
Dong Yuan (ORCID 0009-0005-8167-369X): Shandong Water Conservancy Survey and Design Institute Company Ltd., Jinan, China
Siqi Zhou: College of Computer and Information, Hohai University, Nanjing, China
Xin Lyu (ORCID 0000-0003-1862-2070): College of Computer and Information, Hohai University, Nanjing, China
Ang Chen (ORCID 0009-0003-0406-5936): College of Computer and Information, Hohai University, Nanjing, China
topic Deep learning
cyclic learning rate
co-training
image classification
noisy label