An improved sample selection framework for learning with noisy labels.
Deep neural networks have powerful memorization capabilities, yet they frequently overfit to noisy labels, leading to a decline in classification and generalization performance. To address this issue, sample selection methods that identify potentially clean labels have been proposed. How...
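As a rough illustration of the sample-selection idea mentioned in the abstract (not this paper's specific framework), the sketch below applies the common small-loss criterion: samples whose current per-sample loss is small are treated as probably clean. The function name `select_small_loss` and the `noise_rate` parameter are assumptions introduced here for illustration only.

```python
# Minimal sketch of generic "small-loss" sample selection for noisy labels.
# This is NOT the method proposed in the article; it only illustrates the
# general idea of filtering a batch down to likely-clean samples.
import torch
import torch.nn.functional as F

def select_small_loss(model, images, labels, noise_rate=0.2):
    """Return indices of samples treated as 'clean' under the small-loss criterion."""
    model.eval()
    with torch.no_grad():
        logits = model(images)
        # Per-sample cross-entropy; mislabeled samples tend to incur larger losses.
        losses = F.cross_entropy(logits, labels, reduction="none")
    # Keep the (1 - noise_rate) fraction of samples with the smallest losses.
    num_keep = max(1, int((1.0 - noise_rate) * labels.size(0)))
    clean_idx = torch.argsort(losses)[:num_keep]
    return clean_idx
```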
| Main Authors: | Qian Zhang, Yi Zhu, Ming Yang, Ge Jin, Yingwen Zhu, Yanjun Lu, Yu Zou, Qiu Chen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2024-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://doi.org/10.1371/journal.pone.0309841 |
Similar Items
- A review on label cleaning techniques for learning with noisy labels
  by: Jongmin Shin, et al.
  Published: (2024-12-01)
- Learning with noisy labels via clean aware sharpness aware minimization
  by: Bin Huang, et al.
  Published: (2025-01-01)
- Cyclic Learning Rate-Based Co-Training for Image Classification With Noisy Labels
  by: Ying Zheng, et al.
  Published: (2025-01-01)
- Evaluating deep learning models for classifying OCT images with limited data and noisy labels
  by: Aleksandar Miladinović, et al.
  Published: (2024-12-01)
- Adversarially robust generalization from network crowd noisy labels
  by: Chicheng Ma, et al.
  Published: (2025-02-01)