Publication: Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
dc.citedby | 1 | |
dc.contributor.author | Guo G. | en_US |
dc.contributor.author | Wei Z. | en_US |
dc.contributor.author | Cheng J. | en_US |
dc.contributor.authorid | 58805753200 | en_US |
dc.contributor.authorid | 58805777500 | en_US |
dc.contributor.authorid | 22833734200 | en_US |
dc.date.accessioned | 2025-03-03T07:48:18Z | |
dc.date.available | 2025-03-03T07:48:18Z | |
dc.date.issued | 2024 | |
dc.description.abstract | The task of continual learning is to design algorithms that can address the problem of catastrophic forgetting. However, in the real world, there are noisy labels due to inaccurate human annotations and other factors, which seem to exacerbate catastrophic forgetting. To tackle both catastrophic forgetting and noise issues, we propose an innovative framework. Our framework leverages sample uncertainty to purify the data stream and selects representative samples for replay, effectively alleviating catastrophic forgetting. Additionally, we adopt a semi-supervised approach for fine-tuning to ensure the involvement of all available samples. Simultaneously, we incorporate contrastive learning and entropy minimization to mitigate noise memorization in the model. We validate the effectiveness of our proposed method through extensive experiments on two benchmark datasets, CIFAR-10 and CIFAR-100. For CIFAR-10, we achieve a performance gain of 2% under 20% noise conditions. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2024. | en_US |
dc.description.nature | Final | en_US |
dc.identifier.doi | 10.1007/978-981-99-8543-2_40 | |
dc.identifier.epage | 510 | |
dc.identifier.scopus | 2-s2.0-85181983496 | |
dc.identifier.spage | 498 | |
dc.identifier.uri | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85181983496&doi=10.1007%2f978-981-99-8543-2_40&partnerID=40&md5=f79b7a4392845c0d29b3d4190ac18737 | |
dc.identifier.uri | https://irepository.uniten.edu.my/handle/123456789/37178 | |
dc.identifier.volume | 14432 LNCS | |
dc.pagecount | 12 | |
dc.publisher | Springer Science and Business Media Deutschland GmbH | en_US |
dc.source | Scopus | |
dc.sourcetitle | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | |
dc.subject | Catastrophic forgetting | |
dc.subject | Continual learning | |
dc.subject | Feature enhancement | |
dc.subject | Noisy data | |
dc.subject | Noisy labels | |
dc.subject | Real-world | |
dc.subject | Replay | |
dc.subject | Sample features | |
dc.subject | Samples selection | |
dc.subject | Uncertainty | |
dc.subject | Entropy | |
dc.title | Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement | en_US |
dc.type | Conference paper | en_US |
dspace.entity.type | Publication |