Neural Netw. 2026 Mar 22;201:108889. doi: 10.1016/j.neunet.2026.108889. Online ahead of print.
ABSTRACT
Federated learning, a vital paradigm in modern machine learning, enables private, decentralised model training and is crucial for learning from sensitive data. Noisy label learning, another vital paradigm, addresses training models on data with potentially incorrect labels. Their integration, federated learning with noisy labels (FLNL), is an emerging and challenging topic arising from machine learning practice, yet it still lacks a review of its research progress. This paper aims to fill that gap. We first summarise four core challenges in FLNL: localised label noise, across-client heterogeneity of label noise, localised overfitting to label noise, and inadequate benchmarking. We then propose a taxonomy that categorises current FLNL studies into four types addressing these challenges in turn: sample-wise methods, client-wise methods, model-wise methods, and benchmark-wise studies. This work offers the first comprehensive and concise review dedicated to FLNL; we also outline future research directions for this rapidly evolving and practically significant field.
PMID:41930546 | DOI:10.1016/j.neunet.2026.108889