Learning from noisy labels with deep neural networks: A survey

H Song, M Kim, D Park, Y Shin… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Deep learning has achieved remarkable success in numerous domains with the help of large
amounts of data. However, the quality of data labels is a concern because of the lack of …

Deep learning in electron microscopy

JM Ede - Machine Learning: Science and Technology, 2021 - iopscience.iop.org
Deep learning is transforming most areas of science and technology, including electron
microscopy. This review paper offers a practical perspective aimed at developers with …

Robust training under label noise by over-parameterization

S Liu, Z Zhu, Q Qu, C You - International Conference on …, 2022 - proceedings.mlr.press
Recently, over-parameterized deep networks, with more network parameters than training
samples, have dominated the performance of modern machine learning …

Does label smoothing mitigate label noise?

M Lukasik, S Bhojanapalli, A Menon… - … on Machine Learning, 2020 - proceedings.mlr.press
Label smoothing is commonly used in training deep learning models, wherein one-hot
training labels are mixed with uniform label vectors. Empirically, smoothing has been shown …
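The snippet describes label smoothing as mixing one-hot training labels with uniform label vectors. A minimal NumPy sketch of that mixing (the function name and the smoothing weight α = 0.1 are illustrative, not from the paper):

```python
import numpy as np

def smooth_labels(one_hot, alpha=0.1):
    """Mix one-hot labels with the uniform distribution over classes."""
    num_classes = one_hot.shape[-1]
    return (1.0 - alpha) * one_hot + alpha / num_classes

labels = np.eye(3)[[0, 2]]               # one-hot labels for classes 0 and 2
smoothed = smooth_labels(labels, alpha=0.1)
# each row still sums to 1; the true class keeps weight 1 - alpha + alpha/K
```

Each smoothed row remains a valid distribution, with alpha spread evenly over all classes.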

Polyloss: A polynomial expansion perspective of classification loss functions

Z Leng, M Tan, C Liu, ED Cubuk, X Shi… - arXiv preprint arXiv …, 2022 - arxiv.org
Cross-entropy loss and focal loss are the most common choices when training deep neural
networks for classification problems. Generally speaking, however, a good loss function can …
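The PolyLoss paper views cross-entropy as a polynomial series in (1 - p_t) and reweights its leading term. A sketch of the simplest variant, Poly-1, which adds epsilon * (1 - p_t) on top of standard cross-entropy (single-example NumPy version; the function name is illustrative):

```python
import numpy as np

def poly1_cross_entropy(logits, target, epsilon=1.0):
    """Poly-1 loss: cross entropy plus an extra weight epsilon on the
    leading polynomial term (1 - p_t) of its Taylor expansion."""
    exp = np.exp(logits - logits.max())   # stable softmax
    probs = exp / exp.sum()
    pt = probs[target]                    # probability of the true class
    return -np.log(pt) + epsilon * (1.0 - pt)
```

Setting epsilon = 0 recovers plain cross-entropy, which makes the reweighting easy to ablate.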

Deep learning with label differential privacy

B Ghazi, N Golowich, R Kumar… - Advances in neural …, 2021 - proceedings.neurips.cc
The Randomized Response (RR) algorithm is a classical technique to improve
robustness in survey aggregation, and has been widely adopted in applications with …
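The classical k-ary randomized response mechanism the snippet refers to reports the true label with probability e^eps / (e^eps + K - 1) and otherwise a uniformly random other label, which satisfies eps-differential privacy for the label. A minimal sketch (function name illustrative; this is the textbook mechanism, not the paper's full training algorithm):

```python
import math
import random

def randomized_response(true_label, num_classes, epsilon):
    """k-ary randomized response: keep the true label with probability
    e^eps / (e^eps + K - 1), otherwise report a uniform other label."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + num_classes - 1)
    if random.random() < p_true:
        return true_label
    others = [c for c in range(num_classes) if c != true_label]
    return random.choice(others)
```

As epsilon grows, p_true approaches 1 and the reported label is almost always the true one; as epsilon approaches 0, the output becomes nearly uniform.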

A survey of label-noise representation learning: Past, present and future

B Han, Q Yao, T Liu, G Niu, IW Tsang, JT Kwok… - arXiv preprint arXiv …, 2020 - arxiv.org
Classical machine learning implicitly assumes that labels of the training data are sampled
from a clean distribution, which can be too restrictive for real-world scenarios. However …

Provably consistent partial-label learning

L Feng, J Lv, B Han, M Xu, G Niu… - Advances in neural …, 2020 - proceedings.neurips.cc
Partial-label learning (PLL) is a multi-class classification problem, where each training
example is associated with a set of candidate labels. Even though many practical PLL …
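In the partial-label setting the snippet describes, each example carries a set of candidate labels rather than a single one. A toy loss over such a candidate set is the negative log of the probability mass the model assigns to the whole set; note this is a common PLL baseline for illustration, not the consistent estimator proposed in the paper:

```python
import numpy as np

def candidate_set_loss(probs, candidates):
    """Toy partial-label loss: negative log of the total predicted
    probability mass on the candidate label set."""
    return -np.log(probs[list(candidates)].sum())

probs = np.array([0.5, 0.3, 0.2])   # model's predicted class distribution
loss = candidate_set_loss(probs, [0, 1])   # true label known only to lie in {0, 1}
```

The loss is small whenever the model concentrates mass anywhere inside the candidate set, which is exactly the ambiguity PLL methods must resolve.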

Can cross entropy loss be robust to label noise?

L Feng, S Shu, Z Lin, F Lv, L Li, B An - Proceedings of the twenty-ninth …, 2021 - ijcai.org
Trained with the standard cross entropy loss, deep neural networks can achieve great
performance on correctly labeled data. However, if the training data is corrupted with label …

Beyond class-conditional assumption: A primary attempt to combat instance-dependent label noise

P Chen, J Ye, G Chen, J Zhao, PA Heng - Proceedings of the AAAI …, 2021 - ojs.aaai.org
Supervised learning under label noise has seen numerous advances recently, while
existing theoretical findings and empirical results broadly build on the class-conditional …