Learning from noisy labels with deep neural networks: A survey
Deep learning has achieved remarkable success in numerous domains with the help of large
amounts of data. However, the quality of data labels is a concern because of the lack of …
Deep learning in electron microscopy
JM Ede - Machine Learning: Science and Technology, 2021 - iopscience.iop.org
Deep learning is transforming most areas of science and technology, including electron
microscopy. This review paper offers a practical perspective aimed at developers with …
Robust training under label noise by over-parameterization
Recently, over-parameterized deep networks, with many more network parameters
than training samples, have dominated the performance of modern machine learning …
Does label smoothing mitigate label noise?
Label smoothing is commonly used in training deep learning models, wherein one-hot
training labels are mixed with uniform label vectors. Empirically, smoothing has been shown …
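The mixing described in this snippet is simple to state concretely: a one-hot target is blended with a uniform distribution over classes, controlled by a smoothing coefficient. A minimal illustrative sketch (the function name and `eps` parameter are my own, not from the paper):

```python
import numpy as np

def smooth_labels(labels, num_classes, eps=0.1):
    """Label smoothing: mix one-hot targets with a uniform label vector.

    Each target becomes (1 - eps) * one_hot + eps / num_classes, so the
    true class keeps most of the mass while every class gets a small floor.
    """
    onehot = np.eye(num_classes)[labels]
    return (1.0 - eps) * onehot + eps / num_classes

# Two examples with true classes 0 and 2, over 3 classes.
smoothed = smooth_labels(np.array([0, 2]), num_classes=3, eps=0.1)
```

Each smoothed row still sums to 1 and keeps its argmax at the true class, which is why smoothing can be trained with the usual cross-entropy machinery.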
PolyLoss: A polynomial expansion perspective of classification loss functions
Cross-entropy loss and focal loss are the most common choices when training deep neural
networks for classification problems. Generally speaking, however, a good loss function can …
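The polynomial-expansion view can be illustrated with the simplest member of the family, often written as Poly-1: cross-entropy plus a single extra term proportional to (1 - p_t), where p_t is the predicted probability of the true class. A hedged sketch (function name and the single-example, NumPy-only setup are mine):

```python
import numpy as np

def poly1_cross_entropy(logits, label, eps1=1.0):
    """Poly-1 style loss for one example: CE plus eps1 * (1 - p_t).

    With eps1 = 0 this reduces to plain cross-entropy; eps1 > 0 adds the
    leading polynomial correction term that upweights hard examples.
    """
    z = logits - logits.max()           # stabilized softmax
    p = np.exp(z) / np.exp(z).sum()
    pt = p[label]                        # probability of the true class
    return -np.log(pt) + eps1 * (1.0 - pt)
```

Setting `eps1=0` recovers standard cross-entropy exactly, which makes the family easy to ablate against a baseline.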
Deep learning with label differential privacy
The Randomized Response (RR) algorithm is a classical technique to improve
robustness in survey aggregation, and has been widely adopted in applications with …
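The classical RR mechanism the snippet refers to is small enough to write down: keep the true label with probability e^ε / (e^ε + K - 1), otherwise report a uniformly random other label. A minimal sketch of k-ary randomized response (function name is mine; this is the textbook mechanism, not the paper's full training pipeline):

```python
import math
import random

def randomized_response(label, num_classes, epsilon):
    """k-ary randomized response over labels {0, ..., num_classes - 1}.

    Reports the true label with probability e^eps / (e^eps + K - 1),
    otherwise a uniformly random *other* label. This randomization is the
    standard way to obtain epsilon-DP guarantees on labels.
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + num_classes - 1)
    if random.random() < p_keep:
        return label
    others = [c for c in range(num_classes) if c != label]
    return random.choice(others)
```

Larger ε means the true label is kept more often (less privacy, less noise); ε near 0 makes the output nearly uniform.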
A survey of label-noise representation learning: Past, present and future
Classical machine learning implicitly assumes that labels of the training data are sampled
from a clean distribution, which can be too restrictive for real-world scenarios. However …
Provably consistent partial-label learning
Partial-label learning (PLL) is a multi-class classification problem, where each training
example is associated with a set of candidate labels. Even though many practical PLL …
Can cross entropy loss be robust to label noise?
Trained with the standard cross entropy loss, deep neural networks can achieve great
performance on correctly labeled data. However, if the training data is corrupted with label …
Beyond class-conditional assumption: A primary attempt to combat instance-dependent label noise
Supervised learning under label noise has seen numerous advances recently, yet
existing theoretical findings and empirical results largely build on the class-conditional …