Partial label learning: Taxonomy, analysis and outlook
Partial label learning (PLL) is an emerging framework in weakly supervised machine
learning with broad application prospects. It handles the case in which each training …
A survey of label-noise representation learning: Past, present and future
Classical machine learning implicitly assumes that labels of the training data are sampled
from a clean distribution, which can be too restrictive for real-world scenarios. However …
Provably consistent partial-label learning
Partial-label learning (PLL) is a multi-class classification problem, where each training
example is associated with a set of candidate labels. Even though many practical PLL …
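(To make the setting concrete: under a uniform candidate-generation assumption, a classifier-consistent loss for PLL can be as simple as the negative log of the total probability the model places on the candidate set. The sketch below is ours, not the paper's code; the function name and PyTorch framing are illustrative.)

    import torch

    def cc_style_pll_loss(logits, candidate_mask):
        # Classifier-consistent-style PLL loss (simplified): maximize the
        # probability mass the model assigns to the candidate set, assuming
        # candidate labels are generated uniformly.
        # logits: (batch, num_classes); candidate_mask: {0,1} floats, same shape.
        probs = torch.softmax(logits, dim=1)                  # p(y | x)
        candidate_prob = (probs * candidate_mask).sum(dim=1)  # mass on candidates
        return -torch.log(candidate_prob.clamp_min(1e-12)).mean()

    # Toy usage: 3 classes, one example whose candidate set is {0, 2}.
    logits = torch.randn(1, 3, requires_grad=True)
    mask = torch.tensor([[1.0, 0.0, 1.0]])
    cc_style_pll_loss(logits, mask).backward()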
Progressive identification of true labels for partial-label learning
Partial-label learning (PLL) is a typical weakly supervised learning problem, where each
training instance is equipped with a set of candidate labels among which only one is the true …
Instance-dependent partial label learning
Partial label learning (PLL) is a typical weakly supervised learning problem, where each
training example is associated with a set of candidate labels among which only one is true …
Do we need zero training loss after achieving zero training error?
Overparameterized deep networks have the capacity to memorize training data with
zero training error. Even after memorization, the training loss continues to …
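(The fix this paper proposes is the "flooding" regularizer: rather than driving the training loss to zero, keep it hovering near a small constant b. A minimal PyTorch rendering follows, assuming a user-chosen flood level; the function name is ours.)

    import torch

    def flooded(loss, flood_level=0.01):
        # Flooding: |loss - b| + b equals the loss above the flood level b
        # and mirrors it below, so optimization turns into gradient ascent
        # once the training loss sinks under b. b is a hyperparameter.
        b = flood_level
        return (loss - b).abs() + b

    # Sketch of use in a training step:
    # loss = criterion(model(x), y)
    # flooded(loss, flood_level=0.05).backward()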
Leveraged weighted loss for partial label learning
As an important branch of weakly supervised learning, partial label learning deals with data
where each instance is assigned a set of candidate labels, whereas only one of them is …
Learning from a complementary-label source domain: theory and algorithms
In unsupervised domain adaptation (UDA), a classifier for the target domain is trained with
massive true-label data from the source domain and unlabeled data from the target domain …
Learning with multiple complementary labels
A complementary label (CL) simply indicates an incorrect class of an example, but learning
with CLs results in multi-class classifiers that can predict the correct class. Unfortunately, the …
SIGUA: Forgetting may make learning with noisy labels more robust
Given data with noisy labels, over-parameterized deep networks can gradually memorize
the data, and fit everything in the end. Although equipped with corrections for noisy labels …