Learning from noisy labels with deep neural networks: A survey
Deep learning has achieved remarkable success in numerous domains with help from large
amounts of big data. However, the quality of data labels is a concern because of the lack of …
A survey of uncertainty in deep neural networks
Over the last decade, neural networks have reached almost every field of science and
become a crucial part of various real-world applications. Due to the increasing spread …
Part-based pseudo label refinement for unsupervised person re-identification
Unsupervised person re-identification (re-ID) aims at learning discriminative representations
for person retrieval from unlabeled data. Recent techniques accomplish this task by using …
Generating training data with language models: Towards zero-shot language understanding
Pretrained language models (PLMs) have demonstrated remarkable performance in various
natural language processing tasks: Unidirectional PLMs (e.g., GPT) are well known for their …
Robust training under label noise by over-parameterization
Recently, over-parameterized deep networks, with increasingly more network parameters
than training samples, have dominated the performances of modern machine learning …
SPICE: Semantic pseudo-labeling for image clustering
The similarity among samples and the discrepancy among clusters are two crucial aspects
of image clustering. However, current deep clustering methods suffer from inaccurate …
Comparing Kullback-Leibler divergence and mean squared error loss in knowledge distillation
Knowledge distillation (KD), transferring knowledge from a cumbersome teacher model to a
lightweight student model, has been investigated to design efficient neural architectures …
SCARF: Self-supervised contrastive learning using random feature corruption
Self-supervised contrastive representation learning has proved incredibly successful in the
vision and natural language domains, enabling state-of-the-art performance with orders of …
Deep learning with label differential privacy
The Randomized Response (RR) algorithm is a classical technique to improve
robustness in survey aggregation, and has been widely adopted in applications with …
Boundary smoothing for named entity recognition
Neural named entity recognition (NER) models may easily encounter the over-confidence
issue, which degrades the performance and calibration. Inspired by label smoothing and …