Enhancing Model Robustness and Efficiency in Noisy Data Scenarios

Recent work in this area focuses on improving the robustness and efficiency of machine learning models trained on noisy or incomplete data. A significant trend is the integration of self-supervised learning with iterative refinement techniques to mitigate instance-dependent label noise, which is more prevalent in practice and harder to address than instance-independent noise. These methods first learn robust feature representations without relying on the potentially noisy labels, then iteratively refine pseudo-labels to progressively improve label quality (a minimal sketch of this step follows below). Another notable direction is cost-effective annotation for object detection: recent work questions whether annotating small-size instances is worth its cost and proposes alternatives that reach comparable performance without such annotations. There is also growing emphasis on robust training strategies for noisy correspondence and partial label learning, often through frameworks that partition the data into different types and adaptively up-weight the most informative samples. Collectively, these developments aim to enable more reliable and efficient training of deep learning models in real-world scenarios characterized by noisy and incomplete data.
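As a rough illustration of the iterative pseudo-label refinement described above, here is a minimal NumPy sketch. The function name `refine_pseudo_labels`, the fixed confidence threshold, and the relabel-only-when-confident rule are illustrative assumptions, not the exact procedure of any paper listed under Sources; in the cited work, the predictions would come from a model whose encoder was pretrained with self-supervision.

```python
import numpy as np

def refine_pseudo_labels(probs, noisy_labels, confidence_threshold=0.9):
    """One refinement round (hypothetical helper, not any paper's exact method).

    probs:        (N, C) class probabilities from the current model.
    noisy_labels: (N,)   possibly noisy integer labels.

    The model's prediction replaces the given label only where the model
    is confident, so repeated rounds progressively clean the label set.
    """
    preds = probs.argmax(axis=1)                           # current predictions
    confident = probs.max(axis=1) >= confidence_threshold  # confident samples
    refined = noisy_labels.copy()
    refined[confident] = preds[confident]                  # overwrite only these
    return refined

# Toy usage: 4 samples, 3 classes.
probs = np.array([[0.95, 0.03, 0.02],   # confident -> relabeled to class 0
                  [0.40, 0.35, 0.25],   # uncertain -> noisy label kept
                  [0.05, 0.92, 0.03],   # confident -> relabeled to class 1
                  [0.33, 0.33, 0.34]])  # uncertain -> noisy label kept
noisy_labels = np.array([2, 1, 0, 2])
print(refine_pseudo_labels(probs, noisy_labels))  # [0 1 1 2]
```

In practice this step would alternate with retraining on the refined labels, and the threshold (or a per-class variant of it) controls how aggressively noisy labels are overwritten.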

Sources

Mitigating Instance-Dependent Label Noise: Integrating Self-Supervised Pretraining with Pseudo-Label Refinement

Mixed Blessing: Class-Wise Embedding Guided Instance-Dependent Partial Label Learning

Rethinking Annotation for Object Detection: Is Annotating Small-size Instances Worth Its Cost?

Tiny Object Detection with Single Point Supervision

Robust Noisy Correspondence Learning via Self-Drop and Dual-Weight

Self-Paced Learning Strategy with Easy Sample Prior Based on Confidence for the Flying Bird Object Detection Model Training

CoDTS: Enhancing Sparsely Supervised Collaborative Perception with a Dual Teacher-Student Framework

GradStop: Exploring Training Dynamics in Unsupervised Outlier Detection through Gradient Cohesion

Optimized Gradient Clipping for Noisy Label Learning
