Advances in Conformal Prediction and Uncertainty Quantification

Machine learning research is placing growing emphasis on reliability and uncertainty quantification, particularly in high-stakes domains. Researchers are developing statistical frameworks that estimate worst-case failures and provide formal guarantees on the validity of predictions. Conformal prediction is emerging as a key framework for these goals, with applications spanning speech emotion recognition, image-captioning evaluation, and scenario optimization. A significant innovation in this area is the development of risk-calibrated approaches, which support task-specific adaptation through customizable loss functions. Another important trend is the use of Extreme Value Theory (EVT) to estimate the probability of extreme errors and catastrophic failures. Brief illustrative sketches of the standard split-conformal construction and of an EVT tail estimate follow the list below. Noteworthy papers in this area include:

  • A study on risk-calibrated affective speech recognition, which proposes a stochastic calibrative framework for emergent uncertainty quantification.
  • A paper on bridging conformal prediction and scenario optimization, which establishes a theoretical connection between these two frameworks.
  • A study on a new statistical framework for extreme error probability in high-stakes domains, which applies EVT to synthetic and real-world datasets.
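To make the coverage guarantee concrete, here is a minimal sketch of the standard split-conformal construction for regression. It is not code from any of the papers above; the synthetic data, the trivial mean-predictor "model", and the function names are assumptions made purely for illustration.

```python
# Minimal sketch of split conformal prediction for regression.
# The quantile rule is the standard split-conformal construction;
# everything else (data, model, names) is illustrative.
import numpy as np

def split_conformal_interval(residuals_cal, y_pred_test, alpha=0.1):
    """Prediction intervals with >= 1 - alpha marginal coverage,
    assuming calibration and test points are exchangeable."""
    n = len(residuals_cal)
    # Finite-sample corrected quantile level: ceil((n + 1)(1 - alpha)) / n.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(residuals_cal, level, method="higher")
    return y_pred_test - q_hat, y_pred_test + q_hat

# Example with synthetic data and a trivial "model" (the constant 0 predictor).
rng = np.random.default_rng(0)
y_cal = rng.normal(size=500)
residuals = np.abs(y_cal - 0.0)        # |y - y_hat| on the calibration split
y_test_pred = np.zeros(200)            # model predictions on test points
lo, hi = split_conformal_interval(residuals, y_test_pred, alpha=0.1)
```

Risk-calibrated variants generalize this recipe by tuning the threshold so that the expected value of a task-specific loss, rather than miscoverage alone, stays below the target level.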

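The EVT-based approach estimates the probability of errors far beyond anything observed during calibration by fitting a tail model to the largest errors. The sketch below uses the common peaks-over-threshold recipe with a Generalized Pareto fit from SciPy; the synthetic error scores, the 95th-percentile threshold, and the query point are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a peaks-over-threshold estimate of a catastrophic-failure
# probability using a Generalized Pareto Distribution (GPD) tail fit.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
errors = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # per-example error scores

u = np.quantile(errors, 0.95)                 # high threshold defining "extreme" errors
exceedances = errors[errors > u] - u
# Fit a GPD to the exceedances, with the location fixed at 0.
shape, loc, scale = genpareto.fit(exceedances, floc=0.0)

def tail_probability(x):
    """Estimate P(error > x) for x above the threshold u."""
    p_exceed = np.mean(errors > u)            # empirical P(error > u)
    return p_exceed * genpareto.sf(x - u, shape, loc=0.0, scale=scale)

# Estimated probability of an error far out in the tail.
print(tail_probability(50.0))
```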
Sources

  • Risk-Calibrated Affective Speech Recognition via Conformal Coverage Guarantees: A Stochastic Calibrative Framework for Emergent Uncertainty Quantification
  • Modeling speech emotion with label variance and analyzing performance across speakers and unseen acoustic conditions
  • Bridging conformal prediction and scenario optimization
  • New Statistical Framework for Extreme Error Probability in High-Stakes Domains for Reliable Machine Learning
  • A Conformal Risk Control Framework for Granular Word Assessment and Uncertainty Calibration of CLIPScore Quality Estimates