Generative Models and Optimization for Inverse Problems in Imaging and Machine Learning

Current Developments in the Research Area

The field has recently shifted toward integrating generative models with optimization techniques to address complex inverse problems in imaging and machine learning. The focus has been on developing efficient algorithms that combine the strengths of traditional optimization methods with modern deep learning architectures. This integration aims to improve performance on tasks such as image restoration, super-resolution, deblurring, and inpainting, while also addressing the computational challenges these tasks pose.

One of the key directions in the field is the development of plug-and-play (PnP) methods that combine pre-trained denoisers with optimization schemes. These methods have shown state-of-the-art performance on various inverse problems but have faced limitations in more generative tasks like inpainting. Recent work has proposed combining PnP frameworks with generative models such as Flow Matching (FM) to overcome these limitations. This approach defines a time-dependent denoiser using a pre-trained FM model, resulting in a computationally efficient and memory-friendly algorithm that alternates between gradient descent steps, reprojections, and denoising.
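The alternating structure described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the PnP-Flow implementation: `denoise(x, t)` stands in for the time-dependent denoiser derived from a pre-trained Flow Matching model, and the forward/adjoint operators and step size are placeholders.

```python
import numpy as np

def pnp_fm_restore(y, forward_op, adjoint_op, denoise, n_steps=100, lr=0.1):
    """Sketch of a PnP loop in the spirit of PnP-Flow: alternate a
    data-fidelity gradient step with a time-dependent denoising step.
    `denoise(x, t)` is a stand-in for a denoiser built from a
    pre-trained Flow Matching model (hypothetical here)."""
    x = adjoint_op(y)  # crude initialization from the measurements
    for k in range(n_steps):
        t = k / n_steps  # time variable driving the FM denoiser
        # gradient descent on the data-fidelity term 0.5 * ||A x - y||^2
        grad = adjoint_op(forward_op(x) - y)
        x = x - lr * grad
        # reproject / denoise along the flow's time schedule
        x = denoise(x, t)
    return x
```

With a real FM model, the denoiser becomes progressively less aggressive as `t` approaches 1, which is what keeps the scheme both memory-friendly (no backpropagation through the flow) and consistent with the measurements.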

Another significant development is the exploration of optimization in the Bures-Wasserstein space, which has gained popularity due to its connections with variational inference and Wasserstein gradient flows. Recent studies have introduced novel variance-reduced estimators based on control variates, which have been shown to improve optimization bounds and achieve order-of-magnitude improvements over previous methods. These advancements are particularly noteworthy in scenarios where Monte Carlo methods suffer from high variance and poor performance.
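The control-variate idea behind these variance-reduced estimators can be shown on a generic Monte Carlo mean estimate. This is a textbook sketch of the technique, not the Bures-Wasserstein estimator itself: we subtract a correlated function with a known mean, with the coefficient chosen to minimize variance.

```python
import numpy as np

def cv_estimate(f, h, h_mean, samples):
    """Variance-reduced Monte Carlo estimate of E[f(X)] using a
    control variate h with known mean E[h(X)] = h_mean:
        mean(f) - c * (mean(h) - h_mean),
    with c = Cov(f, h) / Var(h), the variance-minimizing choice."""
    fx = f(samples)
    hx = h(samples)
    cov = np.cov(fx, hx)
    c = cov[0, 1] / cov[1, 1]
    return fx.mean() - c * (hx.mean() - h_mean)
```

The variance reduction scales with the squared correlation between `f` and `h`; the cited work constructs control variates tailored to Bures-Wasserstein gradient estimates, where the correlation is high enough to yield order-of-magnitude gains.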

The field has also seen progress in solving the Schrödinger Bridge problem using iterative Markovian fitting procedures. Recent work has demonstrated that combining iterative Markovian fitting with iterative proportional fitting (IPF) can lead to a unified framework for solving Schrödinger Bridge problems under more general settings. This combined approach, termed Iterative Proportional Markovian Fitting (IPMF), has shown both theoretical and practical convergence, opening new avenues for research in this area.
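The IPF half of this framework has a simple discrete, static analogue: Sinkhorn iterations that alternately rescale a Gibbs kernel to match the two prescribed marginals. The sketch below illustrates only this classical IPF component, not the Markovian fitting step or IPMF itself.

```python
import numpy as np

def ipf_sinkhorn(cost, mu, nu, eps=0.1, n_iter=200):
    """Discrete iterative proportional fitting (Sinkhorn): alternately
    rescale the rows and columns of the Gibbs kernel exp(-cost/eps)
    so the resulting coupling matches the marginals mu and nu."""
    K = np.exp(-cost / eps)
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):
        u = mu / (K @ v)       # enforce the first marginal
        v = nu / (K.T @ u)     # enforce the second marginal
    return u[:, None] * K * v[None, :]
```

In the dynamic Schrödinger Bridge setting, the same alternating projection acts on path measures rather than coupling matrices; IPMF interleaves these marginal projections with projections onto Markov processes.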

In the realm of semi-supervised learning, there has been a focus on developing models that seamlessly integrate paired and unpaired data through data likelihood maximization techniques. These models have been shown to connect with inverse entropic optimal transport, allowing for the application of recent advances in computational OT to establish lightweight learning algorithms. This approach has demonstrated effective learning of conditional distributions using both paired and unpaired data simultaneously.

Scalability has also been a major focus, with the introduction of simulation-free approaches for solving the Entropic Unbalanced Optimal Transport (EUOT) problem. These methods derive dual formulations and optimality conditions from stochastic optimal control interpretations, leading to scalable algorithms that do not require expensive simulation costs during training and evaluation.

Noteworthy Papers

  1. PnP-Flow: Plug-and-Play Image Restoration with Flow Matching - This paper introduces a novel algorithm that combines PnP methods with Flow Matching, demonstrating superior results in various imaging tasks.
  2. Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold - The proposed variance-reduced estimator shows significant improvements over previous Bures-Wasserstein methods, particularly in scenarios of high variance.
  3. Diffusion & Adversarial Schrödinger Bridges via Iterative Proportional Markovian Fitting - This work presents a unified framework for solving Schrödinger Bridge problems, combining iterative Markovian fitting with iterative proportional fitting.
  4. Inverse Entropic Optimal Transport Solves Semi-supervised Learning via Data Likelihood Maximization - The proposed learning paradigm seamlessly integrates paired and unpaired data, leveraging advances in computational OT for efficient learning.
  5. Scalable Simulation-free Entropic Unbalanced Optimal Transport - This paper introduces a simulation-free algorithm for solving EUOT, significantly improving scalability in generative modeling tasks.

Sources

PnP-Flow: Plug-and-Play Image Restoration with Flow Matching

Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold

Diffusion & Adversarial Schrödinger Bridges via Iterative Proportional Markovian Fitting

Inverse Entropic Optimal Transport Solves Semi-supervised Learning via Data Likelihood Maximization

Scalable Simulation-free Entropic Unbalanced Optimal Transport

Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions

Neural Sampling from Boltzmann Densities: Fisher-Rao Curves in the Wasserstein Geometry

Diffusion State-Guided Projected Gradient for Inverse Problems

Improving Neural Optimal Transport via Displacement Interpolation

Online Posterior Sampling with a Diffusion Prior

Overcoming False Illusions in Real-World Face Restoration with Multi-Modal Guided Diffusion Model

Towards Unsupervised Blind Face Restoration using Diffusion Prior

Learning Efficient and Effective Trajectories for Differential Equation-based Image Restoration

Leveraging Multimodal Diffusion Models to Accelerate Imaging with Side Information

ReFIR: Grounding Large Restoration Models with Retrieval Augmentation

Score-Based Variational Inference for Inverse Problems

Log-concave Sampling over a Convex Body with a Barrier: a Robust and Unified Dikin Walk

A noise-corrected Langevin algorithm and sampling by half-denoising

InstantIR: Blind Image Restoration with Instant Generative Reference

Through the Looking Glass: Mirror Schrödinger Bridges

BELM: Bidirectional Explicit Linear Multi-step Sampler for Exact Inversion in Diffusion Models

On Barycenter Computation: Semi-Unbalanced Optimal Transport-based Method on Gaussians

Generalizing Stochastic Smoothing for Differentiation and Gradient Estimation
