Advances in Generative Models and Information Theory

Recent research at the intersection of generative models and information theory shows significant progress, particularly in sampling techniques and information decomposition. The field is moving toward more efficient and generalized sampling methods, with a focus on improving the quality and diversity of generated samples. Score-based generative models and neural operators have demonstrated strong generalization, enabling the prediction of score functions for unseen probability distributions. The integration of optimal transport theory with diffusion models offers new ways to address prior-distribution mismatch, thereby improving the sampling process. Null models for information theory now permit more meaningful comparisons across complex systems, strengthening the robustness of information-theoretic analyses. Notably, feature-guided score diffusion and the complete decomposition of KL error via refined information have further advanced the field, enabling more accurate conditional density sampling and more efficient use of data in generative tasks. Collectively, these advances signal a shift toward more sophisticated and versatile models that can handle a wider range of distributions and data types, with potential applications in few-shot learning and zero-shot conditional sampling.
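To make the core idea behind score-based sampling concrete, the following is a minimal illustrative sketch (not taken from any of the papers listed below): unadjusted Langevin dynamics driven by a known score function. For a Gaussian target N(mu, sigma^2) the score grad log p(x) = -(x - mu)/sigma^2 is available in closed form; in a learned generative model, a neural network would stand in for this function.

```python
import numpy as np

# Closed-form score of a 1-D Gaussian target N(mu, sigma^2).
# In a score-based generative model, a trained network replaces this.
def score(x, mu=2.0, sigma=0.5):
    return -(x - mu) / sigma**2

def langevin_sample(n_steps=5000, step=1e-3, n_chains=1000, seed=0):
    """Unadjusted Langevin dynamics: x <- x + step * score(x) + sqrt(2*step) * noise."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_chains)  # initialize chains from N(0, 1)
    for _ in range(n_steps):
        noise = rng.standard_normal(n_chains)
        x = x + step * score(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
# Empirical mean and std should approach the target's mu = 2.0, sigma = 0.5
print(samples.mean(), samples.std())
```

With a small enough step size and enough iterations, the chain's stationary distribution is close to the target; the discretization introduces a small bias that more refined schemes (e.g., the splitting schemes for Langevin processes discussed below) aim to control.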

Sources

A phase transition in sampling from Restricted Boltzmann Machines

Score Neural Operator: A Generative Model for Learning and Generalizing Across Multiple Probability Distributions

The velocity jump Langevin process and its splitting scheme: long time convergence and numerical accuracy

Null models for comparing information decomposition across complex systems

Feature-guided score diffusion for sampling conditional densities

A Complete Decomposition of KL Error using Refined Information and Mode Interaction Selection

Training Neural Samplers with Reverse Diffusive KL Divergence

Information-theoretic Analysis of the Gibbs Algorithm: An Individual Sample Approach

Solving Prior Distribution Mismatch in Diffusion Models via Optimal Transport

Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers
