Recent work at the intersection of generative models and information theory has advanced two fronts in particular: sampling techniques and information decomposition. The field is moving toward more efficient and more general sampling methods, with an emphasis on improving the quality and diversity of generated samples. Score-based generative models built on neural operators have demonstrated strong generalization, predicting score functions for probability distributions unseen during training. In parallel, integrating optimal transport theory with diffusion models offers new ways to correct prior-distribution mismatch and thereby improve the sampling process. The introduction of null models for information theory enables more meaningful comparisons across complex systems and makes information-theoretic analyses more robust. Notably, feature-guided score diffusion and a complete decomposition of KL error in terms of refined information have further advanced the field, enabling more accurate conditional density sampling and more data-efficient generative modeling. Together, these advances point toward more sophisticated and versatile models that handle a wider range of distributions and data types, with potential applications in few-shot learning and zero-shot conditional sampling.
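To make the score-based sampling idea concrete, the sketch below implements unadjusted Langevin dynamics, the basic sampler underlying score-based generative models: given an estimate of the score function ∇ₓ log p(x), iterated noisy gradient steps produce approximate samples from p. This is a minimal illustration, not the method of any specific work summarized above; the Gaussian target, step size, and step count are illustrative assumptions (in practice the closed-form score would be replaced by a learned network).

```python
import numpy as np

def langevin_sample(score_fn, x0, step_size=1e-2, n_steps=1000, rng=None):
    """Draw an approximate sample from p via unadjusted Langevin dynamics.

    score_fn: estimate of the score, grad_x log p(x).
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        # x_{t+1} = x_t + (eps / 2) * score(x_t) + sqrt(eps) * z,  z ~ N(0, I)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * noise
    return x

# Illustrative target: a standard Gaussian, whose score -x is known in closed form.
gaussian_score = lambda x: -x
sample = langevin_sample(gaussian_score, x0=np.zeros(2), rng=0)
```

The generalization results mentioned above concern replacing `score_fn` with a neural operator that predicts scores for distributions not seen during training; the sampling loop itself is unchanged.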