Recent advances in generative modeling reflect a marked shift toward improving the efficiency and quality of sample generation. A notable trend is the refinement of diffusion models, where innovations such as noise level correction and non-normal diffusion processes are being explored to improve sample quality and broaden flexibility in model design. Normalizing flows are also re-emerging as powerful generative models, with new architectures and training techniques pushing the boundaries of likelihood estimation and sample diversity. The adoption of deterministic ODE-based samplers and flow matching models is further contributing to more efficient and accurate sampling. In addition, analyses of model collapse in rectified flow models, together with mitigation strategies, are addressing critical issues in iterative retraining, helping sustain performance and efficiency across successive training rounds. Collectively, these developments are driving the field toward more robust, efficient, and versatile generative models.
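
To make the deterministic-sampling idea concrete, the sketch below shows a generic flow matching setup on toy 2-D data: a small network is trained to match the constant velocity of the straight-line path between noise and data, and samples are then drawn by integrating the learned ODE with a fixed-step Euler solver. This is a minimal illustration, not the method of any specific paper surveyed here; the names (`VelocityNet`, `flow_matching_loss`, `euler_sample`), architecture, and hyperparameters are all illustrative assumptions.

```python
# Minimal flow matching sketch (illustrative; names and hyperparameters are assumptions).
import torch
import torch.nn as nn


class VelocityNet(nn.Module):
    """Small MLP predicting a velocity v(x_t, t) for 2-D toy data."""

    def __init__(self, dim: int = 2, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t], dim=-1))


def flow_matching_loss(model: nn.Module, x1: torch.Tensor) -> torch.Tensor:
    """Regress the model onto the constant velocity (x1 - x0) of the
    straight-line path x_t = (1 - t) * x0 + t * x1 from noise to data."""
    x0 = torch.randn_like(x1)               # noise endpoint
    t = torch.rand(x1.shape[0], 1)          # uniform time in [0, 1]
    xt = (1.0 - t) * x0 + t * x1            # point on the linear interpolation
    target = x1 - x0                        # target velocity along the path
    return ((model(xt, t) - target) ** 2).mean()


@torch.no_grad()
def euler_sample(model: nn.Module, n: int, dim: int = 2, steps: int = 50) -> torch.Tensor:
    """Deterministic ODE-based sampling: integrate dx/dt = v(x, t) with fixed-step Euler."""
    x = torch.randn(n, dim)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((n, 1), i * dt)
        x = x + dt * model(x, t)
    return x


if __name__ == "__main__":
    model = VelocityNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):                           # brief toy training loop
        batch = torch.randn(256, 2) * 0.5 + 2.0    # stand-in "data" distribution
        loss = flow_matching_loss(model, batch)
        opt.zero_grad()
        loss.backward()
        opt.step()
    samples = euler_sample(model, n=1000)
    print(samples.mean(dim=0))                     # should drift toward the data mean (~2.0)
```

Because the interpolation paths are straight, the learned ODE can be integrated accurately with few Euler steps, which is the efficiency argument behind deterministic samplers and rectified-flow-style models; the sketch above only illustrates that mechanism, not any particular paper's training recipe.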