Advances in Neural Network Approximation and Generative Models

The field of neural networks is rapidly advancing, with a focus on improving the approximation capabilities of various architectures. Recent research has shown that transformers can overcome the curse of dimensionality when approximating suitably regular function classes, and new techniques have been proposed to enhance the expressive power of these models. For instance, normalizing flows and autoregressive models have shown promise for latent-space Bayesian optimization and generative modeling, while conformalized methods now provide uncertainty quantification and statistical guarantees for synthetic data generation. Noteworthy papers include: Transformers Can Overcome the Curse of Dimensionality, which studies the approximation of Hölder continuous functions by transformers; Latent Bayesian Optimization via Autoregressive Normalizing Flows, which proposes a normalizing-flow-based approach to Bayesian optimization; and Statistical Guarantees in Synthetic Data through Conformal Adversarial Generation, which presents a framework for generating synthetic data with provable statistical guarantees.
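
To make the coverage-guarantee idea behind the conformalized methods concrete, the sketch below shows generic split conformal prediction: hold out a calibration set, take the empirical quantile of the model's absolute residuals, and use it as an interval half-width. This is a minimal illustration of the general conformal mechanism, not the specific procedure of the cited Conformalized-KANs or conformal adversarial generation papers; the toy data, the polynomial surrogate model, and alpha = 0.1 are assumptions for the example.

import numpy as np

# Minimal split-conformal sketch (generic illustration, not the cited papers' method).
# Any fitted regressor works here; a polynomial fit stands in for a KAN, transformer,
# or generative surrogate. Coverage >= 1 - alpha holds without distributional assumptions.
rng = np.random.default_rng(0)

# Toy data: y = sin(x) + noise.
x = rng.uniform(-3, 3, size=600)
y = np.sin(x) + 0.2 * rng.standard_normal(600)

# Split into a fitting set and a calibration set.
fit_idx, cal_idx = np.arange(0, 400), np.arange(400, 600)
coefs = np.polyfit(x[fit_idx], y[fit_idx], deg=5)
predict = lambda t: np.polyval(coefs, t)

# Calibration: (1 - alpha) quantile of residuals with the finite-sample correction.
alpha = 0.1
residuals = np.abs(y[cal_idx] - predict(x[cal_idx]))
n = residuals.size
q = np.quantile(residuals, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new input: [f(x) - q, f(x) + q].
x_new = 1.5
lo, hi = predict(x_new) - q, predict(x_new) + q
print(f"90% interval at x={x_new}: [{lo:.3f}, {hi:.3f}]")

The same recipe extends to synthetic-data settings by replacing the absolute residual with a task-appropriate nonconformity score, which is the lever the conformalized approaches above exploit to obtain provable coverage.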

Sources

Transformers Can Overcome the Curse of Dimensionality: A Theoretical Study from an Approximation Perspective

On Dimension-Free Transformer: An Application of STP to AI

Latent Bayesian Optimization via Autoregressive Normalizing Flows

Kolmogorov-Arnold Networks: Approximation and Learning Guarantees for Functions and their Derivatives

Conformalized-KANs: Uncertainty Quantification with Coverage Guarantees for Kolmogorov-Arnold Networks (KANs) in Scientific Machine Learning

Universal Approximation with Softmax Attention

Learning Energy-Based Generative Models via Potential Flow: A Variational Principle Approach to Probability Density Homotopy Matching

Hyper-Transforming Latent Diffusion Models

Provable wavelet-based neural approximation

Statistical Guarantees in Synthetic Data through Conformal Adversarial Generation
