Efficiency and Robustness in High-Dimensional Optimization and Regression

Advances in Bayesian Optimization and Robust Regression

Recent work has substantially advanced Bayesian optimization and robust regression, with a focus on efficiency, scalability, and robustness in high-dimensional spaces and under non-Gaussian noise. Key innovations include posterior-sampling methods for Bayesian algorithm execution, hierarchical partitioning of the search space, and robust sparse regression algorithms that tolerate non-isotropic designs. These advances improve computational performance and broaden the applicability of these methods to complex, real-world problems.
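To make the posterior-sampling idea concrete, the sketch below shows the generic pattern behind sample-based Bayesian optimization (e.g. Thompson sampling): draw one function from a Gaussian process posterior over a candidate grid and evaluate at its argmax. This is a minimal illustration using only numpy, not the algorithm of any paper listed below; the RBF lengthscale, noise level, and grid are arbitrary choices for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between two sets of points (n, d) x (m, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def thompson_step(X_obs, y_obs, X_cand, noise=1e-4, rng=None):
    """Draw one GP posterior sample on the candidates and return its argmax.

    This is the core move of Thompson-sampling-style Bayesian optimization:
    the randomness of the posterior draw trades off exploration/exploitation.
    """
    rng = np.random.default_rng(rng)
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    Ks = rbf_kernel(X_obs, X_cand)
    Kss = rbf_kernel(X_cand, X_cand)
    # Standard GP posterior via Cholesky factorization of the Gram matrix.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    # Jitter keeps the covariance numerically positive semi-definite.
    sample = rng.multivariate_normal(mu, cov + 1e-8 * np.eye(len(X_cand)))
    return X_cand[np.argmax(sample)]
```

In a loop, each suggested point is evaluated and appended to `(X_obs, y_obs)`, so successive posterior draws concentrate around the optimum; the papers below refine this basic recipe (e.g. how posterior samples are generated and optimized).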

Noteworthy papers include HiBO, a hierarchical Bayesian optimization algorithm that adaptively partitions the search space and significantly outperforms state-of-the-art methods on high-dimensional benchmarks and real-world tasks. Another, Robust Gaussian Processes via Relevance Pursuit, presents a robust Gaussian process model that effectively handles sparse outliers, demonstrating strong performance across diverse regression and optimization tasks.
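The sparse-outlier setting these robust methods target can be illustrated with a simple trimmed least-squares heuristic: alternate between fitting on presumed inliers and re-flagging the points with the largest residuals as outliers. This is a generic sketch for intuition only, not the relevance-pursuit algorithm or any cited method; the function name and the assumption that the outlier count is known are ours.

```python
import numpy as np

def trimmed_lstsq(X, y, n_outliers, n_iter=20):
    """Alternating least squares / outlier re-identification (trimmed LS).

    Repeatedly fits ordinary least squares on the current inlier set, then
    marks the n_outliers points with the largest absolute residuals as
    outliers for the next fit. Returns (coefficients, outlier mask).
    """
    mask = np.ones(len(y), dtype=bool)  # start with all points as inliers
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        r = np.abs(y - X @ beta)
        mask = np.ones(len(y), dtype=bool)
        mask[np.argsort(r)[-n_outliers:]] = False  # drop largest residuals
    return beta, ~mask
```

With a handful of grossly corrupted responses, the corrupted points dominate the residuals, so the alternation typically converges to the clean least-squares fit; the cited work addresses the much harder cases where the design is non-isotropic or the number of outliers is unknown.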

Sources

Practical Bayesian Algorithm Execution via Posterior Sampling

K-step Vector Approximate Survey Propagation

Optimizing Posterior Samples for Bayesian Optimization via Rootfinding

Data subsampling for Poisson regression with pth-root-link

HiBO: Hierarchical Bayesian Optimization via Adaptive Search Space Partitioning

Robust Sparse Regression with Non-Isotropic Designs

Robust Gaussian Processes via Relevance Pursuit
