mlr3mbo: Bayesian Optimization in R
New modular R package achieves state-of-the-art results against established optimizers such as Optuna and Ax in extensive YAHPO Gym benchmarks.
A research team including Marc Becker, Lennart Schneider, and Bernd Bischl has introduced mlr3mbo, a modular and comprehensive toolbox for Bayesian optimization (BO) within the R programming ecosystem. The package is designed for both applied settings and research, supporting features like single- and multi-objective optimization, multi-point proposals, batch and asynchronous parallelization, and robust error handling. Its modular architecture allows users to either deploy standard BO variants or construct entirely custom algorithms from flexible building blocks, making it particularly valuable for researchers developing novel optimization approaches.
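The building-block design can be sketched with a minimal single-objective example, assuming mlr3mbo and its dependencies (bbotk, paradox, mlr3learners, and the DiceKriging backend) are installed; exact defaults and class names may differ across package versions:

```r
library(bbotk)        # optimization instances and terminators
library(paradox)      # search space definitions
library(mlr3learners) # provides the Gaussian process learner "regr.km"
library(mlr3mbo)

# Black-box objective: minimize a simple quadratic in one variable.
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = (xs$x - 2)^2),
  domain = ps(x = p_dbl(lower = -10, upper = 10)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 25)
)

# Assemble a BO loop from interchangeable building blocks:
# a Gaussian process surrogate, expected improvement as the
# acquisition function, and random search to optimize it.
optimizer = opt("mbo",
  loop_function = bayesopt_ego,
  surrogate = srlrn(lrn("regr.km", control = list(trace = FALSE))),
  acq_function = acqf("ei"),
  acq_optimizer = acqo(opt("random_search", batch_size = 100),
                       terminator = trm("evals", n_evals = 1000))
)

optimizer$optimize(instance)
instance$result  # best configuration found
```

Each component (the surrogate learner, acquisition function, acquisition optimizer, and loop function) can be swapped independently, which is how custom BO variants are assembled from the same pieces.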
To validate its performance, the team conducted extensive empirical evaluations using the surrogate-based benchmark suite YAHPO Gym. They performed a coordinate descent search over mlr3mbo's configuration space to identify robust default settings for different problem types, including purely numeric as well as mixed hierarchical search spaces. The software was benchmarked against a wide range of established optimizers, including HEBO, SMAC3, Ax, and Optuna. The results show that mlr3mbo achieves state-of-the-art performance, positioning it as a competitive and flexible new tool for hyperparameter tuning and black-box optimization tasks.
- Achieves state-of-the-art performance in benchmarks against HEBO, SMAC3, Ax, and Optuna
- Supports single/multi-objective optimization, batch parallelization, and custom algorithm construction
- Provides robust default configurations for numeric and mixed-hierarchical problems via YAHPO Gym analysis
Why It Matters
Provides data scientists and researchers with a powerful, flexible, and high-performing open-source alternative for hyperparameter tuning and complex optimization tasks.