Published on Tue Apr 05 2016

Bayesian Optimization with Exponential Convergence

Kenji Kawaguchi, Leslie Pack Kaelbling, Tomás Lozano-Pérez

Most Bayesian optimization methods require auxiliary optimization, and the existing method with exponential convergence requires delta-cover sampling. Our approach eliminates both requirements and achieves an exponential convergence rate.

Abstract

This paper presents a Bayesian optimization method with exponential convergence that requires neither auxiliary optimization nor delta-cover sampling. Most Bayesian optimization methods require auxiliary optimization: an additional non-convex global optimization problem, which can be time-consuming and hard to implement in practice. Moreover, the existing Bayesian optimization method with exponential convergence requires access to delta-cover sampling, which has been considered impractical. Our approach eliminates both requirements and achieves an exponential convergence rate.
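
For context, the "auxiliary optimization" the abstract refers to is the inner step of a conventional Bayesian optimization loop: at every iteration an acquisition function is maximized over the search space, itself a non-convex global optimization problem. The sketch below illustrates that step with a GP surrogate and a UCB acquisition; it uses scikit-learn and SciPy, all settings are illustrative, and it is not the paper's algorithm.

import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(x):
    # Hypothetical expensive black-box function (1-D for simplicity).
    return -np.sin(3 * x) - x ** 2 + 0.7 * x

bounds = (-1.0, 2.0)
X = np.array([[0.0], [1.5]])                      # initial design points
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(normalize_y=True)
for _ in range(10):
    gp.fit(X, y)

    def neg_ucb(x):
        # Upper confidence bound; maximizing it is the auxiliary
        # (non-convex) inner problem of standard Bayesian optimization.
        mu, sigma = gp.predict(np.atleast_2d(x), return_std=True)
        return -(mu[0] + 2.0 * sigma[0])

    # Multi-start local search as a stand-in for global acquisition optimization.
    starts = np.random.uniform(bounds[0], bounds[1], size=5)
    best = min((minimize(neg_ucb, s, bounds=[bounds]) for s in starts),
               key=lambda r: r.fun)
    x_next = best.x                               # next point to evaluate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best observed value:", y.max(), "at x =", X[y.argmax()][0])

The multi-start search over neg_ucb is exactly the kind of auxiliary global optimization that the proposed method dispenses with.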

Wed May 30 2018
Machine Learning
A Flexible Framework for Multi-Objective Bayesian Optimization using Random Scalarizations
Bayesian optimization techniques for the multi-objective setting are pertinent when evaluations of the functions in question are expensive. Traditional Bayesian methods are aimed at recovering the Pareto front of these objectives. In this work, we propose a strategy based on random scalarizations of the objectives.
Wed Jul 01 2020
Machine Learning
Bayesian Coresets: Revisiting the Nonconvex Optimization Perspective
Bayesian coresets have emerged as a promising approach for implementing scalable Bayesian inference. The Bayesian coreset problem involves selecting a subset of the data samples such that posterior inference using the subset closely approximates posterior inference using the full dataset.
Tue May 22 2018
Machine Learning
Optimization, fast and slow: optimally switching between local and Bayesian optimization
BLOSSOM is the first Bayesian optimization algorithm to select between multiple alternative acquisition functions and traditional local optimization at each step.
Wed Oct 10 2018
Machine Learning
Combining Bayesian Optimization and Lipschitz Optimization
Bayesian optimization and Lipschitz optimization have developed alternative techniques for optimizing black-box functions. In this work, we explore strategies to combine these techniques for better global optimization. We propose ways to use the Lipschitz continuity assumption within traditional BO algorithms.
Mon Oct 21 2013
Machine Learning
Stochastic Gradient Descent, Weighted Sampling, and the Randomized Kaczmarz algorithm
We obtain an improved finite-sample guarantee on the linear convergence of stochastic gradient descent (SGD). We also discuss importance sampling for SGD more broadly and show how it can improve convergence in other scenarios as well.
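
As a rough illustration of the weighted-sampling idea in this entry (not the paper's analysis), the sketch below runs SGD on a least-squares problem with rows sampled in proportion to their squared norms; with the step size chosen as indicated, the reweighted update coincides with the randomized Kaczmarz iteration. All names and constants are illustrative.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true                                   # consistent linear system

# Importance sampling: pick row i with probability proportional to ||a_i||^2.
row_norms_sq = np.sum(A ** 2, axis=1)
probs = row_norms_sq / row_norms_sq.sum()

# Objective: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
# With this step size, the reweighted SGD update below reduces to the
# randomized Kaczmarz iteration x <- x - (a_i^T x - b_i) a_i / ||a_i||^2.
step = n / row_norms_sq.sum()

x = np.zeros(d)
for _ in range(3000):
    i = rng.choice(n, p=probs)
    grad_i = (A[i] @ x - b[i]) * A[i]            # gradient of the i-th term
    x -= step * grad_i / (n * probs[i])          # unbiased: E[g_i/(n p_i)] = grad f(x)

print("distance to solution:", np.linalg.norm(x - x_true))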
Thu May 21 2020
Machine Learning
Global Optimization of Gaussian Processes