Published on Tue Nov 19 2013

Gaussian Process Optimization with Mutual Information

Emile Contal, Vianney Perchet, Nicolas Vayatis


Abstract

In this paper, we analyze a generic algorithm scheme for sequential global optimization using Gaussian processes. The upper bounds we derive on the cumulative regret for this generic algorithm improve the previously known bounds for algorithms like GP-UCB by an exponential factor. We also introduce the novel Gaussian Process Mutual Information algorithm (GP-MI), which significantly improves these upper bounds further. We confirm the efficiency of this algorithm on synthetic and real tasks against its natural competitor, GP-UCB, and the Expected Improvement heuristic.
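To make the GP-MI idea concrete, below is a minimal sketch of the selection rule as it is usually stated: at each step the algorithm queries the point maximizing mu(x) + sqrt(alpha) * (sqrt(var(x) + gamma) - sqrt(gamma)), where gamma accumulates the posterior variances of the points queried so far and alpha depends on the confidence level delta. The kernel, the candidate grid, and the toy objective are illustrative assumptions of this sketch, not code from the paper.

```python
import numpy as np

def rbf(A, B, ell=0.5):
    """Squared-exponential kernel; an illustrative choice, not prescribed by the paper."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / ell**2)

def gp_posterior(X, y, Xc, kernel, noise=1e-4):
    """Exact GP posterior mean and variance at the candidate points Xc."""
    K = kernel(X, X) + noise * np.eye(len(X))
    Ks = kernel(X, Xc)
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mu = Ks.T @ a
    var = np.diag(kernel(Xc, Xc)) - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 0.0)

def gp_mi_pick(mu, var, gamma, alpha):
    """GP-MI rule: argmax of mu + sqrt(alpha) * (sqrt(var + gamma) - sqrt(gamma))."""
    score = mu + np.sqrt(alpha) * (np.sqrt(var + gamma) - np.sqrt(gamma))
    i = int(np.argmax(score))
    return i, gamma + var[i]  # gamma accumulates the variance of each chosen point

# Toy run on a 1-d objective (hypothetical setup).
f = lambda X: -np.sin(3.0 * X[:, 0]) - X[:, 0] ** 2 + 0.7 * X[:, 0]
Xc = np.linspace(-1.0, 2.0, 200)[:, None]
X, y = Xc[[0, -1]], f(Xc[[0, -1]])
gamma, alpha = 0.0, np.log(2.0 / 0.05)  # alpha = log(2/delta) with delta = 0.05
for t in range(20):
    mu, var = gp_posterior(X, y, Xc, rbf)
    i, gamma = gp_mi_pick(mu, var, gamma, alpha)
    X, y = np.vstack([X, Xc[i]]), np.append(y, f(Xc[i : i + 1]))
print("best observed value:", y.max())
```

Note how the exploration bonus sqrt(var + gamma) - sqrt(gamma) shrinks as gamma grows, which is the mechanism behind the tighter regret bounds compared with the fixed-width confidence bonus of GP-UCB.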

Wed Jun 27 2012
Machine Learning
Joint Optimization and Variable Selection of High-dimensional Gaussian Processes
Maximizing high-dimensional, non-convex functions through noisy observations is a notoriously hard problem, but one that arises in many applications. In this paper, we tackle this challenge by modeling the unknown function as a sample from a high-dimensional Gaussian process (GP).
Mon Oct 19 2015
Machine Learning
Optimization for Gaussian Processes via Chaining
We generalize the GP-UCB algorithm to arbitrary kernels and search spaces. We use a notion of localized chaining to control the supremum of a Gaussian process. We provide a novel optimization scheme based on the computation of covering numbers.
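The covering numbers mentioned in this snippet can be estimated greedily on a finite candidate set. The sketch below is a generic illustration of that primitive only; the Euclidean metric and the greedy scheme are assumptions of this sketch, not the paper's construction.

```python
import numpy as np

def greedy_epsilon_net(X, eps):
    """Greedy epsilon-net of a finite set X under the Euclidean metric.
    Every point of X lies within eps of some returned center, so the net
    size upper-bounds the covering number N(X, eps)."""
    uncovered = np.ones(len(X), dtype=bool)
    centers = []
    while uncovered.any():
        i = int(np.flatnonzero(uncovered)[0])  # first still-uncovered point
        centers.append(i)
        uncovered &= np.linalg.norm(X - X[i], axis=1) > eps
    return centers

# Example: cover a random 2-d cloud at a few dyadic scales, as chaining would.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
for eps in (0.5, 0.25, 0.125):
    print(eps, len(greedy_epsilon_net(X, eps)))
```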
Mon Jun 12 2017
Machine Learning
Dealing with Integer-valued Variables in Bayesian Optimization with Gaussian Processes
Bayesian optimization (BO) methods are useful for optimizing functions that are expensive to evaluate, lack an analytical expression and can be contaminated by noise. These methods rely on a probabilistic model of the objective function, typically a Gaussian process. We show that this can lead to problems in…
Mon Apr 26 2021
Machine Learning
One-parameter family of acquisition functions for efficient global optimization
Tue Oct 27 2020
Machine Learning
A Computationally Efficient Approach to Black-box Optimization using Gaussian Process Models
We develop a computationally efficient algorithm that reduces the complexity of the prevailing GP-UCB family of algorithms. The algorithm is also shown to have order-optimal performance (up to a poly-logarithmic factor).
Fri Nov 10 2017
Machine Learning
GPflowOpt: A Bayesian Optimization Library using TensorFlow
GPflowOpt is based on the popular GPflow library for Gaussian processes. It leverages the benefits of TensorFlow including automatic differentiation and parallelization. Design goals focus on a framework that is easy to extend with customized acquisition functions and models.
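For flavor, here is a quickstart sketch in the style of the library's documented example from that era (GPflow 0.x / TensorFlow 1). The module paths and signatures are recalled from the project README and may differ across versions, so treat this as an assumption-laden sketch rather than authoritative usage.

```python
import numpy as np
import gpflow
from gpflowopt.domain import ContinuousParameter
from gpflowopt.bo import BayesianOptimizer
from gpflowopt.design import LatinHyperCube
from gpflowopt.acquisition import ExpectedImprovement

def fx(X):
    # Toy objective: quadratic bowl with its minimum at the origin.
    X = np.atleast_2d(X)
    return np.sum(np.square(X), axis=1, keepdims=True)

domain = ContinuousParameter('x1', -2, 2) + ContinuousParameter('x2', -1, 2)

# Space-filling initial design, GP surrogate, Expected Improvement acquisition.
X = LatinHyperCube(6, domain).generate()
Y = fx(X)
model = gpflow.gpr.GPR(X, Y, gpflow.kernels.Matern52(2, ARD=True))
acquisition = ExpectedImprovement(model)

# Run the Bayesian optimization loop for 15 evaluations of fx.
result = BayesianOptimizer(domain, acquisition).optimize(fx, n_iter=15)
print(result)
```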