Published on Wed Oct 19 2011

Gaussian Process Regression Networks

Andrew Gordon Wilson, David A. Knowles, Zoubin Ghahramani

We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the non-parametric flexibility of Gaussian processes. We derive both efficient Markov chain Monte Carlo and variational Bayes inference procedures for this model.

Abstract

We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the non-parametric flexibility of Gaussian processes. This model accommodates input dependent signal and noise correlations between multiple response variables, input dependent length-scales and amplitudes, and heavy-tailed predictive distributions. We derive both efficient Markov chain Monte Carlo and variational Bayes inference procedures for this model. We apply GPRN as a multiple output regression and multivariate volatility model, demonstrating substantially improved performance over eight popular multiple output (multi-task) Gaussian process models and three multivariate volatility models on benchmark datasets, including a 1000 dimensional gene expression dataset.
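A rough generative sketch of such a model, assuming the adaptive-mixture form y(x) = W(x) f(x) + noise in which both the latent node functions and the mixing weights are Gaussian processes; the grid, kernels, length-scales and dimensions below are illustrative choices, not the paper's settings.

```python
# Sketch of a GPRN-style prior draw: outputs are an input-dependent mixture
# of GP node functions, with GP weight functions, so signal correlations
# between outputs vary with the input x. All hyperparameters are illustrative.
import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)      # 1-D input grid
p, q = 2, 3                          # p outputs, q latent node functions
jitter = 1e-8 * np.eye(len(x))

K_f = rbf_kernel(x, lengthscale=0.5) + jitter   # kernel for node functions
K_w = rbf_kernel(x, lengthscale=3.0) + jitter   # smoother kernel for weights

# Latent node functions f_j(x): one GP draw per node, shape (q, N).
f = rng.multivariate_normal(np.zeros(len(x)), K_f, size=q)
# Weight functions W_ij(x): every entry of the p x q mixing matrix is a GP,
# shape (p, q, N), so correlations between outputs change with x.
W = rng.multivariate_normal(np.zeros(len(x)), K_w, size=(p, q))

# Input-dependent mixing: y_i(x) = sum_j W_ij(x) f_j(x) + observation noise.
y = np.einsum('ijn,jn->in', W, f) + 0.1 * rng.standard_normal((p, len(x)))
```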

Tue Feb 20 2018
Machine Learning
The Gaussian Process Autoregressive Regression Model (GPAR)
Multi-output regression models must exploit dependencies between outputs to maximise predictive performance. The Gaussian Process Autoregressive Regression (GPAR) model is a scalable multi-output GP model that captures these dependencies by modelling each output conditionally on the outputs modelled before it. GPAR's efficacy is demonstrated on a variety of synthetic and real-world problems.
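A hedged sketch of the autoregressive factorisation this suggests: each output is modelled conditionally on the inputs and on the previously modelled outputs, with each conditional given a GP. The ordering and noise model below are illustrative.

```latex
% Assumed autoregressive decomposition of the joint over M outputs.
\[
  p\bigl(y_1, \dots, y_M \mid x\bigr)
    = \prod_{m=1}^{M} p\bigl(y_m \mid x,\, y_1, \dots, y_{m-1}\bigr),
\]
\[
  y_m(x) = f_m\bigl(x,\, y_{1:m-1}(x)\bigr) + \varepsilon_m, \qquad
  f_m \sim \mathcal{GP}, \quad \varepsilon_m \sim \mathcal{N}(0, \sigma_m^2).
\]
```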
Tue Oct 27 2015
Machine Learning
Blitzkriging: Kronecker-structured Stochastic Gaussian Processes
Blitzkriging is a new approach to fast inference for Gaussian processes. State-of-the-art stochastic inference for GPs on very large datasets scales cubically in the number of 'inducing inputs', variables introduced to factorise the model; Blitzkriging exploits Kronecker structure to reduce this cost.
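A generic illustration of the Kronecker trick referenced in the title (the paper's exact construction may differ): when inducing inputs lie on a Cartesian product grid, the kernel matrix factorises as a Kronecker product, and matrix-vector products can be computed without ever forming the full matrix. Kernels and grid sizes below are arbitrary.

```python
import numpy as np

def rbf(x, lengthscale=1.0):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

x1 = np.linspace(0.0, 1.0, 30)    # grid along input dimension 1
x2 = np.linspace(0.0, 1.0, 40)    # grid along input dimension 2
K1, K2 = rbf(x1), rbf(x2)

v = np.random.default_rng(0).standard_normal(len(x1) * len(x2))

# Naive product with the full 1200 x 1200 matrix ...
full = np.kron(K1, K2) @ v
# ... versus the structured identity (K1 kron K2) vec(V) = vec(K1 V K2^T),
# where vec stacks rows and V is v reshaped onto the grid.
V = v.reshape(len(x1), len(x2))
fast = (K1 @ V @ K2.T).reshape(-1)

assert np.allclose(full, fast)
```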
Tue Oct 09 2012
Machine Learning
Gaussian process modelling of multiple short time series
We present techniques for effective Gaussian process (GP) modelling of multiple short time series. We propose avoiding over-fitting by constraining the GP length-scale to values that concentrate most of the spectral energy at frequencies below the Nyquist frequency implied by the sampling rate of the data set.
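An illustrative version of this kind of constraint, assuming a squared-exponential kernel and a hypothetical 99% spectral-energy threshold (neither necessarily the paper's choice): the Nyquist argument becomes a simple lower bound on the length-scale.

```python
import numpy as np
from scipy.stats import norm

def min_lengthscale(dt, energy_fraction=0.99):
    """Smallest SE length-scale whose spectral density keeps the given
    fraction of energy below the Nyquist frequency f_N = 1 / (2 * dt).

    The SE kernel has a Gaussian spectral density with standard deviation
    1 / (2 * pi * lengthscale), so P(|f| < f_N) >= energy_fraction
    translates into a lower bound on the length-scale.
    """
    f_nyquist = 1.0 / (2.0 * dt)
    z = norm.ppf(0.5 + energy_fraction / 2.0)   # two-sided Gaussian quantile
    return z / (2.0 * np.pi * f_nyquist)

print(min_lengthscale(dt=1.0))   # unit sampling interval -> length-scale >= ~0.82
```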
Tue Jul 21 2020
Machine Learning
MAGMA: Inference and Prediction with Multi-Task Gaussian Processes
We investigate the problem of forecasting multiple time series. We propose a multi-task Gaussian process framework that models batches of individuals through a shared common mean process, and we provide explicit prediction formulas integrating this common mean process. This approach greatly improves predictive performance far from the observations.
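A hedged sketch of the hierarchical structure a common-mean multi-task GP suggests: a mean process shared by all individuals plus individual-specific deviations and noise. The exact specification in MAGMA may differ; the notation is illustrative.

```latex
\[
  y_i(t) = \mu_0(t) + f_i(t) + \varepsilon_i(t), \qquad i = 1, \dots, M,
\]
\[
  \mu_0 \sim \mathcal{GP}(m_0, k_0), \qquad
  f_i \sim \mathcal{GP}(0, k_i), \qquad
  \varepsilon_i(t) \sim \mathcal{N}(0, \sigma_i^2).
\]
```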
Tue Sep 22 2020
Machine Learning
An Intuitive Tutorial to Gaussian Processes Regression
Gaussian process regression (GPR) models have been widely used in machine learning applications. This tutorial aims to provide an intuitive understanding of Gaussian processes and their use for regression.
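For readers who want the core computation behind GPR in code, here is a minimal, self-contained example of exact GP regression prediction with a squared-exponential kernel; the data and hyperparameters are toy choices, not taken from the tutorial.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

x_train = np.array([-4.0, -2.0, 0.0, 1.0, 3.0])
y_train = np.sin(x_train)
x_test = np.linspace(-5.0, 5.0, 100)
noise_var = 1e-2

# Standard GP regression equations:
#   mean = K_*^T (K + sigma^2 I)^{-1} y
#   cov  = K_** - K_*^T (K + sigma^2 I)^{-1} K_*
K = rbf(x_train, x_train) + noise_var * np.eye(len(x_train))
K_s = rbf(x_train, x_test)
K_ss = rbf(x_test, x_test)

mean = K_s.T @ np.linalg.solve(K, y_train)
cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
```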
Thu Jan 30 2020
Neural Networks
Transport Gaussian Processes for Regression
Gaussian process (GP) priors are non-parametric generative models for Bayesian inference. GP models rely on Gaussianity, an assumption that does not hold in several real-world scenarios. We propose a methodology for constructing stochastic processes with more general, non-Gaussian behaviour under a single unified approach.
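As a generic illustration of building a non-Gaussian process from a GP, here is a simple marginal warping (exp, giving log-normal marginals). This is a toy example of the idea, not the specific transport construction proposed in the paper.

```python
import numpy as np

def rbf(x, lengthscale=1.0):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 200)
K = rbf(x) + 1e-8 * np.eye(len(x))

f = rng.multivariate_normal(np.zeros(len(x)), K)   # Gaussian process draw
y = np.exp(f)   # monotone map: GP dependence kept, marginals no longer Gaussian
```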