Posts Tagged ‘Gaussian processes’

I attended the Bayesian Optimization workshop at NIPS 2015, and the following summarizes the workshop from my perspective. This post primarily serves my own interest in not losing these notes, but it may be useful for others as well.



The workshop was effectively run by Bobak Shahriari and Roberto Calandra. At the beginning of the workshop, Bobak Shahriari gave a brief introduction to Bayesian Optimization (BO), motivating the setting of data-efficient global black-box optimization and the gap that the workshop aimed to address. (more…)


I just returned from the Gaussian Process Summer School in Sheffield, followed by a Workshop on Bayesian Optimization. The aim of the GPSS was to introduce people to Gaussian processes (GPs). This was done through introductory lectures on GP regression, followed by generalizations (e.g., classification, the GP-LVM, sparse GPs, Bayesian optimization) and talks on current GP research.
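As a reminder of the basic setting covered in those introductory lectures, here is a minimal GP-regression sketch (my own toy example, not GPSS material; the RBF kernel, its hyperparameters, and the noise level are arbitrary illustrative choices):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between two sets of inputs.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # Standard GP regression: posterior mean and variance at test inputs Xs,
    # computed via a Cholesky factorization of the noisy kernel matrix.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mean, var

# Fit to noiseless samples of sin(x) and predict at an unseen input.
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel()
Xs = np.array([[2.5]])
mean, var = gp_posterior(X, y, Xs)
```

With training inputs this dense, the posterior mean at 2.5 closely tracks sin(2.5), and the posterior variance stays small and non-negative.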


Andrew Gordon Wilson and Hannes Nickisch:
Kernel Interpolation for Scalable Structured Gaussian Processes
ICML 2015

This paper was clearly one of my highlights at ICML and falls into the category of large-scale kernel machines, one of the trends at this year's conference. Wilson and Nickisch combine the advantages of inducing-point and structure-exploiting (e.g., Kronecker/Toeplitz) approaches.

The key idea behind structured kernel interpolation is (more…)

Daniel Hernandez-Lobato and Jose Miguel Hernandez-Lobato:
Scalable Gaussian Process Classification via Expectation Propagation

The paper is about large-scale Gaussian process classification. Unlike many other approaches, the authors use Expectation Propagation (rather than Variational Inference) for approximate inference. They derive an approximate marginal-likelihood expression that factorizes over the data instances, which allows for distributed inference and training. Training is further sped up by using mini-batches of data.

An interesting part of the paper is that (more…)
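To illustrate the general point about factorization (a generic toy of my own, not the paper's EP energy): if an objective decomposes into a sum of per-instance terms, a rescaled mini-batch gives an unbiased estimate of the full-data objective (and of its gradient), which is what makes stochastic, distributed training possible.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=1000)

def objective(theta, points):
    # A hypothetical per-instance loss; the actual EP objective is different,
    # but shares the key property of summing over individual data points.
    return np.sum((theta - points) ** 2)

theta = 0.3
full = objective(theta, data)

# Unbiased mini-batch estimate: evaluate on a random subset, rescale.
batch = rng.choice(data, size=100, replace=False)
estimate = objective(theta, batch) * (len(data) / len(batch))
```

The mini-batch estimate fluctuates around the full-data value, with variance shrinking as the batch grows.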