Impressions: Gaussian Process Summer School

Posted: 2015-09-18 in conferences, research

I just returned from the Gaussian Process Summer School (GPSS) in Sheffield, followed by a workshop on Bayesian optimization. The aim of the GPSS was to introduce people to Gaussian processes. This was done through introductory lectures on GP regression, followed by generalizations (e.g., classification, the GP-LVM, sparse GPs, Bayesian optimization) and talks on current GP research.

As in many summer schools, the learning curve was steep, but we could get our hands dirty in the labs: IPython notebooks with examples (all within GPy) and exercises were provided, so that we could learn about the effects of kernel length-scales, various acquisition functions for Bayesian optimization, etc. Although I have been working with GPs for a while, I really enjoyed this part: the GPy toolbox abstracts all the complications away and provides an intuitive interface (tab-completion is your friend!) for dealing with regression, classification, Bayesian optimization, etc.
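The labs used GPy, but the underlying machinery is easy to sketch with plain NumPy. The snippet below is a minimal, self-contained illustration (not GPy's API; function names and the toy data are my own) of the length-scale effect we played with in the labs: GP regression with an RBF kernel, where a short length-scale lets the posterior mean wiggle and a long one smooths it out.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel between two sets of 1-D inputs.
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, lengthscale=1.0, noise=1e-2):
    # Posterior mean and variance of a zero-mean GP at test inputs Xs,
    # given noisy observations y at training inputs X.
    K = rbf_kernel(X, X, lengthscale) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs, lengthscale)
    Kss = rbf_kernel(Xs, Xs, lengthscale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mean, var

# Toy data: noisy observations of a sine wave.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 10)
y = np.sin(X) + 0.1 * rng.standard_normal(10)
Xs = np.linspace(0, 2 * np.pi, 50)

# Short length-scale: flexible fit; long length-scale: heavily smoothed.
mean_short, _ = gp_posterior(X, y, Xs, lengthscale=0.3)
mean_long, _ = gp_posterior(X, y, Xs, lengthscale=3.0)
```

In GPy the same experiment is a few lines with a model object; the point of the sketch is only to show what the length-scale hyperparameter does to the posterior.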

This is Nando de Freitas talking about Bayesian optimization:
[photo]

This is a picture of the social event:
[photo]

Overall, the GPSS was a great experience. If you have the chance, try it next year.

Here is the list of talks, including links to the videos.

Introduction to GPs, Neil Lawrence, University of Sheffield

GPs And Latent Variable Models, Neil Lawrence, University of Sheffield

Bayesian Latent Variable Modelling With GPs, Andreas Damianou, University of Sheffield

Kernel Design, Nicolas Durrande, École des Mines de Saint-Étienne

Fitting Covariance and Multi-output Gaussian Processes, Neil Lawrence, University of Sheffield

Sparse Gaussian Processes, James Hensman, University of Sheffield

Gaussian Processes with non-Gaussian Likelihoods, Alan Saul, University of Sheffield

Distributed Gaussian Processes, Marc Deisenroth, Imperial College

Global Optimization with Gaussian Processes, Javier González, University of Sheffield

Bayesian Optimization, Nando de Freitas, University of Oxford and Google DeepMind

Incorporating structural priors in Gaussian random field models, David Ginsbourger, IDIAP and University of Bern

ABC and history matching with GPs, Rich Wilkinson, University of Nottingham
