ICML Impressions

Posted: 2015-07-13 in conferences, research

I just returned from ICML. I’m summarizing a few impressions below.

A) Trends:

  • Deep Learning, obviously. There was a lot of Deep Learning going on at ICML. The corresponding sessions and the workshop easily filled the biggest lecture hall (>1000 people). Neil Lawrence summarized it nicely in a tweet: https://twitter.com/lawrennd/status/619446139876679681
    All other fields suffered somewhat from the Deep Learning enthusiasm. Most of the DL research is pursued by Google and Facebook (and their collaborators), but some independent groups are also working on the topic. My impression is that Deep Learning is slowly entering a saturation phase: besides “bigger data” and “deeper networks”, it has now been applied to more or less all interesting datasets, and only a few papers discussed algorithmic novelties. Let’s see what comes in the next few years.
  • Large-scale kernels. There were many Gaussian process papers dealing with scaling these models up. It seems Deep Learning has pushed the GP (and kernel) community to finally think seriously about scaling their methods to larger data sets. The approaches are quite diverse, ranging from kernel interpolation, structure exploitation, and efficient linear-system solvers to distributed computing. In the large-scale kernel machines workshop on Saturday, somebody talked about Deep Convolutional Kernel Networks, which seem to perform very well on the deep learning benchmark tasks. This is clearly promising, and I’m waiting for follow-up work. (A minimal sketch of the linear-systems idea follows this list.)
  • Reinforcement Learning. Since ICML is the main conference for RL (and EWRL was attached as an ICML workshop), there was quite a bit of RL going on. Topics ranged from regret bounds to robot control (and, of course, some combinations with deep learning…). EWRL attracted a record-size audience.
  • Bayesian optimization and automatic machine learning. Bayesian optimization is getting more attention because it is so useful: applications are all over the place, and pretty much all the big companies use it to tune the hyperparameters of their deep networks. (A toy sketch of the basic loop follows below.)
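
To make the large-scale kernel point concrete, here is a minimal sketch of the “efficiently solving linear systems” idea: kernel/GP regression where the system (K + σ²I)α = y is solved with conjugate gradients, which only needs matrix-vector products, instead of a cubic-cost Cholesky factorization. This is not code from any of the talks; the data, kernel, and parameters are made up for illustration.

```python
# Sketch: conjugate-gradient solve for kernel regression (illustrative only).
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, length_scale, noise = 2000, 0.5, 0.01
X = rng.uniform(-3, 3, size=(n, 1))               # toy 1-D inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Dense RBF kernel matrix, built once for clarity. Structure-exploiting or
# interpolation-based methods would replace this matvec with a faster one.
K = np.exp(-0.5 * (X - X.T) ** 2 / length_scale**2)
A = LinearOperator((n, n), matvec=lambda v: K @ v + noise * v)

alpha, info = cg(A, y)        # iterative solve; info == 0 means CG converged

x_test = np.array([[0.5]])    # posterior mean prediction at a test point
k_star = np.exp(-0.5 * (x_test - X.T) ** 2 / length_scale**2)
print("posterior mean at 0.5:", (k_star @ alpha)[0])
```

The point of the design is that CG never factorizes K, so anything that speeds up the matrix-vector product (Kronecker/Toeplitz structure, kernel interpolation, distributed matvecs) speeds up the whole solve.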
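
And here is a toy sketch of the Bayesian optimization loop itself, in the hyperparameter-tuning setting mentioned above: a GP surrogate over a single hyperparameter (log10 of the learning rate) and an expected-improvement acquisition function. The objective `validation_loss` is a hypothetical stand-in for a full training run; all names and numbers are made up for illustration.

```python
# Sketch: GP-based Bayesian optimization with expected improvement (minimization).
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length_scale=0.3):
    # Squared-exponential kernel between 1-D point sets a and b.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, jitter=1e-4):
    # GP regression posterior mean and standard deviation at x_query.
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_query)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    var = 1.0 - np.sum((K_s.T @ K_inv) * K_s.T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    # EI for minimization: expected reduction below the best loss so far.
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def validation_loss(log_lr):
    # Hypothetical expensive objective: imagine a full training run whose
    # validation loss depends on log10 of the learning rate.
    return (log_lr + 3.0) ** 2 + 0.05 * np.random.randn()

rng = np.random.default_rng(0)
candidates = np.linspace(-6.0, 0.0, 200)   # search over log10(learning rate)
x = rng.uniform(-6.0, 0.0, size=3)         # small random initial design
y = np.array([validation_loss(v) for v in x])

for _ in range(10):                        # fit GP, evaluate the max-EI point
    mu, sigma = gp_posterior(x, y, candidates)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]
    x = np.append(x, x_next)
    y = np.append(y, validation_loss(x_next))

print(f"best log10(lr): {x[np.argmin(y)]:.3f}, loss: {y.min():.4f}")
```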

B) Originality of Research
This was quite an interesting topic: ICML’s deep learning session started with Juergen Schmidhuber questioning the originality of much of the Deep Learning research, arguing that he did pretty much all of this in the 90s (on a smaller scale, of course), or that other people did it even much earlier. His post is quite revealing, and it highlights a problem of our field (not restricted to deep learning): research progresses so quickly that it is impossible to keep up with everything that has been done in the last 150 years, and we keep re-inventing things. Juergen Schmidhuber was clearly a highlight of ICML.

C) Industry Hires
Recently, there have been a lot of industry hires (and acquisitions). For instance, Twitter hired Ryan Adams (Harvard) and Hugo Larochelle (Sherbrooke), plus a few post-docs, by acquiring their Bayesian optimization startup Whetlab. Uber hired Drew Bagnell and Jeff Schneider (both CMU), which I find quite interesting. DeepMind is hiring many people with an RL background (e.g., Martin Riedmiller (Freiburg), Hado van Hasselt (Alberta), Peter Sunehag (ANU), Thore Graepel (MSR)).
