With this blog post, it is our pleasure to unveil the NeurIPS paper awards for 2019, and share more information on the selection process for these awards.

Outstanding Paper Awards

We’re continuing the tradition of highlighting some of the most notable papers accepted at the conference. The NeurIPS 2019 Outstanding Paper Committee comprised Bob Williamson, Michele Sebag, Samuel Kaski, Brian Kingsbury, and Andreas Krause. They arrived at their recommendations as follows.

We asked the Outstanding Paper Committee to choose from the set of papers that had been selected for oral presentation. Before looking at the papers, they agreed on the following criteria to guide their selection.

  • Potential to endure: focused on the main game, not the sidelines; work that people will likely still care about in decades to come.

They also agreed on some criteria that they would like to avoid:

  • Inefficiency: steering away from work that stands out only through resource profligacy, i.e., that achieved a higher league-table ranking largely by squandering huge resources.

Finally, they determined it appropriate to introduce an additional Outstanding New Directions Paper Award, to highlight work that distinguished itself in setting a novel avenue for future research.

They had access to the papers, the reviewer reports and comments from the (senior) area chairs.

They then did an initial triage to produce a short list of three papers and an extended list of eight. Each committee member then evaluated the eight papers independently and ranked them, after which the rankings were shared with the rest of the committee. In one case the committee sought an additional expert opinion and took it into account in its decision making.

Ultimately, the committee members’ independent rankings were in strong agreement, and after a brief discussion they reached the following consensus recommendations for the outstanding paper awards:

Outstanding Paper Award

  • Distribution-Independent PAC Learning of Halfspaces with Massart Noise
    The paper studies the learning of linear threshold functions for binary classification in the presence of unknown, bounded (Massart) label noise in the training data. It makes tremendous progress on a fundamental, long-standing open problem at the heart of machine learning by deriving an efficient algorithm for this setting. To give a simple example highlighted in the paper, even weakly learning disjunctions (to error 49%) under 1% Massart noise was open. The paper shows how to efficiently achieve excess risk equal to the Massart noise level plus epsilon, in time poly(1/epsilon). The algorithmic approach is sophisticated, and the results are technically challenging to establish. The remaining goal is to efficiently achieve excess risk equal to epsilon alone, again in time poly(1/epsilon); a sketch of the noise model and guarantee follows below.
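For readers outside learning theory, here is a hedged sketch of the setting in our own notation (a paraphrase, not the paper’s exact statement): labels come from an unknown halfspace, and each label is flipped with an instance-dependent probability bounded by a constant eta below one half.

```latex
% Massart noise model (paraphrased): for an unknown halfspace w*,
% each label is flipped with instance-dependent probability eta(x),
% bounded by a known constant eta < 1/2:
\[
  \Pr\bigl[\, y \neq \operatorname{sign}(\langle w^{*}, x \rangle) \mid x \,\bigr]
  \;=\; \eta(x) \;\le\; \eta \;<\; \tfrac{1}{2}.
\]
% The algorithm outputs a hypothesis h with misclassification error
\[
  \operatorname{err}_{0\text{-}1}(h) \;\le\; \eta + \epsilon
  \qquad \text{in time } \operatorname{poly}(d, 1/\epsilon),
\]
% whereas the information-theoretic optimum is OPT = E_x[eta(x)] <= eta;
% efficiently achieving OPT + epsilon is the remaining goal noted above.
```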

Outstanding New Directions Paper Award

  • Uniform convergence may be unable to explain generalization in deep learning
    The paper presents what are essentially negative results, showing that many existing (norm-based) bounds on the performance of deep learning algorithms don’t deliver what they claim. It goes on to argue that they cannot do so as long as they lean on the machinery of two-sided uniform convergence (sketched below). While the paper does not solve (nor pretend to solve) the question of generalization in deep neural nets, it is an “instance of the fingerpost” (to use Francis Bacon’s phrase), pointing the community to look in a different place.
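To make the object of this critique concrete, here is a hedged sketch (our notation, not the paper’s) of a generic two-sided uniform convergence bound, which requires the test loss and training loss to stay close simultaneously for every hypothesis in a class:

```latex
% Two-sided uniform convergence over a hypothesis class H: for a training
% sample S drawn from distribution D, with high probability,
\[
  \sup_{h \in \mathcal{H}}
  \bigl|\, L_{\mathcal{D}}(h) - L_{S}(h) \,\bigr|
  \;\le\; \epsilon\bigl(|S|, \mathcal{H}\bigr).
\]
% The paper exhibits SGD-trained networks that generalize well even though
% any bound of this form, even when H is shrunk to the hypotheses the
% algorithm actually outputs, is provably close to vacuous.
```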

The committee members also wanted to highlight the following papers for honorable mentions:

Honorable Mention Outstanding Paper Award

  • Nonparametric Density Estimation & Convergence Rates for GANs under Besov IPM Losses
    The paper shows, in a rigorous theoretical manner, that GANs can outperform linear methods in density estimation (in terms of rates of convergence). Leveraging prior results on wavelet shrinkage, the paper offers new insight into the representational power of GANs. Specifically, the authors derive minimax convergence rates for non-parametric density estimation under a large class of losses (so-called integral probability metrics) within a large function class (Besov spaces). Reviewers felt this paper would have significant impact for researchers working on non-parametric estimation and GANs.
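For context, a hedged sketch of the loss family in question (the standard definition, in our notation): an integral probability metric measures the distance between two distributions through a class of witness functions, here a ball in a Besov space.

```latex
% Integral probability metric (IPM) indexed by a function class F:
\[
  d_{\mathcal{F}}(P, Q)
  \;=\; \sup_{f \in \mathcal{F}}
  \Bigl|\, \mathbb{E}_{X \sim P}[f(X)] - \mathbb{E}_{X \sim Q}[f(X)] \,\Bigr|.
\]
% Different choices of F recover familiar losses; for instance, taking F to
% be the 1-Lipschitz functions gives the Wasserstein-1 distance. The paper
% takes F to be a Besov ball and derives minimax rates under d_F.
```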

Honorable Mention Outstanding New Directions Paper Award

  • Putting An End to End-to-End: Gradient-Isolated Learning of Representations
    The paper revisits the layer-wise building of deep networks, using self-supervised criteria inspired by van den Oord et al. (2018), specifically the mutual information between the representation of the current input and inputs close to it in space or time. As noted by reviewers, such self-organization in perceptual networks might give food for thought at the crossroads of algorithmic perspectives (sidestepping end-to-end optimization with its huge memory footprint and computational issues) and cognitive perspectives (exploiting the notion of so-called slow features and moving toward more “biologically plausible” learning processes). A sketch of the gradient-isolation idea follows below.
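As a companion to the description above, here is a minimal, hedged sketch of gradient-isolated layer-wise training in PyTorch. The toy architecture and the placeholder loss are our own illustrative assumptions, not the authors’ implementation; the point is only that each block optimizes a local objective and no gradient flows between blocks.

```python
import torch
import torch.nn as nn

# Two encoder blocks, each trained with its own optimizer and local loss.
blocks = nn.ModuleList([
    nn.Sequential(nn.Conv1d(1, 32, kernel_size=9, stride=4), nn.ReLU()),
    nn.Sequential(nn.Conv1d(32, 64, kernel_size=9, stride=4), nn.ReLU()),
])
optims = [torch.optim.Adam(b.parameters(), lr=1e-4) for b in blocks]

def local_loss(z):
    # Placeholder for an InfoNCE-style contrastive objective that scores
    # temporally nearby patches against random negatives (details omitted).
    return z.pow(2).mean()

x = torch.randn(8, 1, 1024)  # toy batch of audio-like signals
for block, opt in zip(blocks, optims):
    z = block(x)
    loss = local_loss(z)   # each block has its own self-supervised loss
    opt.zero_grad()
    loss.backward()        # gradients stay inside the current block
    opt.step()
    x = z.detach()         # detach: no gradient flows to earlier blocks
```

In the paper itself, the local objective maximizes mutual information between representations of nearby patches; the detach between blocks is what sidesteps end-to-end backpropagation and its memory footprint.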

Congratulations to all authors for their great contribution to our research community! We also thank the members of the selection committee for taking on what certainly was a very difficult task.

Test of Time Award

As in previous years, we created a committee to select a paper published at NeurIPS 10 years ago that was deemed to have had a particularly significant and lasting impact on our community.

Amir Globerson, Antoine Bordes, Francis Bach and Iain Murray agreed to take on this task, and we are extremely thankful for their meticulous and arduous work.

They started from a list of the 18 papers accepted to NeurIPS 2009 that have received the most citations since their publication. They then focused their search on papers with a sustained impact, meaning that recent papers still meaningfully refer to and build on the work. The committee also wanted to be able to identify a precise contribution to the field that made the selected paper stand out, and for the paper to be well written enough to remain accessible to most of the community today. With these goals in mind, each member of the committee championed a short list of papers.

Ultimately, they identified the following paper that they felt struck the best balance of an important contribution, lasting impact, and broad appeal:

Dual Averaging Method for Regularized Stochastic Learning and Online Optimization
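For readers new to the method, here is a minimal, hedged sketch of the paper’s l1-regularized dual averaging (RDA) update on a toy least-squares problem. The data, the regularization weight lam, and the step-size constant gamma are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.0, 0.5]                 # sparse ground truth
y = X @ w_true + 0.1 * rng.normal(size=1000)

lam, gamma = 0.05, 5.0                        # assumed hyperparameters
w = np.zeros(20)
g_bar = np.zeros(20)                          # running average of gradients
for t in range(1, len(X) + 1):
    x_t, y_t = X[t - 1], y[t - 1]
    g_t = (x_t @ w - y_t) * x_t               # stochastic gradient at w_t
    g_bar += (g_t - g_bar) / t
    # Closed-form RDA step for Psi(w) = lam * ||w||_1 with strongly convex
    # h(w) = 0.5 * ||w||^2 and beta_t = gamma * sqrt(t): soft-threshold the
    # *averaged* gradient, truncating small coordinates to exactly zero.
    w = -(np.sqrt(t) / gamma) * np.sign(g_bar) \
        * np.maximum(np.abs(g_bar) - lam, 0.0)

print(np.nonzero(w)[0])  # typically recovers the sparse support {0, 1, 2}
```

Thresholding the running average of gradients, rather than only the most recent one, is what lets RDA produce genuinely sparse iterates in the stochastic setting, a key contribution of the paper.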

Congratulations to Lin Xiao for single-handedly having had such an enduring impact on our community!

Alina Beygelzimer, Emily Fox, Florence d’Alché-Buc, Hugo Larochelle
NeurIPS 2019 Program Chairs
