NIPS 2005 Workshop on the

Accuracy-Regularization Frontier

Friday, December 9th, 2005
Westin Resort and Spa, Whistler, BC, Canada

Program


The workshop will be held as part of the NIPS (Neural Information Processing Systems) workshop program, in conjunction with the main NIPS conference, December 5th-8th in Vancouver. The workshop will meet in two sessions with a daytime break for informal exchange and/or other activities.

We hope for a dynamic workshop, so the program below is subject to change. In particular, if mid-day discussions cause delays, some of the short talks may be postponed to the times set aside for discussion at the end of the morning or afternoon sessions.

Workshop Program

Morning Session, Friday 7:30AM - 10:30AM
7:30AM
Invited talk: Some Statistical Aspects of Regularization Frontiers, Trevor Hastie (drawing on work with past and present students and colleagues at Stanford)
This talk will be partly speculative and will touch on the following topics:
  • SVMs and the role of regularization
  • boosting and its forward-stagewise regularization path
  • degrees of freedom and inference along paths
  • the lars and svmpath packages for R, and other software; a brief path-fitting sketch in Python follows this list.
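As a rough illustration of the regularization-path ideas above, here is a minimal sketch using scikit-learn's lars_path as a stand-in for the R lars package; the synthetic data and all parameters are illustrative assumptions, not material from the talk:

    import numpy as np
    from sklearn.linear_model import lars_path

    # Synthetic regression data (illustrative only).
    rng = np.random.RandomState(0)
    X = rng.randn(100, 10)
    true_coef = np.zeros(10)
    true_coef[:3] = [2.0, -1.5, 1.0]          # only three active features
    y = X @ true_coef + 0.1 * rng.randn(100)

    # Compute the full lasso regularization path with LARS:
    # alphas are the decreasing regularization levels at which the active
    # set changes; coefs[:, k] holds the coefficients at alphas[k].
    alphas, active, coefs = lars_path(X, y, method="lasso")

    for a, c in zip(alphas, coefs.T):
        print(f"alpha={a:.4f}  nonzero coefficients={np.count_nonzero(c)}")

Each breakpoint of the path corresponds to a feature entering (or leaving) the active set, which is what makes inference along the path, and notions such as degrees of freedom, tractable.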
8:20AM Grouped and Hierarchical Model Selection through Composite Absolute Penalties, Peng Zhao, Guilherme V. Rocha, and Bin Yu
8:40AM
Short talks:
Considering cost asymmetry in learning SVMs, Francis Bach, David Heckerman, Eric Horvitz
Exploring the regularization path for adaptive Gaussian kernel SVMs, Roland Memisevic, Nathan Srebro, Sam Roweis
Training-set method for choosing a regularization/accuracy trade-off, Saharon Rosset, Nathan Srebro
8:55AM Break
9:10AM
Invited talk: Mathematical analysis of regularized online algorithms, Ding Xuan Zhou
In this talk we consider online learning algorithms based on regularization schemes in reproducing kernel Hilbert spaces (RKHS), and present some mathematical analysis. For classification with a general convex loss function, the excess misclassification error is bounded in terms of the choice of the regularization parameter and the step size. For regression with the least squares loss, we consider samples drawn recursively according to a stochastic kernel (a random walk), and carry out the error analysis in the L2 space as well as in the RKHS. The error bounds are given in terms of the approximation power of the RKHS.
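As a rough sketch of the kind of scheme analyzed in the talk (our illustration, not the speaker's code), the following implements regularized online least-squares regression in a Gaussian RKHS; the kernel width sigma, regularization parameter lam, and step-size schedule are all illustrative assumptions:

    import numpy as np

    def gaussian_kernel(x, z, sigma=1.0):
        # Gaussian (RBF) kernel on R^d; sigma is an illustrative choice.
        return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

    def online_rkhs_regression(stream, lam=0.1, eta0=0.5, sigma=1.0):
        # One form of regularized online least squares in an RKHS:
        #   f_{t+1} = (1 - eta_t * lam) f_t - eta_t (f_t(x_t) - y_t) K(x_t, .)
        # with step size eta_t = eta0 / sqrt(t).
        points, coefs = [], []
        for t, (x, y) in enumerate(stream, start=1):
            # Evaluate the current hypothesis f_t at the new input x_t.
            fx = sum(c * gaussian_kernel(x, z, sigma)
                     for c, z in zip(coefs, points))
            eta = eta0 / np.sqrt(t)
            # Shrink existing coefficients (the regularization term) ...
            coefs = [(1.0 - eta * lam) * c for c in coefs]
            # ... and add a new kernel section centered at x_t (the loss term).
            points.append(x)
            coefs.append(-eta * (fx - y))
        return points, coefs

    # Illustrative usage: learn a noisy sine target from a stream.
    rng = np.random.RandomState(0)
    xs = rng.uniform(-3, 3, size=200)
    stream = [(np.array([v]), np.sin(v) + 0.1 * rng.randn()) for v in xs]
    points, coefs = online_rkhs_regression(stream)

How fast the step size decays relative to the regularization parameter is exactly the interplay the error bounds in the talk quantify.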
10:00AM A Convex Approach to Learning the Ridge based on CV, K. Pelckmans, J.A.K. Suykens, B. De Moor
10:20AM Discussion / short talks
Skiing, other activities and posters, 10:30AM - 3:30PM
Afternoon Session, 3:30PM - 6:30PM
3:30PM
Invited tutorial: Bicriterion convex optimization, Lieven Vandenberghe
This tutorial will provide an introduction to bicriterion convex optimization and trade-off analysis. We will review some basic definitions and properties, discuss different parametrizations, and describe methods for computing trade-off curves. We will also cover special techniques for trade-off analysis in linear and quadratic programming. The material will be illustrated with examples from machine learning.
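To make the scalarization approach concrete, here is a minimal sketch (our illustration, not the tutorial's material) that traces the trade-off curve between the two ridge-regression criteria ||Xw - y||^2 and ||w||^2 by minimizing their weighted sum over a grid of weights; the data are synthetic:

    import numpy as np

    # Synthetic least-squares problem (illustrative only).
    rng = np.random.RandomState(0)
    X = rng.randn(50, 5)
    y = X @ rng.randn(5) + 0.1 * rng.randn(50)

    # Scalarization: for each weight gamma > 0, minimize
    #   f1(w) + gamma * f2(w),  with  f1 = ||Xw - y||^2,  f2 = ||w||^2.
    # Each minimizer is a Pareto-optimal point of the bicriterion
    # problem, with closed form w = (X^T X + gamma I)^{-1} X^T y.
    d = X.shape[1]
    for gamma in np.logspace(-3, 3, 7):
        w = np.linalg.solve(X.T @ X + gamma * np.eye(d), X.T @ y)
        f1 = np.sum((X @ w - y) ** 2)    # accuracy criterion
        f2 = np.sum(w ** 2)              # regularization criterion
        print(f"gamma={gamma:9.3f}  f1={f1:9.3f}  f2={f2:9.3f}")

Sweeping gamma from 0 to infinity traces out the (f1, f2) trade-off curve; for convex criteria, Pareto-optimal points arise this way (up to boundary cases), which is the basis of the methods the tutorial surveys.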
4:20PM Basis Pursuit Learning and Multi-Objective Optimization, Martin Brown, Nick Costen, Georgios Papadopoulos
4:40PM Computation of the entire regularization path for SVM in practice, Alexandre Belloni and Katya Scheinberg
5:00PM Break
5:15PM
Short talks:
The Kernel LARS algorithm, Stéphane Canu, Vincent Guigue, Alain Rakotomamonjy and Gilles Gasso
Computing regularization paths for learning multiple kernels, Francis R. Bach, Romain Thibaux, Michael I. Jordan
The support vector decomposition machine, Francisco Pereira and Geoff Gordon
5:30PM Discussion / delayed or additional short talks
