Description
This software learns the best combination of finitely many graphs for semi-supervised learning.
Combining Graph Laplacians for Semi-Supervised Learning.
Code
The Matlab code is available here. The code also includes implementations of a few image transformations such as tangent distances.
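For illustration only, here is a minimal Python/NumPy sketch of semi-supervised prediction with a convex combination of graph Laplacians. It is not the released Matlab code: the k-nearest-neighbour graph construction, the fixed weights mus, and the ridge parameter eps are assumptions made for the example.

import numpy as np

def knn_laplacian(X, k):
    """Unnormalized Laplacian of a symmetrized k-nearest-neighbour graph."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        neighbours = np.argsort(d2[i])[1:k + 1]   # skip the point itself
        W[i, neighbours] = 1.0
    W = np.maximum(W, W.T)                        # symmetrize
    return np.diag(W.sum(axis=1)) - W

def combined_laplacian_predict(X, y_labeled, labeled_idx, mus, ks, eps=1e-3):
    """Predict on all points with the kernel K = (sum_j mu_j L_j + eps I)^{-1},
    where each L_j is the Laplacian of a different candidate graph."""
    n = X.shape[0]
    L = sum(mu * knn_laplacian(X, k) for mu, k in zip(mus, ks))
    K = np.linalg.inv(L + eps * np.eye(n))
    K_ll = K[np.ix_(labeled_idx, labeled_idx)]
    alpha = np.linalg.solve(K_ll + 1e-6 * np.eye(len(labeled_idx)), y_labeled)
    return K[:, labeled_idx] @ alpha              # scores on all points

In the paper the combination weights are themselves learned by minimizing a convex objective over the simplex; the sketch above only evaluates one fixed combination.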
Learning the kernel continuously
Description
With this method, one can learn a convex combination of continuously parameterized kernels, for example Gaussian kernels whose parameter ranges over a given interval.
Learning Convex Combinations of Continuously Parameterized Basic Kernels.
A DC-Programming Algorithm for Kernel Selection.
Code (using DC programming).
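As a rough illustration (not the DC-programming algorithm of the papers above, which optimizes over a continuous parameter range), the Python/NumPy sketch below learns a convex combination of Gaussian kernels over a finite grid of bandwidths, alternating kernel ridge regression with a heuristic multiplicative update of the weights. The grid sigmas, the ridge parameter lam, and the update rule are assumptions made for the example.

import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian kernel exp(-sigma * ||x - z||^2) between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sigma * d2)

def learn_kernel_combination(X, y, sigmas, lam=0.1, n_iter=50):
    """Learn simplex weights mu over the kernels K_j and the ridge coefficients."""
    n, m = X.shape[0], len(sigmas)
    Ks = [gaussian_kernel(X, X, s) for s in sigmas]
    mu = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        K = sum(w * Kj for w, Kj in zip(mu, Ks))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # a kernel that explains more of the current fit gets more weight
        scores = np.array([alpha @ Kj @ alpha for Kj in Ks])
        mu = mu * np.maximum(scores, 1e-12)
        mu = mu / mu.sum()                  # renormalize onto the simplex
    K = sum(w * Kj for w, Kj in zip(mu, Ks))
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return mu, alpha

def predict(X_train, X_test, sigmas, mu, alpha):
    K_test = sum(w * gaussian_kernel(X_test, X_train, s)
                 for w, s in zip(mu, sigmas))
    return K_test @ alpha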
Multi-task feature learning
Description
This is a method for learning multiple tasks simultaneously, under the assumption that the tasks share a set of common features. It is based on regularizing the spectrum of the matrix of task parameters; regularization with the trace norm is one example of such a method. A simple instance is sketched after the note below.
Convex Multi-Task Feature Learning.
A Spectral Regularization Framework for Multi-Task Structure Learning.
School data [H. Goldstein. Multilevel modelling of survey data. The Statistician, 40:235, 1991].
The data set from [Lenk et al.] is not in the public domain. Please request it from the authors of that paper.
Note: to use a nonlinear kernel, one can run the above code on the Gram matrix after preprocessing it with a Gram-Schmidt or Cholesky decomposition (see Convex Multi-Task Feature Learning).
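For illustration, the Python/NumPy sketch below solves one simple instance of this framework: square-loss multi-task regression with a trace-norm penalty, by proximal gradient with singular-value soft-thresholding. The shared design matrix, the step size, and the regularization parameter lam are assumptions made for the example; the released code implements the authors' own algorithms, not this one. The last helper illustrates the Cholesky preprocessing mentioned in the note above.

import numpy as np

def svd_soft_threshold(W, tau):
    """Proximity operator of tau * (trace norm): soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def multitask_trace_norm(X, Y, lam=0.1, n_iter=500):
    """Minimize ||X W - Y||_F^2 / (2 n) + lam * ||W||_* over the d x T matrix W,
    whose columns are the parameter vectors of the T tasks (shared inputs X)."""
    n, d = X.shape
    T = Y.shape[1]
    W = np.zeros((d, T))
    step = n / (np.linalg.norm(X, 2) ** 2)   # 1 / Lipschitz constant of the loss
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Y) / n
        W = svd_soft_threshold(W - step * grad, step * lam)
    return W

def gram_to_features(K):
    """Cholesky preprocessing from the note: rows of the returned factor are
    explicit features whose inner products reproduce the (PSD) Gram matrix K."""
    return np.linalg.cholesky(K + 1e-10 * np.eye(K.shape[0]))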
Accelerated optimization for composite regularizers
Description
This optimization method solves regularization problems whose regularizer has the form R(Bx), where R is a nonsmooth function with an easy-to-compute proximity operator and B is a linear map. It combines proximal methods with acceleration.
Efficient First Order Methods for Linear Composite Regularizers.
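As an illustration of this class of problems (not the algorithm of the paper), the Python/NumPy sketch below applies a FISTA-style accelerated proximal gradient method to min_x ||Ax - b||^2 / 2 + lam * ||Bx||_1, computing the proximity operator of the composite term approximately by projected gradient ascent on its dual. The square loss, the l1 regularizer, the step sizes, and the iteration counts are assumptions made for the example.

import numpy as np

def prox_l1_of_B(v, B, lam, n_inner=100):
    """Approximate prox of x -> lam * ||Bx||_1 at v, via projected gradient
    ascent on the dual problem  max_{||u||_inf <= lam}  u'Bv - ||B'u||^2 / 2."""
    u = np.zeros(B.shape[0])
    step = 1.0 / (np.linalg.norm(B, 2) ** 2 + 1e-12)
    for _ in range(n_inner):
        u = u + step * (B @ v - B @ (B.T @ u))
        u = np.clip(u, -lam, lam)            # project onto the box ||u||_inf <= lam
    return v - B.T @ u

def accelerated_composite(A, b, B, lam=0.1, n_iter=200):
    """FISTA-style accelerated proximal gradient for
       min_x ||Ax - b||^2 / 2 + lam * ||Bx||_1, with an inexact prox step."""
    x = z = np.zeros(A.shape[1])
    t = 1.0
    step = 1.0 / (np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iter):
        x_new = prox_l1_of_B(z - step * (A.T @ (A @ z - b)), B, step * lam)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

When B is the identity, the inner loop is unnecessary and the prox step reduces to ordinary soft-thresholding; the general case relies only on R having a cheap proximity operator, as in the setting described above.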