The kernelized Stochastic Batch Perceptron (SBP), described in our ICML'12
paper, is a fast kernel SVM optimization algorithm with training runtime
guarantees better than those of any other known approach. To obtain good
*testing* runtime, one must find sparse solutions, which, as we show in our
ICML'13 paper, can be derived from an SBP solution (or indeed any approximate
SVM solution) using a computationally inexpensive post-processing step.
Together, these two algorithms yield a kernelized SVM optimization procedure
with the best known training runtime, measured in terms of the number of
kernel evaluations required to achieve a desired level of generalization
performance, and the best *possible* solution sparsity (up to a small constant
factor) among solutions supported on the training set.

The source code on this web page implements both of these algorithms, and, when compiled, will create a set of command-line tools suitable for basic usage. These tools should be regarded primarily as example programs: to get full use of our code, one should call our optimization routines directly from C++.

- Andrew Cotter
- Shai Shalev-Shwartz
- Nathan Srebro

- Andrew Cotter, Shai Shalev-Shwartz, Nathan Srebro. "The Kernelized Stochastic Batch Perceptron". International Conference on Machine Learning. 2012. (ICML pdf, slides, poster) (long pdf)
- Andrew Cotter, Shai Shalev-Shwartz, Nathan Srebro. "Learning Optimally Sparse Support Vector Machines". International Conference on Machine Learning. 2013. (ICML pdf, slides, poster) (long pdf)

The implementation is in C++. All development and testing was done under Linux. To compile, you will need the Boost libraries.

All source code is copyrighted and licensed under the GPLv3.

Source code: issvm_src.tgz (63K), issvm_src.zip (117K)

At the moment, this implementation supports the SBP and SMO SVM optimization algorithms, as well as the kernelized Perceptron and the "sparsifier" of our ICML'13 paper.