Overview
========

The OOPS package contains implementations of several online solvers
for Support Vector Machines (SVMs) with a linear kernel. Specifically,
it implements optimistic and non-optimistic variants of the proximal
SVM solver of (Do et al. 2009), the adaptive online gradient descent
algorithm of (Bartlett et al. 2008), and the PEGASOS algorithm
(Shalev-Shwartz et al. 2007).

All algorithms operate in the primal and solve the following
optimization problem, where the x_i are the training examples, the y_i
are the corresponding labels (+1 or -1), and <w,x_i> denotes the
Euclidean inner product between the vectors w and x_i:

  min_w  (1/2) lambda ||w||^2 + (1/m) sum_{i=1}^{m} max(0, 1 - y_i <w,x_i>)
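For concreteness, the objective above can be evaluated directly. The
following is a minimal Python sketch (the dataset, weight vector and
lambda value are made up for illustration; this is independent of the
package itself):

```python
# Evaluate the primal SVM objective:
#   (1/2) * lam * ||w||^2 + (1/m) * sum_i max(0, 1 - y_i * <w, x_i>)
# Examples are dense lists of floats; pure Python, no dependencies.

def svm_objective(w, xs, ys, lam):
    m = len(xs)
    reg = 0.5 * lam * sum(wj * wj for wj in w)
    hinge = sum(max(0.0, 1.0 - y * sum(wj * xj for wj, xj in zip(w, x)))
                for x, y in zip(xs, ys))
    return reg + hinge / m

# Tiny made-up dataset: two 2-dimensional examples.
xs = [[1.0, 0.0], [0.0, 1.0]]
ys = [1, -1]
w = [1.0, -1.0]
print(svm_objective(w, xs, ys, lam=0.1))  # prints 0.1: both margins equal 1
```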

Note that there is no bias term included. If you wish to use a bias
term, add an additional feature to each training example with value 1.
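This preprocessing step can be sketched in a few lines of Python (the
data layout here is hypothetical; the package reads its own input
format):

```python
# Append a constant feature with value 1 to every example so that the
# last component of the learned w acts as a bias term.

def add_bias_feature(xs):
    return [x + [1.0] for x in xs]

xs = [[0.5, -2.0], [3.0, 1.0]]
print(add_bias_feature(xs))  # [[0.5, -2.0, 1.0], [3.0, 1.0, 1.0]]
```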

The optimistic proximal regularization algorithm of (Do et al. 2009)
typically performs best when the regularization constant lambda is
small.

Any comments or questions regarding this software should be sent to 
Chuan-Sheng Foo (csfoo@cs.stanford.edu).

This package is released under the GNU General Public License (see
license.txt for details).  

Technical references:
---------------------

Peter L. Bartlett, Elad Hazan and Alexander Rakhlin (2008). 
Adaptive online gradient descent. 
In J. Platt, D. Koller, Y. Singer and S. Roweis (Eds.), 
Advances in Neural Information Processing Systems 20, 65–72. MIT Press.

Chuong B. Do, Quoc V. Le and Chuan-Sheng Foo (2009). 
Proximal regularization for online and batch learning. 
Proceedings of the 26th International Conference on Machine Learning 
(pp. 257–264). 

Shai Shalev-Shwartz, Yoram Singer and Nathan Srebro (2007). 
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM.
Proceedings of the 24th International Conference on Machine Learning 
(pp. 807–814).

Compiling the package
=====================

Call "make" to compile the OOPS package; this produces the "oops"
executable for training an SVM and the "predict" executable for
making predictions using a trained SVM.

Training an SVM
===============

To train an SVM model, call "oops" with the appropriate parameters as
shown below. The input file should be in SVM-Light/LibSVM format.
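In SVM-Light/LibSVM format, each line holds a label followed by sparse
index:value pairs. A minimal Python sketch of a parser for one such
line (an illustration of the format, not the package's own reader):

```python
# Parse one line of SVM-Light/LibSVM format:
#   <label> <index1>:<value1> <index2>:<value2> ...
# Returns the label and a dict mapping feature index to value.

def parse_libsvm_line(line):
    tokens = line.split()
    label = int(tokens[0])
    features = {}
    for tok in tokens[1:]:
        idx, val = tok.split(":")
        features[int(idx)] = float(val)
    return label, features

label, feats = parse_libsvm_line("-1 1:0.5 3:2.0")
print(label, feats)  # -1 {1: 0.5, 3: 2.0}
```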

Optimistic Online Proximal SVM solver
-------------------------------------
Usage:
  oops <method> <input_file> <model_file> <l2-weight> <iters>

where:
  method is one of the following:
    pegasos - the algorithm of (Shalev-Shwartz et al. 2007)
    proximal - proximal regularization without the optimistic algorithm
    adaptive - the algorithm of (Bartlett et al. 2008)
    opt-proximal - the optimistic algorithm in (Do et al. 2009) 
    opt-adaptive - an optimistic variant of (Bartlett et al. 2008)

  l2-weight is the regularization parameter lambda

  iters is the number of training examples to process before stopping
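To give a flavor of what the solvers do, one step of the basic PEGASOS
update (Shalev-Shwartz et al. 2007) can be sketched as follows. This
is an illustrative, simplified re-implementation, not the package's
code; the optional projection onto the ball of radius 1/sqrt(lambda)
is omitted.

```python
# One PEGASOS step on example (x, y) at iteration t (1-based):
#   eta_t = 1 / (lam * t)
#   w <- (1 - eta_t * lam) * w          (shrink; always applied)
#   w <- w + eta_t * y * x              (only if the margin is violated)

def pegasos_step(w, x, y, lam, t):
    eta = 1.0 / (lam * t)
    margin = y * sum(wj * xj for wj, xj in zip(w, x))
    w = [(1.0 - eta * lam) * wj for wj in w]   # regularization shrink
    if margin < 1.0:                           # hinge loss is active
        w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

w = pegasos_step([0.0, 0.0], [1.0, 2.0], 1, lam=0.1, t=1)
print(w)  # [10.0, 20.0]
```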

Making predictions using a trained model
========================================

To use a trained model for testing, call "predict" as follows. 
The input file should be in SVM-Light/LibSVM format.

SVM Predict
-----------
Usage:
  predict <model_file> <input_file> <output_file>

The <output_file> will contain the predicted labels, one per line.
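For a linear SVM, the predicted label of an example x is the sign of
<w,x>. A minimal Python sketch (an inner product of exactly 0 is
mapped to +1 here by convention; the package's tie-breaking rule may
differ):

```python
# Predict +1/-1 labels as the sign of the inner product <w, x>.

def predict_labels(w, xs):
    return [1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0.0 else -1
            for x in xs]

w = [1.0, -1.0]
print(predict_labels(w, [[2.0, 1.0], [0.0, 3.0]]))  # [1, -1]
```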
