Pascal Germain, Researcher in Machine Learning

This page is deprecated! My new webpage is here.


We show that the support vector machine (SVM) is a particular case of a more general class of
data-dependent classifiers known as majority votes of sample-compressed classifiers. Using the
PAC-Bayes theory, we obtain risk bounds for these majority votes. We also provide learning 
algorithms for constructing these majority votes that minimize the proposed risk bounds. 
Our numerical experiments indicate that the proposed learning algorithms are competitive with
the SVM.
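The decomposition underlying the abstract can be made concrete: an SVM's decision function is a weighted combination of "voters", each defined by a single support vector (a sample-compressed classifier). The sketch below, a hypothetical illustration using scikit-learn rather than the paper's own code, reconstructs an RBF-kernel SVM's decision function as such a weighted vote and checks it against the library's output.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Train a standard RBF-kernel SVM on synthetic data.
X, y = make_classification(n_samples=100, random_state=0)
gamma = 0.5
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)

def weighted_vote(x):
    # Each support vector x_i acts as one sample-compressed voter:
    # its real-valued vote is k(x_i, x), weighted by its signed dual
    # coefficient alpha_i * y_i (stored in dual_coef_).
    k = np.exp(-gamma * np.sum((clf.support_vectors_ - x) ** 2, axis=1))
    return clf.dual_coef_[0] @ k + clf.intercept_[0]

# The aggregated vote reproduces the SVM decision function exactly.
for x in X[:5]:
    assert np.isclose(weighted_vote(x), clf.decision_function([x])[0])
```

This is only the representational half of the story; the paper's contribution is the PAC-Bayes analysis of such votes, which the sketch does not cover.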

