In this chapter the use of Support Vector Machines for pattern classification is presented. The SVM formulation is constructed starting from a simple linear maximum-margin classifier. The importance of capacity control in avoiding overfitting is discussed. Finally, the claim that SVM training achieves the lowest necessary capacity for a given classification task is investigated.

A general two-class pattern classification problem is posed as follows:

- Given *l* i.i.d. samples (*x*₁, *y*₁), ..., (*x_l*, *y_l*), where *x_i* ∈ ℝ^*d*, for *i* = 1, ..., *l*, is a feature vector of length *d* and *y_i* ∈ {−1, +1} is the class label for data point *x_i*.
- Find a classifier with decision function *f*(*x*) such that *y* = *f*(*x*), where *y* is the class label for *x*.
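As a concrete illustration, a minimal sketch of such a decision function for a linear classifier of the form *f*(*x*) = sign(*w*·*x* + *b*); the weights `w` and bias `b` below are illustrative values chosen for the example, not the result of any training procedure described in this chapter.

```python
def decision_function(x, w, b):
    """Return the predicted class label in {-1, +1} for feature vector x,
    using a linear rule f(x) = sign(w . x + b)."""
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if activation >= 0 else -1

# Hypothetical example with d = 2: points on one side of the line
# x1 - x2 = 0 get label +1, points on the other side get -1.
w, b = [1.0, -1.0], 0.0
print(decision_function([2.0, 1.0], w, b))   # prints 1
print(decision_function([0.0, 3.0], w, b))   # prints -1
```

The choice of *w* and *b*, and in particular the maximum-margin choice among all separating hyperplanes, is the subject of the later sections.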

The performance of the classifier is measured in terms of the classification error, i.e. the probability that *f*(*x*) ≠ *y*.
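On a finite sample, this error can be estimated as the fraction of the *l* data points on which the classifier's prediction disagrees with the true label (the mean 0–1 loss). A small sketch, with illustrative function and variable names not taken from the text:

```python
def classification_error(f, samples):
    """Fraction of (x, y) pairs for which the decision function f
    predicts a label different from the true label y."""
    mistakes = sum(1 for x, y in samples if f(x) != y)
    return mistakes / len(samples)

# Hypothetical 1-d example: classify by the sign of the single feature.
data = [([2.0], 1), ([-1.0], -1), ([0.5], -1)]
f = lambda x: 1 if x[0] >= 0 else -1
print(classification_error(f, data))   # one of three samples misclassified
```

This empirical estimate is the quantity minimised under Empirical Risk Minimisation, discussed below.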

- Empirical Risk Minimisation
- Structural Risk Minimisation
- VC dimension and VC confidence
- Linear Support Vector Machine - A Maximum Margin Classifier
- Extending SVM to a Soft Margin Classifier
- Nonlinear Support Vector Machine
- SVM training and SRM

Thu Sep 10 11:05:30 BST 1998