3.7. Support Vector Machines

3.7.1. Separable Classes

In this section, an alternative rationale for designing linear classifiers will be adopted. We will start with the two-class linearly separable task, and then we will extend the method to more general cases where data are not separable.

Let x_i, i = 1, 2,…, N, be the feature vectors of the training set, X. These belong to either of two classes, ω1, ω2, which are assumed to be linearly separable. The goal, once more, is to design a hyperplane

g(x) = w^T x + w0 = 0    (3.71)

that correctly classifies all the training vectors. As we have already discussed in Section 3.3, such a hyperplane is not unique. The perceptron ...
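As a minimal illustration of the setup above, the sketch below fits a linear SVM to a small, synthetically generated two-class separable set and recovers the hyperplane parameters w and w0. The data, the use of scikit-learn's SVC, and the large value of C (to approximate the hard-margin, separable case) are all assumptions for illustration, not part of the text.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical linearly separable training set: N = 8 vectors in two classes
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(4, 2))  # class omega_1
X2 = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(4, 2))    # class omega_2
X = np.vstack([X1, X2])
y = np.array([-1] * 4 + [1] * 4)

# A linear SVM searches for a separating hyperplane w^T x + w0 = 0;
# a large C approximates the hard-margin (separable) formulation
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

w, w0 = clf.coef_[0], clf.intercept_[0]
print("w =", w, "w0 =", w0)
print("training accuracy:", clf.score(X, y))
```

For a linearly separable set such as this one, the resulting hyperplane classifies every training vector correctly, i.e., sign(w^T x_i + w0) agrees with the class label of each x_i.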
