Chapter 17. Support Vector Machines
17.0 Introduction
To understand support vector machines, we must understand hyperplanes. Formally, a hyperplane is an (n − 1)-dimensional subspace of an n-dimensional space. While that sounds complex, it is actually pretty simple. For example, if we wanted to divide a two-dimensional space, we'd use a one-dimensional hyperplane (i.e., a line). If we wanted to divide a three-dimensional space, we'd use a two-dimensional hyperplane (i.e., a flat piece of paper or a bed sheet). A hyperplane is simply a generalization of that concept into n dimensions.
Support vector machines classify data by finding the hyperplane that maximizes the margin between the classes in the training data. In a two-dimensional example with two classes, we can think of a hyperplane as the widest straight “band” (i.e., line with margins) that separates the two classes.
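To make the geometry concrete, here is the standard formulation from the SVM literature (not spelled out in the recipe itself). A hyperplane is the set of points x satisfying

    w^T x + b = 0

where w is a vector normal to the hyperplane and b is an offset. For two linearly separable classes labeled y_i ∈ {−1, +1}, the hard-margin SVM finds the w and b that maximize the margin 2/‖w‖ subject to y_i(w^T x_i + b) ≥ 1 for every training observation. The width of the "band" described above is exactly this margin.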
In this chapter, we cover training support vector machines in a variety of situations and dive under the hood to look at how we can extend the approach to tackle common problems.
17.1 Training a Linear Classifier
Problem
You need to train a model to classify observations.
Solution
Use a support vector classifier (SVC) to find the hyperplane that maximizes the margin between the classes:
# Load libraries
from sklearn.svm import LinearSVC
from sklearn import datasets
from sklearn.preprocessing import StandardScaler
import numpy as np

# Load data with only two classes and two features
iris = datasets.load_iris()
features ...
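The excerpt is truncated at this point. A minimal sketch of how the recipe plausibly continues, picking up from the imports and the load_iris() call above: it assumes the first 100 Iris observations (the two classes setosa and versicolor), the first two features, standardization with StandardScaler, and a LinearSVC with regularization parameter C=1.0. Treat the slicing and the C value as illustrative rather than the book's exact code:

# Select only the first two classes (rows 0-99) and first two features
features = iris.data[:100, :2]
target = iris.target[:100]

# Standardize features (SVMs are sensitive to feature scale)
scaler = StandardScaler()
features_standardized = scaler.fit_transform(features)

# Create a linear support vector classifier
svc = LinearSVC(C=1.0)

# Train model
model = svc.fit(features_standardized, target)

# Classify a new observation (two standardized feature values)
new_observation = np.array([[-2.0, 3.0]])
model.predict(new_observation)

Because the features were standardized before training, any new observation must be expressed on the same standardized scale before calling predict.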