Recall the SVM optimization problem
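For reference, one standard statement of the soft-margin dual problem (assuming \(n\) training points, labels \(y_i \in \{-1, +1\}\), and regularization parameter \(C\); the earlier formulation is not repeated here) is:
\[
\max_{\alpha} \; \sum_{i=1}^{n} \alpha_i - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \, x_i^T x_j
\quad \text{s.t.} \quad 0 \le \alpha_i \le C, \;\; \sum_{i=1}^{n} \alpha_i y_i = 0
\]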
The data points appear only through the inner products \(x_i^T x_j\)
As long as we can calculate the inner product in the feature space, we do not need the mapping \(\phi\) explicitly
Many common geometric operations (angles, distances) can be expressed by inner products
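For instance, squared distances and angles can both be written purely in terms of inner products:
\[
\|x - x'\|^2 = \langle x, x\rangle - 2\,\langle x, x'\rangle + \langle x', x'\rangle,
\qquad
\cos\theta = \frac{\langle x, x'\rangle}{\sqrt{\langle x, x\rangle\,\langle x', x'\rangle}}
\]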
Define the kernel function \(K\) by: \[K(x_i, x_j) = \phi \left(x_i\right)^T \phi \left(x_j\right)\]
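A minimal sketch of the trick, assuming a degree-2 polynomial kernel \(K(x, z) = (x^T z)^2\) on 2-D inputs (the names `phi` and `K` below are purely illustrative):

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for 2-D input x = (x1, x2):
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

def K(x, z):
    # Kernel evaluated directly in input space: (x^T z)^2
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# The two quantities agree: the kernel returns the feature-space inner
# product without ever forming phi explicitly.
print(np.dot(phi(x), phi(z)))  # 16.0
print(K(x, z))                 # 16.0
```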
sklearn has implementations for a variety of SVM methods (a short usage sketch follows the list):
sklearn.svm.SVC
linear: \(\langle x, x'\rangle\)
polynomial: \((\gamma \langle x, x'\rangle + r)^d\)
rbf: \(\exp(-\gamma \|x-x'\|^2)\)
sigmoid: \(\tanh(\gamma \langle x,x'\rangle + r)\)
sklearn.svm.NuSVC
sklearn.svm.LinearSVC
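A minimal usage sketch of `sklearn.svm.SVC` with the rbf kernel; the toy iris dataset and the parameter values here are arbitrary illustrative choices:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel, gamma, degree, and coef0 (the r in the formulas above) select
# and parameterize the kernel function.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```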