Python Machine Learning - Second Edition

by Sebastian Raschka, Jared Huffman, Vahid Mirjalili, Ryan Sun
September 2017
Intermediate to advanced
622 pages
15h 13m
English
Packt Publishing
Content preview from Python Machine Learning - Second Edition

Using kernel principal component analysis for nonlinear mappings

Many machine learning algorithms make assumptions about the linear separability of the input data. You learned that the perceptron even requires perfectly linearly separable training data to converge. Other algorithms that we have covered so far assume that the lack of perfect linear separability is due to noise: Adaline, logistic regression, and the (standard) SVM, to name just a few.

However, if we are dealing with nonlinear problems, which we may encounter rather frequently in real-world applications, linear transformation techniques for dimensionality reduction, such as PCA and LDA, may not be the best choice. In this section, we will take a look at a kernelized version of PCA, ...
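As a brief illustration of the idea (not part of this excerpt), here is a minimal sketch of kernel PCA applied to a nonlinearly separable toy dataset. It assumes scikit-learn's KernelPCA and the two half-moon dataset from make_moons; the kernel choice and gamma value are illustrative assumptions only:

>>> from sklearn.datasets import make_moons
>>> from sklearn.decomposition import KernelPCA
>>> # two interleaving half-moon classes -- not linearly separable
>>> X, y = make_moons(n_samples=100, random_state=123)
>>> # RBF kernel PCA; the transformed components can make the
>>> # two half-moon classes linearly separable
>>> kpca = KernelPCA(n_components=2, kernel='rbf', gamma=15)
>>> X_kpca = kpca.fit_transform(X)

In the transformed feature space X_kpca, a linear classifier such as logistic regression can then be trained to separate the two classes, which is not possible on the raw half-moon coordinates.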



Publisher Resources

ISBN: 9781787125933