Python Machine Learning By Example - Second Edition

by Yuxi (Hayden) Liu
February 2019
Beginner to intermediate
382 pages
10h 1m
English
Packt Publishing

Best practice 16 – reducing overfitting

We've touched on ways to avoid overfitting while discussing the pros and cons of individual algorithms in the last practice. Here, we formally summarize them (a short code sketch follows the list):

  • Cross-validation, a good habit that we have built over all of the chapters in this book.
  • Regularization. It adds penalty terms to the cost function to discourage the model from fitting the training set perfectly.
  • Simplification, if possible. The more complex the model is, the higher the chance of overfitting. Complex models include a tree or forest with excessive depth, a linear regression with a high-degree polynomial transformation, and an SVM with a complicated kernel.
  • Ensemble learning, combining a collection of weak models to form a stronger one. ...
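The following sketch is not from the book; it is a minimal illustration of the points above using scikit-learn. The dataset, models, and hyperparameters (load_breast_cancer, LogisticRegression with C=0.1, DecisionTreeClassifier with max_depth=3, RandomForestClassifier with 100 trees) are assumptions chosen purely for demonstration:

# Minimal sketch (illustrative only): cross-validation, regularization,
# a simplified model, and an ensemble, evaluated on an example dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Regularization: LogisticRegression applies an L2 penalty by default;
# a smaller C means a stronger penalty on large weights.
regularized_lr = LogisticRegression(C=0.1, max_iter=5000)

# Simplification: cap the tree depth instead of letting it grow fully.
shallow_tree = DecisionTreeClassifier(max_depth=3, random_state=42)

# Ensemble learning: a random forest combines many weak trees into a
# stronger, lower-variance model.
forest = RandomForestClassifier(n_estimators=100, random_state=42)

# Cross-validation: estimate generalization performance with 5 folds
# rather than trusting the training score.
for name, model in [('regularized logistic regression', regularized_lr),
                    ('shallow decision tree', shallow_tree),
                    ('random forest', forest)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f'{name}: mean CV accuracy = {scores.mean():.3f}')

Comparing each model's cross-validated accuracy with its training accuracy is a quick way to spot overfitting: a large gap between the two suggests the model is memorizing the training set.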
