Accelerate Model Training with PyTorch 2.X

by Maicon Melo Alves
April 2024
Intermediate to advanced
230 pages
5h 12m
English
Packt Publishing
Content preview from Accelerate Model Training with PyTorch 2.X

Chapter 6: Simplifying the Model

Have you heard about parsimony? In the context of model estimation, parsimony is the principle of keeping a model as simple as possible. It rests on the assumption that overly complex models (those with a larger number of parameters) tend to overfit the training data, reducing their capacity to generalize and make good predictions.

In addition, simplifying neural networks has two main benefits: it reduces model training time, and it makes the model feasible to run in resource-constrained environments. One approach to simplifying a model is to reduce the number of parameters in the neural network through pruning and compression techniques.
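As a minimal sketch of what parameter pruning looks like in practice, the snippet below applies magnitude-based pruning to a single layer using PyTorch's built-in `torch.nn.utils.prune` utilities. (This is an illustrative assumption on our part; the excerpt cuts off before the chapter's own examples, which may use different techniques or APIs.)

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

torch.manual_seed(0)

# A small fully connected layer standing in for one layer of a larger model.
linear = nn.Linear(in_features=8, out_features=4)

# Zero out the 50% of weights with the smallest L1 magnitude.
prune.l1_unstructured(linear, name="weight", amount=0.5)

# The pruned tensor is now computed from the original weights and a binary
# mask attached to the module; half of its entries are exactly zero.
sparsity = float((linear.weight == 0).sum()) / linear.weight.numel()
print(f"weight sparsity: {sparsity:.0%}")

# Make the pruning permanent: drop the mask and keep the zeroed weights.
prune.remove(linear, "weight")
```

Note that unstructured pruning like this shrinks the effective parameter count but not the tensor shapes, so speedups typically require sparse-aware kernels or a follow-up compression step.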

In this chapter, we show how to simplify a model by reducing the number ...




ISBN: 9781805120100