April 2024
Intermediate to advanced
230 pages
5h 12m
English
Have you heard about parsimony? Parsimony, in the context of model estimation, means keeping a model as simple as possible. This principle stems from the assumption that complex models (models with a larger number of parameters) tend to overfit the training data, thus reducing their capacity to generalize and make good predictions.
In addition, simplifying neural networks has two main benefits: it reduces model training time and makes the model feasible to run in resource-constrained environments. One approach to simplifying a model is to reduce the number of parameters in the neural network by employing pruning and compression techniques.
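As a minimal illustration of the idea, the sketch below implements magnitude-based pruning, one common pruning technique (not necessarily the one this chapter develops): weights whose absolute value falls below a data-derived threshold are zeroed out, shrinking the effective parameter count while leaving the layer's shape intact. The function name and the toy weight matrix are illustrative, not from the book.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out roughly the `sparsity` fraction of weights with smallest |w|.

    `weights` is a 2D list (rows of a weight matrix); ties at the
    threshold may prune slightly more than the requested fraction.
    """
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else float("-inf")
    return [[0.0 if abs(w) <= threshold else w for w in row]
            for row in weights]

# Toy 2x3 weight matrix: pruning at 50% sparsity zeroes the three
# smallest-magnitude entries (-0.05, -0.02, 0.01).
weights = [[0.9, -0.05, 0.4], [-0.02, 0.7, 0.01]]
pruned = prune_by_magnitude(weights, sparsity=0.5)
print(pruned)  # [[0.9, 0.0, 0.4], [0.0, 0.7, 0.0]]
```

In practice a pruned matrix is then stored in a sparse format (or the surviving weights are fine-tuned), which is where the training-time and deployment benefits described above come from.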
In this chapter, we show how to simplify a model by reducing the number ...