April 2022
Intermediate to advanced
366 pages
7h 43m
English
Deep learning models become more accurate as they are given more training data. The most spectacular deep learning models, such as those trained on the ImageNet dataset, learn from millions of images and often require a massive amount of computing power. To put things into perspective, the energy used to train OpenAI's GPT-3 model is reportedly enough to power an entire city. Unsurprisingly, the cost of training such models from scratch is prohibitive for most projects.
This raises the question: do we really need to train a deep learning model from scratch each time? One way around this problem, rather than training a model from scratch, is to borrow representations from an already trained ...
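The idea of borrowing representations can be sketched in a few lines. In the toy example below, a frozen random matrix stands in for the weights of a pretrained feature extractor (in practice these would come from a model trained on a large dataset such as ImageNet), and only a small new classification head is trained on top of it. All names and shapes here are illustrative assumptions, not part of any specific library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" feature extractor. In real transfer learning these
# weights come from a model trained on a large dataset; here a fixed
# random matrix stands in for them (illustrative assumption).
W_frozen = rng.normal(size=(20, 8)) * 0.1

def extract_features(x):
    # Frozen layer: its weights are never updated during fine-tuning.
    return np.tanh(x @ W_frozen)

# Toy binary classification data for the new downstream task.
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Only the new head's parameters are trained.
w_head = np.zeros(8)
b_head = 0.0

feats = extract_features(X)          # computed once; the extractor is fixed
for _ in range(500):
    p = sigmoid(feats @ w_head + b_head)
    grad_w = feats.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w_head -= 0.5 * grad_w
    b_head -= 0.5 * grad_b

# Training accuracy of the cheaply trained head on frozen features.
acc = np.mean((sigmoid(feats @ w_head + b_head) > 0.5) == (y == 1))
```

Because only the small head is optimized, training is far cheaper than fitting the whole network, which is the core economic argument for transfer learning made above.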