May 2018
Beginner
490 pages
13h 16m
English
Anybody who has run MNIST, with its 60,000 handwritten-digit image examples, on a laptop knows that it takes time for a machine learning program to train and test on these examples. Whether you apply a classical machine learning program or a deep convolutional network, it uses a lot of the local machine's resources. Even if you run the training on GPUs (graphics processing units), hoping for better performance than with CPUs, the training process still takes a long time to run through all the learning epochs.
If you go on and want to train your program on more complex images, CIFAR-10, a 60,000-image subset of the Tiny Images dataset, will consume even more resources on your local machine.
Suppose ...