Deep Learning with TensorFlow - Second Edition

by Giancarlo Zaccone, Vihan Jain, Md. Rezaul Karim, Motaz Saad
March 2018
Intermediate to advanced
484 pages
10h 31m
English
Packt Publishing
Content preview from Deep Learning with TensorFlow - Second Edition

Distributed computing

DL models have to be trained on large amounts of data to improve their performance. However, training a deep network with millions of parameters may take days, or even weeks. In Large Scale Distributed Deep Networks, Dean et al. proposed two paradigms, namely model parallelism and data parallelism, which allow us to train and serve a network model across multiple physical machines. In the following sections, we introduce these paradigms with a focus on the distributed capabilities of TensorFlow.
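
Before looking at each paradigm in detail, it helps to see how distributed TensorFlow describes a cluster of machines. The following is a minimal sketch using the TensorFlow 1.x APIs this edition targets (tf.train.ClusterSpec and tf.train.Server); the host names, ports, and job layout are placeholders for illustration, not values taken from the book.

import tensorflow as tf

# Hypothetical cluster description: one parameter server and two
# workers. The host names and ports are placeholders.
cluster = tf.train.ClusterSpec({
    "ps":     ["ps0.example.com:2222"],
    "worker": ["worker0.example.com:2222",
               "worker1.example.com:2222"]
})

# Each physical machine runs one such process and announces which
# job and task it plays in the cluster (here: worker 0).
server = tf.train.Server(cluster, job_name="worker", task_index=0)

# A parameter server process would typically block here with
# server.join(); a worker goes on to build and run its graph.

With the cluster in place, the two paradigms differ only in how the graph is split across those jobs, as the next section shows for model parallelism.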

Model parallelism

Model parallelism gives every processor the same data but has each of them apply a different part of the model to it. If the network model is too big to fit into one machine's memory, different parts of the model can be assigned to different machines. ...
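As an illustration of the idea, the following sketch pins two halves of a small network to two different workers using explicit device placement. It relies on the TensorFlow 1.x tf.device and tf.layers APIs, and the layer sizes and worker addresses are invented for the example rather than drawn from the book.

import tensorflow as tf

# "/job:worker/task:N" refers to servers started from a ClusterSpec
# such as the one in the previous snippet; addresses are placeholders.
x = tf.placeholder(tf.float32, shape=[None, 1024])

# The first, memory-hungry part of the model is placed on worker 0 ...
with tf.device("/job:worker/task:0"):
    hidden = tf.layers.dense(x, 4096, activation=tf.nn.relu)

# ... and the rest on worker 1. Activations (and gradients) travel
# over the network between the two machines on every step.
with tf.device("/job:worker/task:1"):
    logits = tf.layers.dense(hidden, 10)

# A session pointed at one worker's gRPC target can then run the
# split graph: sess = tf.Session("grpc://worker0.example.com:2222")

Data parallelism, by contrast, replicates the whole model on every worker and splits the input batches across them instead.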



ISBN: 9781788831109