Learning TensorFlow

Book description

Roughly inspired by the human brain, deep neural networks trained with large amounts of data can solve complex tasks with unprecedented accuracy. This practical book provides an end-to-end guide to TensorFlow, the leading open source software library that helps you build and train neural networks for computer vision, natural language processing (NLP), speech recognition, and general predictive analytics.

Authors Tom Hope, Yehezkel Resheff, and Itay Lieder provide a hands-on approach to TensorFlow fundamentals for a broad technical audience—from data scientists and engineers to students and researchers. You’ll begin by working through some basic examples in TensorFlow before diving deeper into topics such as neural network architectures, TensorBoard visualization, TensorFlow abstraction libraries, and multithreaded input pipelines. Once you finish this book, you’ll know how to build and deploy production-ready deep learning systems in TensorFlow.

  • Get up and running with TensorFlow, rapidly and painlessly
  • Learn how to use TensorFlow to build deep learning models from the ground up
  • Train popular deep learning models for computer vision and NLP
  • Use extensive abstraction libraries to make development easier and faster
  • Learn how to scale TensorFlow, and use clusters to distribute model training
  • Deploy TensorFlow in a production setting
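As a taste of the "from the ground up" material covered early in the book, softmax regression (Chapter 2) boils down to exponentiating raw scores and normalizing them into a probability distribution. A minimal plain-Python sketch of that computation (illustrative only, not code from the book):

```python
import math

def softmax(logits):
    """Map raw scores to probabilities (shifted by the max for numerical stability)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# The largest logit gets the largest probability; the outputs sum to 1.
print([round(p, 3) for p in softmax([2.0, 1.0, 0.1])])  # → [0.659, 0.242, 0.099]
```

In the book this same idea is expressed as a TensorFlow graph and trained on MNIST, rather than computed by hand as above.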

Table of contents

  1. Preface
  2. 1. Introduction
    1. Going Deep
      1. Using TensorFlow for AI Systems
    2. TensorFlow: What’s in a Name?
    3. A High-Level Overview
    4. Summary
  3. 2. Go with the Flow: Up and Running with TensorFlow
    1. Installing TensorFlow
    2. Hello World
    3. MNIST
    4. Softmax Regression
    5. Summary
  4. 3. Understanding TensorFlow Basics
    1. Computation Graphs
      1. What Is a Computation Graph?
      2. The Benefits of Graph Computations
    2. Graphs, Sessions, and Fetches
      1. Creating a Graph
      2. Creating a Session and Running It
      3. Constructing and Managing Our Graph
      4. Fetches
    3. Flowing Tensors
      1. Nodes Are Operations, Edges Are Tensor Objects
      2. Data Types
      3. Tensor Arrays and Shapes
      4. Names
    4. Variables, Placeholders, and Simple Optimization
      1. Variables
      2. Placeholders
      3. Optimization
    5. Summary
  5. 4. Convolutional Neural Networks
    1. Introduction to CNNs
    2. MNIST: Take II
      1. Convolution
      2. Pooling
      3. Dropout
      4. The Model
    3. CIFAR10
      1. Loading the CIFAR10 Dataset
      2. Simple CIFAR10 Models
    4. Summary
  6. 5. Text I: Working with Text and Sequences, and TensorBoard Visualization
    1. The Importance of Sequence Data
    2. Introduction to Recurrent Neural Networks
      1. Vanilla RNN Implementation
      2. TensorFlow Built-in RNN Functions
    3. RNN for Text Sequences
      1. Text Sequences
      2. Supervised Word Embeddings
      3. LSTM and Using Sequence Length
      4. Training Embeddings and the LSTM Classifier
    4. Summary
  7. 6. Text II: Word Vectors, Advanced RNN, and Embedding Visualization
    1. Introduction to Word Embeddings
    2. Word2vec
      1. Skip-Grams
      2. Embeddings in TensorFlow
      3. The Noise-Contrastive Estimation (NCE) Loss Function
      4. Learning Rate Decay
      5. Training and Visualizing with TensorBoard
      6. Checking Out Our Embeddings
    3. Pretrained Embeddings, Advanced RNN
      1. Pretrained Word Embeddings
      2. Bidirectional RNN and GRU Cells
    4. Summary
  8. 7. TensorFlow Abstractions and Simplifications
    1. Chapter Overview
      1. High-Level Survey
    2. contrib.learn
      1. Linear Regression
      2. DNN Classifier
      3. FeatureColumn
      4. Homemade CNN with contrib.learn
    3. TFLearn
      1. Installation
      2. CNN
      3. RNN
      4. Keras
      5. Pretrained Models with TF-Slim
    4. Summary
  9. 8. Queues, Threads, and Reading Data
    1. The Input Pipeline
    2. TFRecords
      1. Writing with TFRecordWriter
    3. Queues
      1. Enqueuing and Dequeuing
      2. Multithreading
      3. Coordinator and QueueRunner
    4. A Full Multithreaded Input Pipeline
      1. tf.train.string_input_producer() and tf.TFRecordReader()
      2. tf.train.shuffle_batch()
      3. tf.train.start_queue_runners() and Wrapping Up
    5. Summary
  10. 9. Distributed TensorFlow
    1. Distributed Computing
      1. Where Does the Parallelization Take Place?
      2. What Is the Goal of Parallelization?
    2. TensorFlow Elements
      1. tf.app.flags
      2. Clusters and Servers
      3. Replicating a Computational Graph Across Devices
      4. Managed Sessions
      5. Device Placement
    3. Distributed Example
    4. Summary
  11. 10. Exporting and Serving Models with TensorFlow
    1. Saving and Exporting Our Model
      1. Assigning Loaded Weights
      2. The Saver Class
    2. Introduction to TensorFlow Serving
      1. Overview
      2. Installation
      3. Building and Exporting
    3. Summary
  12. A. Tips on Model Construction and Using TensorFlow Serving
    1. Model Structuring and Customization
      1. Model Structuring
      2. Customization
    2. Required and Recommended Components for TensorFlow Serving
      1. What Is a Docker Container and Why Do We Use It?
      2. Some Basic Docker Commands
  13. Index

Product information

  • Title: Learning TensorFlow
  • Author(s): Tom Hope, Yehezkel S. Resheff, Itay Lieder
  • Release date: August 2017
  • Publisher(s): O'Reilly Media, Inc.
  • ISBN: 9781491978511