This chapter demonstrates the key concepts of how TensorFlow is built and how it works with simple and intuitive examples. You will get acquainted with the basics of TensorFlow as a numerical computation library using dataflow graphs. More specifically, you will learn how to manage and create a graph, and be introduced to TensorFlow’s “building blocks,” such as constants, placeholders, and Variables.

TensorFlow allows us to implement machine learning algorithms by creating and computing operations that interact with one another. These interactions form what we call a “computation graph,” with which we can intuitively represent complicated functional architectures.

We assume that many readers have already encountered the mathematical concept of a graph. For those to whom this concept is new, a graph refers to a set of interconnected *entities*, commonly called *nodes* or *vertices*. These nodes are connected to one another via edges. In a dataflow graph, the edges allow data to “flow” from one node to another in a directed manner.
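To make the idea concrete, here is a minimal sketch (plain Python, not TensorFlow code; the node names are purely illustrative) of a directed graph represented as a mapping from each node to the nodes its outgoing edges point to:

```python
# A tiny directed graph: each node maps to the nodes its outgoing
# edges point to. Node names here are illustrative placeholders.
graph = {
    "a": ["c"],
    "b": ["c"],
    "c": ["d"],
    "d": [],
}

# In a dataflow graph, each edge carries data in one direction:
# here, "a" and "b" feed "c", and "c" feeds "d".
for node, successors in graph.items():
    for succ in successors:
        print("{} -> {}".format(node, succ))
```

Running this prints the three directed edges, `a -> c`, `b -> c`, and `c -> d`, showing the direction in which data would flow.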

In TensorFlow, each of the graph’s nodes represents an operation, possibly applied to some input, and can generate an output that is passed on to other nodes. By analogy, we can think of the graph computation as an assembly line where each machine (node) either gets or creates its raw material (input), processes it, and then passes the output to other machines in an orderly fashion.
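The assembly-line idea can be sketched in a few lines of plain Python. This is a conceptual illustration only, not TensorFlow’s actual implementation: each node records an operation and the nodes feeding it, and evaluating a node first evaluates its inputs, so data flows along the edges toward the requested output.

```python
# Conceptual dataflow-graph sketch (not TensorFlow itself).
# Each node holds an operation and its input nodes; evaluating a node
# first evaluates its inputs, then applies the operation, so data
# "flows" along the edges toward the requested output.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # callable producing this node's output
        self.inputs = inputs  # the nodes whose outputs feed this one

    def evaluate(self):
        args = [node.evaluate() for node in self.inputs]
        return self.op(*args)

# Build a small graph computing (5 + 3) * 2.
a = Node(lambda: 5)                     # constant-like source node
b = Node(lambda: 3)                     # constant-like source node
c = Node(lambda x, y: x + y, (a, b))    # addition node, fed by a and b
d = Node(lambda: 2)                     # constant-like source node
e = Node(lambda x, y: x * y, (c, d))    # multiplication node, fed by c and d

print(e.evaluate())  # 16
```

Asking for `e` triggers the evaluation of everything it depends on, and nothing else, which is the same deferred, dependency-driven style of computation that TensorFlow’s graph model uses.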
