Errata for Learning TensorFlow


The errata list is a list of errors and their corrections that were found after the product was released.

The following errata were submitted by our customers and have not yet been approved or disproved by the author or editor. They solely represent the opinion of the customer.


Version Location Description Submitted by Date submitted
Printed Page 5
3rd paragraph

Hi,
In Chapter 2 the book presents its first TensorFlow example.

The data is loaded with:
from tensorflow.examples.tutorials.mnist import input_data

However, this module no longer appears to be available.

Do you have updated example code using
tf.keras.datasets.mnist.load_data()?
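
For readers hitting the same problem, here is a minimal sketch of the replacement, assuming TensorFlow 2.x is installed (load_data downloads MNIST on first use):

```python
import tensorflow as tf

# tensorflow.examples.tutorials was removed; the Keras datasets
# API is the current way to fetch MNIST.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Flatten and scale to [0, 1] to match the shape the book's
# input_data helper produced (784-dimensional float vectors).
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0
```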

Kell Michael Jensen  Jan 10, 2022 
Printed Page 16
2nd line of code

from tensorflow.examples.tutorials.mnist import input_data


Error below:

ModuleNotFoundError: No module named 'tensorflow.examples.tutorials'


Has the code been moved? Do I need to add quotes?

I am running it on Anaconda3 Jupyter.

Thanks for your help.

Jeffrey V Winston  Dec 08, 2019 
PDF, ePub Page 46
line 14

Earlier, `wb` is created as an empty list, but where the results of the current run are appended, the line reads:

> wb_.append(sess.run([w,b]))

And it should read:

> wb.append(sess.run([w,b]))

A similar error appears on page 48.

Andy Cyca  May 01, 2019 
Printed Page 74
2nd text paragraph

A misplaced "(" appears after hidden_layer_size.

Aart Bik  Nov 09, 2019 
Printed Page 90
1st code block

Hi,

Loving the book - I think I might have found a small error. It may be due to the version of TensorFlow I am using.

Let me know if you agree with my proposal.

python: 3.6.3
tensorflow==1.6.0

Upon running the code as written, this error is raised:

ValueError: Variable lstm/rnn/basic_lstm_cell/kernel already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope? Originally defined at:

This is caused by the following: if a variable scope is entered twice (or more), the first use should be wrapped in with tf.variable_scope('scope_name', reuse=False): and subsequent uses in with tf.variable_scope('scope_name', reuse=True):.

Alternatively, you can call scope.reuse_variables() on the scope object.

In this case, tf.variable_scope("lstm") is entered twice without being explicitly reused.

To address this, I did the following:
```
with tf.variable_scope("lstm") as scope:
    lstm_cell = tf.contrib.rnn.BasicLSTMCell(hidden_layer_size,
                                             forget_bias=1.0)
    scope.reuse_variables()
    outputs, states = tf.nn.dynamic_rnn(lstm_cell, embed,
                                        sequence_length=_seqlens,
                                        dtype=tf.float32)
```

Tom Miller  Apr 13, 2018 
Printed Page 209
7th paragraph from the bottom

When I changed from
opt = tf.train.GradientDescentOptimizer(0.5)
to
opt = tf.train.AdamOptimizer(0.5)
a critical error appeared, saying:
“Attempting to use uninitialized value beta1_power_1”

Is there any way to properly initialize the Adam optimizer in the class-encapsulation style shown on pages 208-209?
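
One pattern that resolves this error: AdamOptimizer creates extra "slot" variables (such as beta1_power) only when minimize() is called, so the initializer must be built after the train op. A sketch using tf.compat.v1 so it runs under TensorFlow 2; the one-variable loss is illustrative, not the book's model:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

w = tf.Variable(0.0)
loss = tf.square(w - 1.0)      # illustrative loss, not the book's model

opt = tf.train.AdamOptimizer(0.5)
train_op = opt.minimize(loss)  # Adam's slot variables are created here

# Build the initializer AFTER minimize(), so it also covers
# beta1_power and the other Adam slot variables.
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    sess.run(train_op)
    w_val = sess.run(w)
```

In the class-encapsulation style, the same rule applies: make sure the initializer op is constructed after the optimizer's train op, not before it in the constructor.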

Tomohisa Kumagai  Dec 18, 2019