Hands-On Transfer Learning with Python

by Dipanjan Sarkar, Raghav Bali, Tamoghna Ghosh
August 2018
Intermediate to advanced
438 pages
12h 3m
English
Packt Publishing
Content preview from Hands-On Transfer Learning with Python

Write operation

Each write head receives an erase vector, e_t, and an add vector, a_t, to reset and write to memory, much like the gating in an LSTM cell. The memory at location i is first erased and then added to, as follows: M_t(i) ← M_t(i)[1 − w_t(i) e_t(i)] + w_t(i) a_t(i).
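The erase-then-add update above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's implementation: `M` is the memory matrix (`mem_size x mem_dim`), `w` is the write head's attention weights over memory locations, and `e` and `a` are the erase and add vectors.

```python
import numpy as np

def ntm_write(M, w, e, a):
    # Erase step: scale each memory row i by (1 - w[i] * e)
    M_erased = M * (1.0 - np.outer(w, e))
    # Add step: write w[i] * a into each row i
    return M_erased + np.outer(w, a)

mem_size, mem_dim = 128, 16
M = np.zeros((mem_size, mem_dim))
w = np.zeros(mem_size)
w[0] = 1.0                    # head attends fully to memory slot 0
e = np.ones(mem_dim)          # erase slot contents completely
a = np.full(mem_dim, 0.5)     # then write 0.5 into every dimension
M_new = ntm_write(M, w, e, a)
```

Because `w` is a soft attention distribution, a head can partially erase and write to many locations at once; with a one-hot `w`, as here, the update touches a single slot.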

Here is the pseudocode for the preceding operations:

mem_size = 128   # The size of memory
mem_dim = 16     # The dimensionality for memory
shift_range = 1  # defining shift [-1, 0, 1]

# last output layer from LSTM controller: last_output
# Previous memory state: M_prev

def Linear(input_, output_size, stddev=0.5):
    '''Applies a linear transformation to the input data: input_.
    Implements a dense layer with tf.random_normal_initializer(stddev=stddev)
    as the weight initializer.
    '''

def get_controller_head(M_prev, last_output, is_read=True):
    k = tf.tanh(Linear(last_output, ...
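The key k emitted by the controller head above is used for content-based addressing: the head compares k against every memory row by cosine similarity and sharpens the result into attention weights with a key strength, commonly called beta. The following NumPy sketch illustrates that step; the function name and the beta parameterization are illustrative, not taken from the book's code.

```python
import numpy as np

def content_addressing(M, k, beta):
    # Cosine similarity between the key k and each memory row
    eps = 1e-8  # avoid division by zero for empty rows
    sim = (M @ k) / (np.linalg.norm(M, axis=1) * np.linalg.norm(k) + eps)
    # Sharpen with key strength beta, then normalize via softmax
    scores = np.exp(beta * sim)
    return scores / scores.sum()

M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
k = np.array([1.0, 0.0])
w = content_addressing(M, k, beta=5.0)  # peaks at the row matching k
```

A larger beta makes the resulting weighting more peaked around the best-matching row, while beta near zero spreads attention almost uniformly.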

Publisher Resources

ISBN: 9781788831307