Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam

Decoder network

Like the encoder, the decoder network uses an RNN with GRU cells. We again stack nlyrs GRU layers, each wrapped with dropout using the tf.contrib.rnn.DropoutWrapper class, and we apply the BahdanauAttention mechanism to attend over the encoder's output:

def decoding_layer(dec_emb_op, embs, enc_op, enc_st, v_size, txt_len,
                   summ_len, mx_summ_len, rnsize, word2int, dprob,
                   batch_size, nlyrs):
    # Stack nlyrs GRU layers, each with input dropout
    for l in range(nlyrs):
        with tf.variable_scope('dec_rnn_layer_{}'.format(l)):
            gru = tf.contrib.rnn.GRUCell(rnsize)
            cell_dec = tf.contrib.rnn.DropoutWrapper(gru,
                                                     input_keep_prob=dprob)
    # Projection layer mapping RNN outputs to the vocabulary
    out_l = Dense(v_size,
                  kernel_initializer=tf.truncated_normal_initializer(
                      mean=0.0, stddev=0.1))
    ...
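To see what the BahdanauAttention mechanism computes at each decoding step, here is a minimal NumPy sketch of additive (Bahdanau) attention, independent of the TensorFlow API. The function name and the weight matrices W1, W2, and v are illustrative placeholders, not part of the chapter's code:

```python
import numpy as np

def bahdanau_attention(dec_state, enc_outputs, W1, W2, v):
    """Additive attention: score(s, h_j) = v^T tanh(W1 s + W2 h_j).

    dec_state:   decoder hidden state, shape (d,)
    enc_outputs: encoder outputs, one row per source step, shape (T, d)
    """
    # Unnormalized alignment scores for each encoder time step, shape (T,)
    scores = np.tanh(enc_outputs @ W2.T + dec_state @ W1.T) @ v
    # Softmax over time steps gives the attention weights
    exp = np.exp(scores - scores.max())
    alphas = exp / exp.sum()
    # Context vector: attention-weighted sum of encoder outputs, shape (d,)
    context = alphas @ enc_outputs
    return context, alphas
```

The decoder then conditions its next prediction on this context vector, so each output word can focus on the most relevant part of the input text.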
