Encoder network

For the encoder component, we use a bidirectional RNN with GRU cells. An LSTM could be used in place of the GRU cell; the reader may experiment with it to observe the difference in the model's performance:

import tensorflow as tf
from tensorflow.contrib.rnn import GRUCell, DropoutWrapper

def get_cell(csize, dprob):
    # A GRU cell wrapped with dropout applied to its inputs
    rnc = GRUCell(csize)
    rnc = DropoutWrapper(rnc, input_keep_prob=dprob)
    return rnc

def encoding_layer(csize, len_s, nl, rinp, dprob):
    # Build nl bidirectional layers over the embedded input rinp
    for l in range(nl):
        with tf.variable_scope('encoding_l_{}'.format(l)):
            rnn_frnt = get_cell(csize, dprob)
            rnn_bkwd = get_cell(csize, dprob)
            eop, est = tf.nn.bidirectional_dynamic_rnn(rnn_frnt, rnn_bkwd,
                                                       rinp, len_s,
                                                       dtype=tf.float32)
    # Concatenate the forward and backward outputs along the feature axis
    eop = tf.concat(eop, 2)
    return eop, est
Note that we concatenate the encoder's forward and backward outputs, since the RNN is bidirectional.
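If you want to try the LSTM variant mentioned above, only the cell construction needs to change; the rest of the encoding layer can stay as it is. The following is a minimal sketch, assuming the same TensorFlow 1.x contrib API used in the GRU version (the helper name get_lstm_cell is just for illustration):

from tensorflow.contrib.rnn import LSTMCell, DropoutWrapper

def get_lstm_cell(csize, dprob):
    # Same dropout wrapper as get_cell, but with an LSTM cell instead of a GRU
    rnc = LSTMCell(csize)
    rnc = DropoutWrapper(rnc, input_keep_prob=dprob)
    return rnc

Swapping get_cell for get_lstm_cell inside encoding_layer lets you compare the two cell types directly; since the LSTM maintains a separate cell state, expect somewhat more parameters for the same csize.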
