Building the graph

The graph is built using the high-level seq2seq_model function. Let's look at the code for building the graph and optimizer:

# Imports needed by this snippet (TensorFlow 1.x)
import tensorflow as tf
from tensorflow.contrib.seq2seq import sequence_loss

train_graph = tf.Graph()
with train_graph.as_default():
    # Placeholders for the input text, targets, learning rate, dropout keep
    # probability, and the summary/text sequence lengths
    data_inp, tgts, lrt, dprobs, len_summ, max_len_summ, len_txt = model_inputs()

    # Build the training and inference decoder outputs; the input sequences
    # are reversed before being fed to the encoder
    tr_op, inf_op = seq2seq_model(tf.reverse(data_inp, [-1]),
                                  tgts,
                                  dprobs,
                                  len_txt,
                                  len_summ,
                                  max_len_summ,
                                  len(word2int) + 1,
                                  rnn_len,
                                  n_layers,
                                  word2int,
                                  batch_size)

    # Name the output tensors so they can be retrieved by name later
    tr_op = tf.identity(tr_op.rnn_output, 'tr_op')
    inf_op = tf.identity(inf_op.sample_id, name='predictions')

    # Mask the padded positions so they do not contribute to the loss
    seq_masks = tf.sequence_mask(len_summ, max_len_summ, dtype=tf.float32,
                                 name='masks')

    with tf.name_scope("optimizer"):
        tr_cost = sequence_loss(tr_op, tgts, seq_masks)
        optzr = tf.train.AdamOptimizer(lrt)
        ...
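The snippet stops just after the Adam optimizer is created. A common way to finish the optimizer scope in TF 1.x sequence-to-sequence models is to compute the gradients of the sequence loss, clip them to a fixed range so that training stays stable, and then apply the clipped gradients. The following sketch reuses optzr and tr_cost from the code above; the clipping range of [-5, 5] and the name train_op are illustrative assumptions, not taken from the book:

        # Continuing inside the "optimizer" name scope above (illustrative):
        # compute_gradients returns a list of (gradient, variable) pairs
        gradients = optzr.compute_gradients(tr_cost)
        # Keep only variables that received a gradient and clip each gradient
        # to [-5, 5] to guard against exploding gradients
        capped_gradients = [(tf.clip_by_value(grad, -5., 5.), var)
                            for grad, var in gradients if grad is not None]
        # train_op is the operation to run at each training step
        train_op = optzr.apply_gradients(capped_gradients)

Once train_op is defined, a tf.Session created on train_graph can run train_op (together with tr_cost, to monitor the loss) in a feed-dict loop over the padded batches.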
