Sequence to sequence

Next, we will look at the high-level sequence-to-sequence function that ties all of this together. It looks up the word embeddings for the input text, builds the encoding and decoding layers, and produces the output logits. The op_tr and op_inf objects hold the predictions during training and inference, respectively:

def seq2seq_model(data_inp, data_summ_tgt, dprob, len_txt, len_summ,
                  max_len_summ, v_size, rnsize, nlyrs, word2int, batch_size):
    # Initialize the embedding variable from the pre-built word embedding
    # matrix and look up embeddings for the input tokens
    inp_emb = word_emb_matrix
    word_embs = tf.Variable(inp_emb, name="word_embs")
    inp_enc_emb = tf.nn.embedding_lookup(word_embs, data_inp)
    # Encode the input sequence
    op_enc, st_enc = encoding_layer(rnsize, len_txt, nlyrs, inp_enc_emb, dprob)
    # Prepare the summary targets as decoder input
    inp_dec = process_encoding_input(data_summ_tgt, word2int, ...
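Since the listing above is truncated, the following is a minimal, self-contained sketch (not the book's code) of the same training/inference split using the TensorFlow 1.x tf.contrib.seq2seq API. All sizes, placeholder names, and the special token IDs here are illustrative assumptions; the key point is that a TrainingHelper produces the training predictions (op_tr) while a GreedyEmbeddingHelper produces the inference predictions (op_inf), sharing the same decoder cell and output projection:

import tensorflow as tf

v_size, emb_dim, rnsize, batch_size = 1000, 64, 128, 4   # assumed sizes
go_id, eos_id = 1, 2                                     # assumed token ids

data_inp = tf.placeholder(tf.int32, [batch_size, None])  # source token ids
data_tgt = tf.placeholder(tf.int32, [batch_size, None])  # decoder input ids
len_txt = tf.placeholder(tf.int32, [batch_size])         # source lengths
len_summ = tf.placeholder(tf.int32, [batch_size])        # summary lengths

word_embs = tf.Variable(tf.random_uniform([v_size, emb_dim], -1.0, 1.0))
inp_enc_emb = tf.nn.embedding_lookup(word_embs, data_inp)
inp_dec_emb = tf.nn.embedding_lookup(word_embs, data_tgt)

# Encoder: the final LSTM state seeds the decoder
enc_cell = tf.nn.rnn_cell.LSTMCell(rnsize)
op_enc, st_enc = tf.nn.dynamic_rnn(enc_cell, inp_enc_emb,
                                   sequence_length=len_txt, dtype=tf.float32)

dec_cell = tf.nn.rnn_cell.LSTMCell(rnsize)
proj = tf.layers.Dense(v_size)  # projects decoder output to vocabulary logits

# Training branch: teacher forcing with the ground-truth summary tokens
train_helper = tf.contrib.seq2seq.TrainingHelper(inp_dec_emb, len_summ)
train_dec = tf.contrib.seq2seq.BasicDecoder(dec_cell, train_helper, st_enc,
                                            output_layer=proj)
op_tr, _, _ = tf.contrib.seq2seq.dynamic_decode(train_dec)

# Inference branch: feed back greedy predictions until <EOS>
start_tokens = tf.fill([batch_size], go_id)
infer_helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(word_embs,
                                                        start_tokens, eos_id)
infer_dec = tf.contrib.seq2seq.BasicDecoder(dec_cell, infer_helper, st_enc,
                                            output_layer=proj)
op_inf, _, _ = tf.contrib.seq2seq.dynamic_decode(infer_dec,
                                                 maximum_iterations=20)

logits_tr = op_tr.rnn_output   # training-time logits
ids_inf = op_inf.sample_id     # predicted summary token ids

Because the decoder cell and output projection are shared between the two branches, the inference decoder reuses the weights learned during training; only the helper, which decides what the decoder sees at each step, differs.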
