
Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam


Training

Next, a function is created to train the model. Placeholders are created for the question pairs and their labels. The output of the model built in the preceding function is passed through a softmax cross-entropy loss function, and the model weights are optimized using the Adam optimizer, as follows:

def train(train_x1, train_x2, train_y, val_x1, val_x2, val_y,
          max_sent_len, char_map, epochs=2, batch_size=1024,
          num_classes=2):
    with tf.name_scope('Placeholders'):
        x1_pls = tf.placeholder(tf.int32, shape=[None, max_sent_len])
        x2_pls = tf.placeholder(tf.int32, shape=[None, max_sent_len])
        y_pls = tf.placeholder(tf.int64, [None])
        keep_prob = tf.placeholder(tf.float32)  # Dropout

Next, the model is created, followed by logit computation. ...
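To make the loss concrete: for each question pair, the logits are turned into class probabilities with a softmax, and the loss is the negative log-probability assigned to the true label. The following is a minimal pure-Python sketch of what TensorFlow's sparse softmax cross-entropy computes for a single example; the function name and the example logits are hypothetical, chosen only for illustration:

```python
import math

def sparse_softmax_cross_entropy(logits, label):
    """Cross-entropy of softmax(logits) against an integer class label.

    Illustrative re-implementation for one example; in the training code
    this is computed by TensorFlow's built-in op over the whole batch.
    """
    # Subtract the max logit before exponentiating, for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    # Loss is -log of the probability the softmax assigns to the true label.
    return -math.log(exps[label] / total)

# Two-class logits for one question pair; class 1 ("duplicate") scores higher,
# so the loss for label=1 is small.
loss = sparse_softmax_cross_entropy([0.2, 1.4], label=1)
print(round(loss, 4))  # → 0.2633
```

Averaging this quantity over a batch gives the scalar that the Adam optimizer minimizes when updating the model weights.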
