
Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam


Question module

The question module performs the same embedding lookup and temporal summation as the input module. The embedding matrix, and hence the word vocabulary, is shared between the two modules:

    def _question_module(self, questions):
        with tf.variable_scope("QuestionModule"):
            # Look up the embedding vector for each word in the question
            questions_emb = tf.nn.embedding_lookup(
                self.word_emb_matrix, questions)
            # Sum the word embeddings over the time dimension to obtain a
            # fixed-size question representation
            return tf.reduce_sum(questions_emb, 1)
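The shared embedding matrix, self.word_emb_matrix, is created elsewhere in the model. A minimal sketch of how it could be defined in the model's constructor, assuming vocab_size and emb_dim hyperparameters (the exact names and initializer in the book's code may differ), looks like this:

    # Assumed sketch: shared word embedding matrix used by both the
    # input module and the question module.
    with tf.variable_scope("Embeddings"):
        self.word_emb_matrix = tf.get_variable(
            "word_emb_matrix",
            shape=[self.vocab_size, self.emb_dim],
            initializer=tf.random_normal_initializer(stddev=0.1))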

Since we are building the conceptually simplest memory network possible, we do not use more sophisticated sentence representation models, such as recurrent neural networks (RNNs) or convolutional neural networks (CNNs). The modular nature of the architecture makes it easy to experiment with such alternatives, as sketched below.
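For example, a recurrent encoder could replace the summation with the final state of a GRU. The following is only an illustrative sketch, not part of the book's implementation; the function name, the use of self.emb_dim as the cell size, and the assumption that questions is a padded integer tensor are ours:

    def _question_module_rnn(self, questions):
        # Hypothetical variant: encode the question with a GRU instead of
        # summing the word embeddings.
        with tf.variable_scope("QuestionModuleRNN"):
            questions_emb = tf.nn.embedding_lookup(
                self.word_emb_matrix, questions)
            cell = tf.nn.rnn_cell.GRUCell(self.emb_dim)
            # final_state has shape [batch_size, emb_dim], so it is a
            # drop-in replacement for the summation-based representation
            _, final_state = tf.nn.dynamic_rnn(
                cell, questions_emb, dtype=tf.float32)
            return final_state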
