July 2018
The question module does the same embedding lookup and temporal summation as the input module. The embedding matrix, and hence the word vocabulary, is shared between the two modules:
def _question_module(self, questions):
    with tf.variable_scope("QuestionModule"):
        questions_emb = tf.nn.embedding_lookup(
            self.word_emb_matrix, questions)
        return tf.reduce_sum(questions_emb, 1)
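To make the lookup-and-sum concrete, here is a minimal NumPy sketch of the same operation; the vocabulary size, embedding dimension, and word ids are illustrative, not taken from the book's dataset:

```python
import numpy as np

# Illustrative setup: a 5-word vocabulary embedded in 3 dimensions.
rng = np.random.default_rng(0)
word_emb_matrix = rng.standard_normal((5, 3))

# A batch of two questions, each padded to 4 word ids (0 = padding).
questions = np.array([[1, 2, 0, 0],
                      [3, 4, 2, 0]])

# Embedding lookup: index rows of the matrix -> shape (2, 4, 3).
# This mirrors tf.nn.embedding_lookup(word_emb_matrix, questions).
questions_emb = word_emb_matrix[questions]

# Temporal summation over the word axis -> one vector per
# question, shape (2, 3), mirroring tf.reduce_sum(..., 1).
question_repr = questions_emb.sum(axis=1)
```

Each question thus collapses to a single fixed-size vector, which is what the downstream memory-addressing steps consume.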
Since we are building the most conceptually simple memory network possible, we do not use more sophisticated sentence representation models, such as recurrent neural networks (RNNs) or convolutional neural networks (CNNs). The modular nature of the architecture makes it easy to experiment with such alternatives later.