Neural network architecture

Now that we have built the word vectors by concatenating the character-level and word-level embeddings, we will run a bidirectional LSTM over the sequence of word vectors. Concatenating the forward and backward hidden states of the bidirectional LSTM at each time step gives a semantic representation of each word in its context. This is shown in the following figure:

Implementing this in TensorFlow is straightforward, and quite similar to how we implemented the earlier character-level embedding LSTM. However, unlike in that case, we are interested in the hidden states at every time step, not just the final one:

bi_dir_cell_fw = tf.contrib.rnn.LSTMCell(hidden_state_size) ...
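
The following is a minimal sketch of the full bidirectional LSTM step, assuming placeholder names (word_embeddings, sequence_lengths, and an example hidden_state_size) that stand in for the tensors built earlier; the book's exact variable names and dimensions may differ:

import tensorflow as tf

hidden_state_size = 100  # size of each direction's hidden state (assumed value)

# word_embeddings: [batch_size, max_sentence_length, embedding_dim],
# holding the concatenated character- and word-level vectors from earlier.
word_embeddings = tf.placeholder(tf.float32, shape=[None, None, 300])
# sequence_lengths: the actual (unpadded) length of each sentence.
sequence_lengths = tf.placeholder(tf.int32, shape=[None])

bi_dir_cell_fw = tf.contrib.rnn.LSTMCell(hidden_state_size)
bi_dir_cell_bw = tf.contrib.rnn.LSTMCell(hidden_state_size)

# Unlike the character-level case, we keep the outputs (hidden states)
# of every time step rather than only the final state.
(output_fw, output_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    bi_dir_cell_fw, bi_dir_cell_bw, word_embeddings,
    sequence_length=sequence_lengths, dtype=tf.float32)

# Concatenate the forward and backward hidden states at each time step:
# shape [batch_size, max_sentence_length, 2 * hidden_state_size]
context_representation = tf.concat([output_fw, output_bw], axis=-1)

Each time step of context_representation now encodes a word together with its left and right context, and can be fed to the downstream layers of the model for per-word predictions.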
