Deep Learning Quick Reference

by Mike Bernico
March 2018
Intermediate to advanced
272 pages
English
Packt Publishing
Content preview from Deep Learning Quick Reference

Input and embedding layer architecture

In the last chapter, we trained an LSTM on a set of lags from a time series. Here, our lags are the words in a sequence, and we will use those words to predict the sentiment of the reviewer. To get from a sequence of words to an input vector that captures the semantic value of those words, we can use an embedding layer.
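To see what an embedding layer does in isolation, here is a minimal sketch (not from the book) that assumes a toy vocabulary of 50 words and 8-dimensional vectors: each integer word index in a sequence is looked up and replaced by a dense vector that the network learns during training.

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

# Toy values chosen purely for illustration: 50 possible words,
# each mapped to a learned 8-dimensional vector, 4 words per sequence.
model = Sequential()
model.add(Embedding(input_dim=50, output_dim=8, input_length=4))

# One "review" encoded as four word indices.
x = np.array([[3, 17, 42, 5]])

# Each index becomes an 8-dimensional vector: shape (1, 4, 8).
print(model.predict(x).shape)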

When we use the Keras functional API, the embedding layer is always the second layer in the network, immediately after the input layer. Let's look at how these two layers fit together:

from keras.layers import Input, Embedding

input = Input(shape=(sequence_length,), name="Input")
embedding = Embedding(input_dim=vocab_size,
                      output_dim=embedding_dim,
                      input_length=sequence_length,
                      name="embedding")(input)
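To sanity-check the wiring, we can wrap these two layers in a temporary Model and print its summary. The values below are placeholders chosen for illustration, not the book's settings; in practice, vocab_size, embedding_dim, and sequence_length come from your data and tuning.

from keras.models import Model
from keras.layers import Input, Embedding

# Placeholder hyperparameters, assumed for this sketch only.
vocab_size = 5000        # distinct words kept in the vocabulary
embedding_dim = 100      # size of each learned word vector
sequence_length = 200    # word indices per (padded) review

input = Input(shape=(sequence_length,), name="Input")
embedding = Embedding(input_dim=vocab_size,
                      output_dim=embedding_dim,
                      input_length=sequence_length,
                      name="embedding")(input)

# The input layer carries shape (None, 200); the embedding layer emits
# (None, 200, 100): one 100-dimensional vector per word, ready for an LSTM.
Model(inputs=input, outputs=embedding).summary()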

Our input layer needs to know ...



Publisher Resources

ISBN: 9781788837996