3.1 Introducing embeddings
    3.1.1 What are embeddings?
    3.1.2 Why are embeddings important?
3.2 Building blocks of language: Characters, words, and phrases
    3.2.1 Characters
    3.2.2 Words, tokens, morphemes, and phrases
    3.2.3 N-grams
3.3 Tokenization, stemming, and lemmatization
    3.3.1 Tokenization
    3.3.2 Stemming
    3.3.3 Lemmatization
3.4 Skip-gram and continuous bag of words (CBOW)
    3.4.1 Where word embeddings come from
    3.4.2 Using word associations
    3.4.3 Linear layers
    3.4.4 Softmax
    3.4.5 Implementing Skip-gram on AllenNLP
    3.4.6 Continuous bag of words (CBOW) model
3.5 GloVe
    3.5.1 How GloVe learns word embeddings
    3.5.2 Using pretrained GloVe vectors
3.6 fastText
    3.6.1 Making use of subword information
    3.6.2 Using the fastText toolkit
3.7 Document-level embeddings
3.8 Visualizing embeddings
Summary