As we know, the neural network used to generate word vectors has two layers. We will now look at each layer and its input and output in detail. We are not covering the math behind the word2vec model in this section; later in this chapter, we will work through the math, and I will point out where it connects back to this discussion so you can put the pieces together.
Let's understand the task of each layer in brief:
- Input layer: The input layer has as many neurons as there are words in the training vocabulary
- Hidden layer: The hidden layer size in terms of neurons is the dimensionality of the resulting word vectors
- Output layer: The output layer has the same number of neurons as the input layer, one for each word in the vocabulary
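The layer sizes above can be sketched with plain NumPy. This is a minimal illustration, not a full word2vec implementation: the vocabulary size (10,000) and vector dimensionality (300) are hypothetical values chosen only to make the shapes concrete.

```python
import numpy as np

# Hypothetical sizes for illustration: a 10,000-word vocabulary
# and 300-dimensional word vectors.
vocab_size, embedding_dim = 10_000, 300

rng = np.random.default_rng(0)

# Input -> hidden weights: each row is the word vector learned for one word.
W_in = rng.standard_normal((vocab_size, embedding_dim)) * 0.01
# Hidden -> output weights: one column of scores per vocabulary word.
W_out = rng.standard_normal((embedding_dim, vocab_size)) * 0.01

# A one-hot input vector for a word simply selects one row of W_in,
# so the hidden layer is just that word's vector.
word_index = 42
hidden = W_in[word_index]        # shape: (embedding_dim,)

# Output layer: a score for every vocabulary word, turned into
# probabilities with a softmax.
scores = hidden @ W_out          # shape: (vocab_size,)
probs = np.exp(scores - scores.max())
probs /= probs.sum()

print(hidden.shape, probs.shape)
```

Note how the hidden layer has no activation function: because the input is one-hot, the "multiplication" by `W_in` is just a row lookup, which is why the hidden layer size directly equals the dimensionality of the resulting word vectors.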