Now that we have training and validation data prepared, let's create a skip-gram model in TensorFlow.
We start by defining the hyper-parameters:
```
batch_size = 128
embedding_size = 128
skip_window = 2
n_negative_samples = 64
ptb.skip_window = 2
learning_rate = 1.0
```
- The batch_size is the number of (target, context) word pairs fed into the algorithm in a single batch
- The embedding_size is the dimension of the word vector, or embedding, for each word
- The ptb.skip_window is the number of words to consider in the context of the target word, in both directions
- The n_negative_samples is the number of negative samples to be generated by the NCE loss function, explained later in this chapter
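To make the role of skip_window concrete, here is a minimal sketch of how (target, context) pairs can be generated from a word sequence. The function name and sample sentence are illustrative, not part of the chapter's data pipeline:

```python
def skipgram_pairs(words, skip_window=2):
    """Yield (target, context) pairs for every word within
    skip_window positions of the target, in both directions."""
    pairs = []
    for i, target in enumerate(words):
        lo = max(0, i - skip_window)
        hi = min(len(words), i + skip_window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the target word itself
                pairs.append((target, words[j]))
    return pairs

sentence = ["the", "quick", "brown", "fox", "jumps"]
pairs = skipgram_pairs(sentence, skip_window=2)
# An interior word like "brown" contributes 2 * skip_window = 4 pairs
```

Batches of batch_size such pairs are then fed to the model, with the context word serving as the prediction target for the embedding of the target word.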