skip-gram model with TensorFlow

Now that we have training and validation data prepared, let's create a skip-gram model in TensorFlow.

We start by defining the hyper-parameters:

batch_size = 128
embedding_size = 128
skip_window = 2
n_negative_samples = 64
ptb.skip_window = 2
learning_rate = 1.0
  • The batch_size is the number of pairs of target and context words to be fed into the algorithm in a single batch
  • The embedding_size is the dimension of the word vector, or embedding, for each word
  • The ptb.skip_window is the number of words to be considered in the context of the target word, in both directions
  • The n_negative_samples is the number of negative samples to be generated by the NCE loss function, explained later in this chapter
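To make the roles of skip_window and batch_size concrete, here is a minimal, plain-Python sketch of how (target, context) pairs are generated from a sequence of words; the function name and the toy sentence are illustrative assumptions, not code from the book. A training batch would then draw batch_size of these pairs at a time.

```python
def skip_gram_pairs(words, skip_window=2):
    """Return (target, context) pairs for every word within
    skip_window positions of each target word, in both directions.
    Illustrative helper, not the book's generator."""
    pairs = []
    for i, target in enumerate(words):
        lo = max(0, i - skip_window)
        hi = min(len(words), i + skip_window + 1)
        for j in range(lo, hi):
            if j != i:  # a word is not its own context
                pairs.append((target, words[j]))
    return pairs

# Toy example: with skip_window = 2, "brown" pairs with the
# two words on each side of it.
sentence = "the quick brown fox jumps".split()
pairs = skip_gram_pairs(sentence, skip_window=2)
```

For the five-word sentence above this yields 14 pairs, e.g. ("brown", "quick") and ("brown", "fox"), while ("the", "fox") is excluded because "fox" lies three positions away, outside the window.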
