Building a skip-gram model

For our first step, we will import the Python modules necessary for our example:

from tensorflow.contrib.tensorboard.plugins import projector
import os
import numpy as np
import tensorflow as tf

The projector module from TensorFlow provides the methods we need to add our word vectors to TensorBoard for visualization. Next, we will create a dictionary holding all of the model parameters that we will use to train our Word2vec model:

# Parameters related to training the model
model_params = {
    "vocab_size": 50000,     # Maximum number of words
    "batch_size": 64,        # Batch size for every training step
    "embedding_size": 200,   # Dimensions of the word embedding vectors
    "num_negatives": 64,     # Number of negative ...
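
The parameter dictionary is truncated in this excerpt, but the keys shown above are enough to sketch how such parameters typically feed into a skip-gram graph with noise-contrastive estimation (NCE) loss in TensorFlow 1.x. The following is a minimal sketch rather than the book's exact code; the placeholder names (train_inputs, train_labels) and the learning rate of 1.0 are illustrative assumptions:

import math

# Centre words and their context words for one training step
train_inputs = tf.placeholder(tf.int32, shape=[model_params["batch_size"]])
train_labels = tf.placeholder(tf.int32, shape=[model_params["batch_size"], 1])

# Embedding matrix: one row of size embedding_size per vocabulary word
embeddings = tf.Variable(
    tf.random_uniform(
        [model_params["vocab_size"], model_params["embedding_size"]],
        -1.0, 1.0))
embed = tf.nn.embedding_lookup(embeddings, train_inputs)

# Weights and biases for the NCE loss
nce_weights = tf.Variable(
    tf.truncated_normal(
        [model_params["vocab_size"], model_params["embedding_size"]],
        stddev=1.0 / math.sqrt(model_params["embedding_size"])))
nce_biases = tf.Variable(tf.zeros([model_params["vocab_size"]]))

# Each step draws num_negatives noise words per (target, context) pair
loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights,
                   biases=nce_biases,
                   labels=train_labels,
                   inputs=embed,
                   num_sampled=model_params["num_negatives"],
                   num_classes=model_params["vocab_size"]))
optimizer = tf.train.GradientDescentOptimizer(1.0).minimize(loss)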

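Once an embeddings variable exists, the projector module imported earlier can be pointed at it so that TensorBoard's Projector tab can display the word vectors. This is a hedged sketch under the assumption of a log directory named logs/word2vec and an optional metadata.tsv file mapping embedding rows to words; neither name comes from the book:

# Relies on the os, tf, and projector imports at the top of this section
log_dir = "logs/word2vec"
writer = tf.summary.FileWriter(log_dir)

config = projector.ProjectorConfig()
embedding_conf = config.embeddings.add()
embedding_conf.tensor_name = embeddings.name
embedding_conf.metadata_path = os.path.join(log_dir, "metadata.tsv")

# Writes projector_config.pbtxt so TensorBoard can locate the embedding tensor
projector.visualize_embeddings(writer, config)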