One disadvantage of word2vec is that it uses only the local context of words and doesn't consider their global co-occurrences. The model thus discards a readily available, valuable source of information. As its name suggests, the Global Vectors for Word Representation (GloVe) model addresses this shortcoming (https://nlp.stanford.edu/pubs/glove.pdf).
The algorithm starts with the global word-word co-occurrence matrix, X. A cell, Xij, indicates how often the word j appears in the context of the word i. The following table shows the co-occurrence matrix for a window of size n = 2 over the sequence I like DL. I like NLP. I enjoy cycling: