Training our dependency parsers

Again, if you have read Chapter 4, Gensim - Vectorizing Text and Transformations and n-grams, Chapter 5, POS-Tagging and Its Applications, and Chapter 6, NER-Tagging and Its Applications, you will already be comfortable with the theory behind training our own models in spaCy. We recommend going back to the Vector transformations in Gensim section of Chapter 4 and the Training our own POS-taggers section of Chapter 5 to refresh your understanding of what exactly training means in the context of machine learning, and in spaCy in particular.

Again, the advantage of spaCy is that we don't need to worry about the algorithm being used under the hood, or which features are best to select for dependency parsing ...
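To make this concrete, the following is a minimal sketch of what such a training loop might look like, assuming the spaCy 2.x API used throughout this book; the TRAIN_DATA sentences and their head/label annotations are toy examples invented for illustration, and in practice you would train on a much larger annotated corpus:

import random
import spacy

# Toy, hand-invented training examples: for each token we supply the
# index of its syntactic head and its dependency label.
TRAIN_DATA = [
    ("I like green eggs", {
        "heads": [1, 1, 3, 1],
        "deps": ["nsubj", "ROOT", "amod", "dobj"],
    }),
    ("Eat blue ham", {
        "heads": [0, 2, 0],
        "deps": ["ROOT", "amod", "dobj"],
    }),
]

nlp = spacy.blank("en")              # start from a blank English pipeline
parser = nlp.create_pipe("parser")   # spaCy supplies the parsing model itself
nlp.add_pipe(parser, first=True)

# Register every dependency label that appears in the training data
for _, annotations in TRAIN_DATA:
    for dep in annotations["deps"]:
        parser.add_label(dep)

optimizer = nlp.begin_training()
for itn in range(10):                # a handful of passes over the toy data
    random.shuffle(TRAIN_DATA)
    losses = {}
    for text, annotations in TRAIN_DATA:
        nlp.update([text], [annotations], sgd=optimizer, losses=losses)
    print(losses)

Notice that nowhere do we specify a parsing algorithm or hand-pick features: we only provide annotated examples and their labels, and spaCy's built-in statistical model takes care of the rest.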
