
Natural Language Processing and Computational Linguistics by Bhargav Srinivasa-Desikan


Training our dependency parsers

Again, if you have read Chapter 4, Gensim - Vectorizing Text and Transformations and n-grams, Chapter 5, POS-Tagging and Its Applications, and Chapter 6, NER-Tagging and Its Applications, you will be comfortable with the theory behind training our own models in spaCy. We recommend going back and rereading the Vector transformations in Gensim section from Chapter 4 and the Training our own POS-taggers section from Chapter 5 to refresh your understanding of what training means in the context of machine learning, and of spaCy in particular.

Again, the advantage of spaCy is that we don't need to worry about the algorithm being used under the hood, or about which features are best to select for dependency parsing ...
