13
Cross-Lingual and Multilingual Language Modeling
Up to this point, you have learned a lot about transformer-based architectures, from encoder-only models to decoder-only models, and from efficient transformers to long-context transformers. You have also learned about semantic text representation based on a Siamese network. However, we discussed all of these models in terms of monolingual problems, assuming that they understand only a single language and are not capable of a general understanding of text regardless of the language itself. In fact, some of these models have multilingual variants: multilingual bidirectional encoder representations from transformers (mBERT), multilingual text-to-text transfer transformer (mT5), and multilingual ...
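To make the idea of a multilingual variant concrete, the following is a minimal sketch (assuming the Hugging Face transformers library and PyTorch are installed, and using the public bert-base-multilingual-cased checkpoint for mBERT): a single checkpoint with one shared vocabulary and one set of weights encodes sentences written in different languages.

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Load the public mBERT checkpoint (shared across 100+ languages)
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

# The same sentence in English and German (example sentences, chosen for illustration)
sentences = {
    "en": "The weather is nice today.",
    "de": "Das Wetter ist heute schön.",
}

with torch.no_grad():
    for lang, text in sentences.items():
        inputs = tokenizer(text, return_tensors="pt")
        # Mean-pool the final hidden states into one sentence vector
        hidden = model(**inputs).last_hidden_state
        embedding = hidden.mean(dim=1)
        print(lang, embedding.shape)  # torch.Size([1, 768]) for both languages
```

The point of the sketch is that no language identifier is passed in; the model relies on its shared subword vocabulary and jointly trained weights to represent both inputs in the same embedding space.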