December 2019
In this section, we'll implement a basic text-generation example with the help of the transformers 2.1.1 library (https://huggingface.co/transformers/), released by Hugging Face. This is a well-maintained and popular open source package that implements different transformer language models, including BERT, transformer-XL, XLNet, OpenAI GPT, GPT-2, and others. We'll use a pretrained transformer-XL model to generate new text based on an initial input sequence. The goal is to give you a brief taste of the library:
import torch
from transformers import TransfoXLLMHeadModel, TransfoXLTokenizer
The TransfoXLLMHeadModel and TransfoXLTokenizer classes are the implementations ...
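To make the example concrete, here is a minimal sketch of greedy generation with these two classes. It assumes the standard pretrained checkpoint name 'transfo-xl-wt103' and that the model's forward pass returns the prediction scores followed by the updated memory; the generate_greedy helper is our own illustrative function, not part of the library:

```python
import torch
from transformers import TransfoXLLMHeadModel, TransfoXLTokenizer

def generate_greedy(model, tokenizer, prompt, steps=20):
    """Append `steps` greedily chosen tokens to `prompt`."""
    model.eval()
    tokens = tokenizer.encode(prompt)
    input_ids = torch.tensor([tokens])
    mems = None  # transformer-XL memory of past hidden states
    with torch.no_grad():
        for _ in range(steps):
            # outputs[0]: prediction scores over the vocabulary
            # outputs[1]: updated memory, reused on the next step
            outputs = model(input_ids, mems=mems)
            scores, mems = outputs[0], outputs[1]
            # pick the highest-scoring token at the last position
            next_id = torch.argmax(scores[0, -1]).item()
            tokens.append(next_id)
            # feed only the new token; mems carry the earlier context
            input_ids = torch.tensor([[next_id]])
    return tokenizer.decode(tokens)

if __name__ == "__main__":
    # Downloads the pretrained weights on first use
    tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
    model = TransfoXLLMHeadModel.from_pretrained("transfo-xl-wt103")
    print(generate_greedy(model, tokenizer, "The quick brown fox", steps=10))
```

Because transformer-XL caches earlier hidden states in mems, each step only needs the newly generated token as input rather than the whole sequence so far.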