Appendix III — Generic Text Completion with GPT-2

This appendix is a detailed explanation of the *Generic text completion with GPT-2* section in Chapter 7, *The Rise of Suprahuman Transformers with GPT-3 Engines*. It describes how to implement a GPT-2 transformer model for generic text completion.

You can read how this notebook is used directly in Chapter 7, or build the program and run it in this appendix to gain a deeper understanding of how a GPT model works.

We will clone OpenAI's GPT-2 repository, download the 345M-parameter GPT-2 transformer model, and interact with it. We will enter context sentences and analyze the text the transformer generates. The goal is to see how it creates new content.
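Before running the real model, it may help to see the mechanism it relies on. The sketch below is a toy illustration, not GPT-2 itself: the "model" is a hypothetical hand-written bigram table, and the function names are invented for this example. It only shows the autoregressive loop a GPT model uses to create new content, in which the next token is sampled from a distribution conditioned on the context, appended, and the process repeats.

```python
import random

# Hypothetical bigram "model": maps a token to weighted candidate next tokens.
# A real GPT-2 conditions on the whole context with a transformer instead.
BIGRAMS = {
    "the": [("model", 0.6), ("text", 0.4)],
    "model": [("generates", 1.0)],
    "generates": [("text", 1.0)],
    "text": [("<eos>", 1.0)],
}

def complete(context, max_tokens=10, seed=0):
    """Autoregressive completion loop, mirroring how GPT-2 extends a prompt."""
    rng = random.Random(seed)
    tokens = context.split()
    for _ in range(max_tokens):
        candidates = BIGRAMS.get(tokens[-1])
        if not candidates:          # unknown context: stop generating
            break
        words, weights = zip(*candidates)
        nxt = rng.choices(words, weights=weights)[0]  # sample next token
        if nxt == "<eos>":          # end-of-sequence token: stop
            break
        tokens.append(nxt)
    return " ".join(tokens)
```

With this toy table, `complete("the model")` deterministically extends the prompt to `"the model generates text"`; GPT-2 does the same kind of step-by-step extension, but with a learned distribution over a 50,257-token vocabulary.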

This section is divided into ...
