Chapter 31. The Promise of AI and Generative Pre-Trained Transformer Models in Medicine

—William B. Weeks

Because this chapter is based on several editorials that discuss how artificial intelligence (AI) generative pre-trained transformer (GPT) large language models can be used in medicine, it has a different structure than earlier chapters. We first describe what these models do and then discuss their potential application in radiology, in facilitating self-care management and decision-making, and in improving the public's health.

What Are GPT Models and What Do They Do?

The release of OpenAI's ChatGPT in November 2022 brought advanced AI technology to the broader public. Generative pre-trained transformer (GPT) models excel at generating natural, coherent, and grammatically accurate text, significantly advancing the field of artificial intelligence. These models have the potential to transform multiple industries, including medicine.

These models are trained on massive amounts of data. Essentially, they use that data to predict the statistically most likely next word, considering coherence, sentence structure, and the prompt used. For instance, if you asked a GPT model to write an essay on domesticated animals, it might start with a sentence beginning, “Domesticated animals include…” and then, using algorithms trained on either a specified body of knowledge or the Internet, insert the word that most commonly follows in that construction, in this case, ...
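The next-word selection described above can be sketched in miniature with a simple bigram frequency model. This is an illustrative simplification only: real GPT models use transformer neural networks trained on vastly larger corpora, and the toy corpus below is invented for the example.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the massive training data a GPT model uses
corpus = (
    "domesticated animals include dogs and cats . "
    "domesticated animals include horses . "
    "wild animals include wolves ."
).split()

# Count how often each word follows each preceding word (bigram counts)
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most often follows `word` in the corpus."""
    return next_counts[word].most_common(1)[0][0]

# Starting from the fragment "animals", pick the statistically most
# common continuation -- here "include", as in the essay example above
print(most_likely_next("animals"))
```

A real model scores candidates over its entire vocabulary at every step and conditions on the full preceding context rather than a single word, but the core idea, choosing a continuation by learned word statistics, is the same.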

Get AI for Good now with the O’Reilly learning platform.
