Context & Memory: Making AI More Real
The Problem: No Memory
GPT is a generative text model: it produces new text by predicting what comes next, given the user's input. The model was trained on a large corpus of text, including books, articles, and websites, and it used this data to learn patterns and relationships between words and phrases.
By default, the model has no memory when you initiate a conversation with it: each input is treated independently, with no context carried over from previous prompts. While this may not be ideal for human-friendly interactions, it allows the model to generate more diverse and less repetitive text.
In some cases, carrying over context is useful and necessary. ...
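Because the model itself is stateless, carrying context over is the client's job: the application keeps the transcript and resends it with every request. A minimal sketch of this idea, where `fake_model` is a hypothetical stand-in for a real completion call:

```python
# Minimal sketch of manually carrying context. The model is stateless,
# so the client resends the whole conversation history on each request.
# `fake_model` is a hypothetical stand-in for a real completion API call.

def fake_model(prompt: str) -> str:
    # Hypothetical stub: reports how many user turns it could "see".
    return f"(reply based on {prompt.count('User:')} user turn(s))"

class Conversation:
    """Accumulates turns and resends the full transcript each time."""

    def __init__(self):
        self.history = []  # list of (speaker, text) tuples

    def ask(self, user_text: str) -> str:
        self.history.append(("User", user_text))
        # Rebuild the whole transcript. Without this step, each request
        # would be treated independently, with no memory of earlier turns.
        prompt = "\n".join(f"{s}: {t}" for s, t in self.history)
        reply = fake_model(prompt)
        self.history.append(("AI", reply))
        return reply

conv = Conversation()
conv.ask("My name is Ada.")
print(conv.ask("What is my name?"))  # the second call sees both user turns
```

The trade-off is that the prompt grows with every turn, so a real application must eventually truncate or summarize older turns to stay within the model's context window.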