Errata for Building Applications with AI Agents


The errata list records errors, and their corrections, found after the product was released.

The following errata were submitted by our customers and have not yet been approved or disproved by the author or editor. They solely represent the opinion of the customer.

Color Key: Serious technical mistake | Minor technical mistake | Language or formatting error | Typo | Question | Note | Update

Version | Location | Description | Submitted by | Date submitted
O'Reilly learning platform | Chapter 2, Designing Agent Systems: Our First Agent System (in code)

In the example code, in the `call_model(state)` function, the lines

# 1st LLM pass: decides whether to call our tool
AIMessage = ChatOpenAI(model="gpt-5", temperature=0)(full)
out = [first]

The variable `first` is never defined.

Doug | Sep 21, 2025
O'Reilly learning platform | Chapter 7, Learning in Agentic Systems

Chapter 7 states twice that fine-tuning negatively affects LLM inference speed. In reality, fine-tuning does not decrease inference speed; in some cases it may even increase it. I'm referring to these passages:

1. "If GPU availability is limited, annotation is expensive, or inference speed is a priority, consider nonparametric strategies like retrieval-augmented generation."
In reality, RAG will definitely decrease overall inference speed, because it introduces a database lookup and data retrieval.

2. "That said, fine-tuning large models demands serious resources ... and real-time deployments may suffer from higher inference latency as a result."

Alex | Oct 07, 2025
O'Reilly learning platform | Chapter 2, Our First Agent System

The code snippet on the first page of Chapter 2 reads:
AIMessage = ChatOpenAI(model="gpt-5", temperature=0)(full)
out = [first]
The code should instead be:

first: AIMessage = ChatOpenAI(model="gpt-5", temperature=0)(full)
out = [first]


If this is not done, the variable `first` will be undefined.

Amit Wats | Nov 03, 2025
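The reported fix hinges on Python's annotated-assignment syntax. A minimal, self-contained sketch of the difference, using hypothetical stand-ins for `ChatOpenAI` and `AIMessage` (the real classes require langchain and an API key):

```python
# Hypothetical stand-ins for langchain's AIMessage and a ChatOpenAI call,
# used only to illustrate the broken vs. corrected name binding.
class AIMessage:
    def __init__(self, content: str):
        self.content = content

def fake_chat_model(messages: list) -> AIMessage:
    # Stand-in for ChatOpenAI(model="gpt-5", temperature=0)(full)
    return AIMessage("model reply")

full = ["What is the weather today?"]

# Broken form: rebinds the *name* AIMessage to the model output, so the
# next line would raise NameError: name 'first' is not defined.
#   AIMessage = fake_chat_model(full)
#   out = [first]

# Corrected form: the annotation `first: AIMessage` makes this an
# ordinary assignment to `first`, so `out = [first]` works.
first: AIMessage = fake_chat_model(full)
out = [first]
```

Dropping the annotation entirely (`first = fake_chat_model(full)`) would work just as well; the annotation only documents the expected type.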
PDF | Page 72, 2nd paragraph

from langchain_core.messages import
HumanMessage messages = [HumanMessage("What is the weather today?")]

should be written as:

from langchain_core.messages import HumanMessage
messages = [HumanMessage("What is the weather today?")]

Masoud Azizi | Dec 03, 2025
PDF | Page 75, 1st paragraph

final_response = llm_with_tools.invoke(messages)
print(final_response.content)

should not be indented under the `for` loop; it should be dedented so it runs once, after the loop completes.

Masoud Azizi | Dec 03, 2025
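The intended control flow can be sketched with hypothetical stand-ins for `llm_with_tools` and the tool results (the real objects come from langchain); only the indentation is the point here:

```python
# Stand-ins for the book's objects, to show where the final call belongs.
class FakeResponse:
    def __init__(self, content: str):
        self.content = content

class FakeLLM:
    def invoke(self, messages: list) -> FakeResponse:
        return FakeResponse(f"final answer using {len(messages)} messages")

llm_with_tools = FakeLLM()
messages = ["What is the weather today?"]
tool_results = ["get_weather -> sunny, 22C"]

for result in tool_results:
    # Inside the loop: only append each tool's output to the history.
    messages.append(result)

# Dedented, outside the for loop: one final model call over the
# complete history, then print the answer.
final_response = llm_with_tools.invoke(messages)
print(final_response.content)
```

If the last two lines stayed inside the loop, the model would be re-invoked once per tool result instead of once at the end.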
PDF | Page 82, end of the page

class AgentState(TypedDict):
messages: Sequence[Any] # A list of BaseMessage/HumanMessage/...

It would be more Pythonic to write it as:

class AgentState(TypedDict):
messages: list[BaseMessage] # A list of AIMessage/HumanMessage/...

since HumanMessage is actually a subclass of BaseMessage.

Masoud Azizi | Dec 03, 2025
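The suggested annotation can be sketched in isolation; `BaseMessage`, `HumanMessage`, and `AIMessage` below are hypothetical stand-ins for the `langchain_core` classes, which share this same subclass relationship:

```python
from typing import TypedDict

# Stand-ins for langchain_core's message hierarchy: concrete message
# types all subclass BaseMessage, so list[BaseMessage] covers them.
class BaseMessage:
    def __init__(self, content: str):
        self.content = content

class HumanMessage(BaseMessage):
    pass

class AIMessage(BaseMessage):
    pass

class AgentState(TypedDict):
    messages: list[BaseMessage]  # AIMessage/HumanMessage/... all fit

state: AgentState = {
    "messages": [HumanMessage("What is the weather today?"), AIMessage("Sunny.")]
}
```

Unlike `Sequence[Any]`, the narrower `list[BaseMessage]` lets a type checker flag non-message entries while still accepting every message subclass.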
PDF | Page 117, 2nd paragraph

The `graph` variable is never instantiated; it should be:

builder = StateGraph(MessagesState)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
graph = builder.compile()

Masoud Azizi | Dec 04, 2025