Chapter 9. A Recipe for Sustainable AI
In the 1990s, many systems stored dates using two digits for the year—“97” for 1997, “99” for 1999, and so on—to save memory and simplify string handling. This worked fine until the year 2000 approached. Suddenly, “00” could mean 1900 or 2000, and countless systems risked malfunctioning or crashing because they could not disambiguate the century. This led to the global frenzy known as the Y2K bug.1
In the scramble to patch systems before the millennium, developers used various stopgap fixes. One such hack involved adding a “Y2K compliance” flag or setting arbitrary windowing rules for interpreting dates: for example, assuming any two-digit year less than 20 meant 2000–2019, and any year of 20 or higher meant 1920–1999. The code might look like this:
    def interpret_year(year):
        # Window the two-digit year: 00-19 map to 2000-2019,
        # 20-99 map to 1920-1999.
        if year < 20:
            return 2000 + year
        else:
            return 1900 + year
And next to it, developers would often write comments like:
# Temporary Y2K fix. Remove by 2010.
Of course, 2010 came and went, but the code remained. No one was sure what systems depended on this logic. Removing it could have subtle effects—date misinterpretations, billing errors, archival failures—so it stayed. Teams eventually became too afraid to touch it. In some legacy banking and insurance systems, versions of this logic persisted for decades.
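The fear was not abstract. Here is a minimal sketch of the failure mode (it reuses the windowing logic above; the example dates are illustrative, not drawn from any real system), showing how the fix starts silently misreading records the moment dates cross the window's edge:

    def interpret_year(year):
        if year < 20:
            return 2000 + year
        else:
            return 1900 + year

    # Inside the window, the fix behaves as intended:
    print(interpret_year(5))   # 2005
    print(interpret_year(97))  # 1997

    # But a record written in 2022 is pushed back a century:
    print(interpret_year(22))  # 1922, not 2022

Nothing in the code fails loudly; the wrong century simply propagates downstream, which is exactly why teams hesitated to touch the logic.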
This is a textbook case of technical debt in software engineering,2 the implicit cost of choosing an easy or limited solution now instead of a better approach that would take more effort. While shortcuts ...