The death of Agile?
In this edition of the Radar column, we examine the big picture around Agile and look at what it means, and what it doesn't.
I read an article on the death of Agile. It’s not the first article I’ve seen claiming that Agile is dead, and I’m sure it won’t be the last.
The problem with the claim that Agile is “dead” is that, as Eben Hewitt said to me in a conversation, “no movement has succeeded until it has become a parody of itself.” This is a brilliant point that applies perfectly to Agile. What is modern Agile? Fetishizing standups. Fetishizing pair programming. Fetishizing unit testing. Scrum fetishizes working in short, intense bursts. (In the 80s, I worked for a company that had a lot of practices that anticipated Scrum. All I can say is that they were extremely unhealthy. Buy me a beer and I may tell you more.) None of these are bad in and of themselves, but they all miss the point, particularly when they become a ritual. (Standup meetings that are over an hour long? Yep, been invited to those. Refused to attend after the first one.) More often than not, the result of fetishization is—nothing at all. A few existing practices are re-named, a few new rituals are created, and work goes on exactly as it did before—but now it’s “agile.” That kind of Agile needs to die. I won’t be among the mourners.
The one thing I don’t see, and the one thing that more than anything else captures the value in Agile, is the ongoing conversation between the customer (however that’s conceived) and the developer. This is important. Agile is not, and never was, about getting developers to write software faster. (Scrum might have been…) Agile is about getting developers in touch with the people who are the actual users and customers, regularly and repeatedly, so that the project doesn’t inevitably wander off course and produce something (in the words of Douglas Adams) “almost, but not quite, entirely unlike tea.”
I’m not an Agile fundamentalist, or even a serious Agile evangelist. But it’s worth spending some time thinking about the Agile Manifesto and what it means, maybe even meditating on it. If you were involved with professional programming in the 80s and 90s, you may remember how radical it was (and, in many shops, still is) to put software developers in touch with users and customers. Neckbeards? Geeks and nerds? They might tell the customer that some feature they want is impossible! They might tell the truth! Then what would sales do?
Well, it turns out that if you want software to work, the developers have to talk to the people who need that software to work. You can’t leave that to salespeople or marketers. You can’t create a “product owner” and say that’s their job—especially since, more often than not, “product owners” don’t have meaningful contact with customers themselves. (True story: at a former company, a salesperson made a major sale based on a feature that not only couldn’t possibly be implemented, it didn’t even make sense. We were very lucky not to be sued.) When a project is going off track because some requirement wasn’t understood properly, you need to fix that as soon as possible—not after a year-long development process. If Agile’s first principle is frequent interaction with the customer, its second (and a close corollary) is frequent mid-course corrections. Those mid-course corrections are always less painful than getting to the end and finding that you’ve built the wrong thing. And that’s a big clue about what the word “Agile” means. It’s not about getting software developers to write code faster. It’s about learning when you need to change direction, and then doing it. It’s about correcting small mistakes before they become big ones, before they’re amplified by a multi-year, multi-million dollar budget.
So I really don’t want to hear that Agile doesn’t work for large projects and so on. I don’t care what you call it, but large projects (a) are rarely successful, regardless of the methodology, because they (b) get overloaded with a lot of features that nobody needs but that sound good, and (c) forget what the customer or user really needs or wants. Large projects are necessary, but they have a momentum that tends to drive them off course. Once a project starts going in the wrong direction, it tends to keep going in the wrong direction. That was true before Agile (didn’t Isaac Newton say that?), and will remain true after Agile. Agile provides the tools to keep those projects on track; whether those tools are used, or only get lip service, isn’t the fault of the methodology.
Twenty or so years after the Agile Manifesto was written, Agile faces a number of challenges. The most important is discovering how to work with data science and artificial intelligence projects. Development timelines for these projects aren’t as predictable as traditional software; they stretch the meaning of “testing” in strange ways; they aren’t deterministic. Progress in developing software tends to be slow, incremental, and fairly well understood. It’s reasonable to have something to demo in two weeks (or whatever interval you choose). With AI, you can easily spend months searching for a model—and showing up to one standup meeting after another saying “ran more experiments, didn’t make progress,” until one day it works. (Perhaps the appropriate yardstick for AI projects is the experiment itself, not the code committed to git.)
Can Agile work for large teams? Large teams present their own problems, but it’s ironic to see writers scorning the “two pizza group” concept because it can’t possibly work for large organizations. Do they know where the concept came from? I don’t know how many lines of code support Amazon’s core business, or how many software developers work on them, but I am sure that those are very large numbers. But again: value interactions over documentation. Make those interactions possible by dividing the project and keeping the groups small. And you can keep the principle of constant contact with the customer; you just have to be careful about understanding who your customer is, and what they mean. (This has to do with the concept of bounded context from Domain Driven Design.)
I don’t think Agile is “perfect”; I don’t even know what “perfect” would mean in this context. I do think that Agile, as described in the Manifesto, points to a number of problems that persist in software development, and offers plausible solutions. Sadly, these solutions are more honored in the breach than in the observance; and Eben was right when he said that no methodology has succeeded until it has become a parody of itself. But if Eben is right, then the solution isn’t looking beyond Agile, but becoming self-aware and critical of our own actions. Why, when things change, do they remain the same? What are the power struggles, the rewards and punishments, that lie behind the old and new methodologies? When processes change, who wins, who loses, and why? In any organization, answering those questions will say a lot about how Agile becomes self-parody.
But really, do you know what it would mean for Agile to succeed? We’d forget about it. We’d just do it. Frequent contact with customers, good in-person communication between team members, along with practices like source control and testing, would just be in the air, like our Wi-Fi networks. We wouldn’t agonize over those practices, or create rituals and ceremonies around them. They’d simply be what we do. — Mike Loukides
Radar data points: Recent research and analysis
In “5 key areas for tech leaders to watch in 2020,” we examined search and usage data from the O’Reilly online learning platform. This data contains notable signals about the trends, topics, and issues tech leaders need to watch and explore.
- Python is preeminent. It’s the single most popular programming language on O’Reilly, and it accounts for 10% of all usage. This year’s growth in Python usage was buoyed by its increasing popularity among data scientists and machine learning (ML) and artificial intelligence (AI) engineers.
- Software architecture, infrastructure, and operations are each changing rapidly. The shift to cloud native design is transforming both software architecture and infrastructure and operations. Also: infrastructure and operations is trending up, while DevOps is trending down. Coincidence? Probably not, but only time will tell.
- ML + AI are up, but passions have cooled. Up until 2017, the ML+AI topic had been among the fastest growing topics on the platform. Growth is still strong for such a large topic, but usage slowed in 2018 (+13%) and cooled significantly in 2019, growing by just 7%. Within the data topic, however, ML+AI has grown from 22% of usage to 26%.
- Still cloud-y, but with a possibility of migration. Strong usage in cloud platforms (+16%) accounted for most cloud-specific growth. But sustained interest in cloud migrations—usage was up almost 10% in 2019, on top of 30% in 2018—gets at another important emerging trend.
- Security is surging. Aggregate security usage spiked 26% last year, driven by increased usage for two security certifications: CompTIA Security+ (+50%) and CompTIA CySA+ (+59%). There are plenty of security risks for business executives, sysadmins, DBAs, developers, etc., to be wary of.
We also recently conducted a survey looking at the state of data quality. As we suspected, the topic was brimming with interest—we quickly received more than 1,900 responses to our survey request.
Key survey results:
- The C-suite is engaged with data quality. CxOs, vice presidents, and directors account for 20% of all survey respondents. Data scientists and analysts, data engineers, and the people who manage them comprise 40% of the audience; developers and their managers, about 22%.
- Data quality might get worse before it gets better. Comparatively few organizations have created dedicated data quality teams. Just 20% of organizations publish data provenance and data lineage. Most of those who don’t say they have no plans to start.
- Adopting AI can help data quality. Almost half (48%) of respondents say they use data analysis, machine learning, or AI tools to address data quality issues. Those respondents are more likely to surface and address latent data quality problems. Can AI be a catalyst for improved data quality?
- Organizations are dealing with multiple, simultaneous data quality issues. They have too many different data sources and too much inconsistent data. They don’t have the resources they need to clean up data quality problems. And that’s just the beginning.
- The building blocks of data governance are often lacking within organizations. These include the basics, such as metadata creation and management, data provenance, data lineage, and other essentials.
Be sure to check out our archive of Radar research and analysis.