The Future of Technology and Proprietary Software

by Tim O'Reilly
December 2003

In celebration of its 25th anniversary, InfoWorld did a feature on where technology has been and where it's headed: 25 Years of Technology. I answered some questions for that piece about the future of technology and proprietary software. Many of my comments were included at the end of the article, but I'd like to reproduce them here in their entirety as well.

Question: If you were to take a stab at labeling the technology eras of the future, out to the year 2028, what would you call them? If, for example, we consider the 1980s the PC era, and the mid-90s marked the start of the Internet era, how would you define the eras to come in 5 years, 10 years, 15 years, 25 years?

Answer: My favorite phrase for the next five years is Dave Stutz's "software above the level of a single device." Consider Apple's iTunes for a moment as a paradigmatic example:

  • Runs on a Mac or PC desktop/laptop
  • Syncs with a handheld device (iPod)
  • Does ad-hoc local area network sharing via Rendezvous
  • Has a web database/e-commerce backend in the music store

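To make the local sharing piece concrete: Rendezvous (Apple's name for Zeroconf/mDNS service discovery, later renamed Bonjour) is what lets one copy of iTunes find another on the same network with no configuration. Here's a minimal sketch of that announce-and-withdraw pattern using the third-party python-zeroconf library; the share name, address, and TXT properties are invented for illustration, though _daap._tcp and port 3689 are the conventional iTunes sharing values.

    # Advertise a music-sharing service on the local network, the way
    # iTunes announces a shared library. Sketch only: the name, address,
    # and properties below are made up.
    import socket
    from zeroconf import ServiceInfo, Zeroconf

    zc = Zeroconf()
    info = ServiceInfo(
        "_daap._tcp.local.",                  # iTunes' sharing service type
        "My Shared Music._daap._tcp.local.",  # hypothetical share name
        addresses=[socket.inet_aton("192.168.1.10")],
        port=3689,                            # DAAP's conventional port
        properties={"txtvers": "1"},
    )
    zc.register_service(info)  # peers on the LAN can now discover us

    # ... serve the library until shutdown, then withdraw the announcement
    zc.unregister_service(info)
    zc.close()
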
Over the next five years, all apps will follow this pattern. I tend to call this the "true Internet era." We could also call it Internet 3.0 (with the old telnet/ftp era being Internet 1.0, and the web being 2.0).

On a related note, in my talk The Internet Paradigm Shift (PowerPoint), I ask my audiences how many of them use Linux. Depending on the audience, 20-80 percent raise their hands (usually towards the lower end of that range). Then I ask how many use Google. Close to 100 percent raise their hands. The point: We've been conditioned by the desktop era to think of the software we "use" as that which is running on the machine in front of us, when all of the "killer apps" of the Internet era (from Google to Amazon to eBay to MapQuest) run on at least two machines--our access device and a backend server farm. And if you look at how the most advanced users are working, you might see someone accessing Google from a laptop connected to a cell phone via Bluetooth, through the cell network to a remote data server, and from there out across the Internet. We're building a huge, decentralized computing fabric.

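The division of labor is easy to see in code. Here's a minimal sketch of the access-device half of such an application, using only the Python standard library (example.com stands in for whatever backend actually does the work):

    # The "client" half of an Internet-era app: format a query and hand it
    # to the network. Everything that makes the answer good -- the crawl,
    # the index, the ranking -- runs on machines the user never sees.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    query = urlencode({"q": "sebastopol ca bookstores"})
    with urlopen(f"https://example.com/search?{query}") as response:
        results = response.read()

    print(results[:200])  # the access device just renders what came back
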
When a computing paradigm changes, it takes at least a decade for the world to catch up. Consider how, years after the advent of the PC in the early 80s, Ken Olsen of DEC could still deride it as a toy. It wasn't till the 90s that it really became clear that the PC was the center of gravity of the computing universe. In short, I think we've got a long way to go before we realize the full potential of the Internet era. We're going to see network effects--and network-effect businesses--having an impact on fields from politics to human interaction.

That being said, I also very much like IBM's phrase, "pervasive computing," which emphasizes not just the Internet but the omnipresence of computing. Because, of course, "software above the level of a single device" means much more than what we used to call a computer. We are starting to see the real blurring of handhelds, cell phones, cameras, and other consumer devices. Everything is becoming connected, and computing truly is becoming pervasive.

Wireless is a big part of this. (That is, you could also call it the mobile era, or the unwired era.) As people get seamlessly connected, wherever they are, devices become less important, even throwaway, and the continuity of the user's data becomes most important. An interesting corollary is that a huge part of the value premium of Internet-era powerhouses like Amazon and eBay is not in their software, but in the critical mass of participating users that they have.

It's hard to see even ten years out--the pace of change is increasing. However, it's clear that the life sciences are going to have a huge impact in the years ahead. While the human genome project hasn't lived up to the short-term hype, we're getting very close to many breakthroughs, and any one of them could so redefine society that we'd immediately consider it a "new era."

I expect to see a lot more happening with robotics, as well as with human augmentation. The Segway is the thin end of a wedge. We'll start with disabled people wearing powered exoskeletons and other devices, and then they'll be adopted by otherwise healthy people. I also expect to see a progression through wearable computers to "embedded" computers--chips and devices embedded in the human body for everything from health maintenance to communications, and, in a dystopian vein, even location tracking.

Speaking of location, location-based services are going to be huge in the next decade, to the extent that any device that doesn't know where you are and filter your data accordingly will be considered broken.

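As a sketch of what "filter your data accordingly" means in practice: given the device's position, drop any result outside a small radius. The haversine formula below is the standard great-circle distance; the points of interest are invented.

    # Location-aware filtering: keep only results near the user's position.
    # haversine_km computes great-circle distance between (lat, lon) pairs.
    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius, ~6371 km

    def nearby(results, here, radius_km=2.0):
        """Filter (name, lat, lon) tuples to those within radius_km."""
        lat, lon = here
        return [r for r in results
                if haversine_km(lat, lon, r[1], r[2]) <= radius_km]

    # Hypothetical points of interest; only the cafe is within 2 km.
    pois = [("cafe", 37.7749, -122.4194), ("bookstore", 37.8044, -122.2712)]
    print(nearby(pois, here=(37.7793, -122.4192)))
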
Distributed, low-power sensor networks are going to revolutionize many devices. In some ways, you can think of the next ten years being about the interpenetration of the real and the virtual worlds. People and things are going to get wired (or rather unwired), not just "computers" in the traditional sense.

We'll also see computerized assistants that, while not AI in the traditional sense, will seem magical to today's users. (Heck, Google would be magical to someone from ten years ago. You can quickly find an answer to almost any question from a huge distributed knowledge repository, using search techniques that come up with close to the right answer from very small clues and a minimal interface.)

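A toy sketch of why small clues suffice: an inverted index maps every word to the documents that contain it, so even a two-word query narrows a huge repository to a short candidate list. (The bare term-overlap scoring below is a placeholder, nowhere near what Google actually does.)

    # A toy inverted index: a few query words intersect a huge corpus down
    # to a ranked handful of candidates. Corpus and scoring are both toys.
    from collections import defaultdict

    docs = {
        1: "the segway is a self balancing personal transporter",
        2: "rendezvous provides zero configuration local networking",
        3: "the human genome project mapped the sequence of human dna",
    }

    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.split():
            index[word].add(doc_id)

    def search(query):
        """Rank documents by how many query terms each contains."""
        scores = defaultdict(int)
        for word in query.lower().split():
            for doc_id in index.get(word, ()):
                scores[doc_id] += 1
        return sorted(scores, key=scores.get, reverse=True)

    print(search("human genome"))  # -> [3]
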
We're starting to see an interesting resurgence of interest in space, both from India and China, and also from U.S.-based entrepreneurs.

In a darker vein, it's also the era that will see the end of privacy, or rather, the illusion of privacy.

Question: What is the future for proprietary software?

Answer: About the same as the future for proprietary hardware. That is, as any industry matures, many elements that were previously high-value become standardized, and then commoditized. But that doesn't mean that there is no longer any possibility of proprietary value. A commodity PC has a proprietary "Intel Inside" processor; an open Internet has "Cisco inside." But it's increasingly difficult to charge a premium even for innovations in hardware these days--Nokia says that its new phone designs are knocked off in Asia within a few weeks of introduction. So there's real pressure to find value in other ways.

Another source of value is in design. In his essay "The Birth of the Big, Beautiful Art Market" (collected in the book Air Guitar: Essays on Art and Democracy), Dave Hickey describes how, after WW II, Harley Earl of GM turned the marketing of automobiles "from being about what they do to what they mean." His point was that as industries become commoditized, intangibles play a greater role in product differentiation. This is now happening in the computer market. Apple has been a pioneer in marketing computers for what they mean rather than what they do. Everything from the 1984 ad to "Think Different" speaks to the self-image of the user who chooses an Apple product. But the rest of the market is catching up.

In short, the kind of premium that proprietary software has enjoyed in the Microsoft era is likely to be significantly reduced, till it is equivalent to the proprietary value that, say, Mercedes has relative to Toyota or BMW or GM. Anything good will be copied quickly. There will be some engineering advantages, but new market momentum often comes from design innovations rather than technical superiority.
