A 5G future
In this edition of the Radar column, we explore the limitations and possibilities of high-speed 5G connectivity.
For the past year, 5G cell technology has generated a lot of excitement–and a lot of hype. The specifications are impressive: 5G will provide a peak data rate of up to 20 Gbps (with 100 Mbps of “user experienced data rate”) to mobile devices: cell phones, smart cars, and a lot of devices that haven’t been invented yet. It’s difficult to imagine mobile applications that will require that much data, and 5G’s proponents seem willing to promise just about anything. What will 5G mean in practice? If it’s going to make any real difference, we’ll need to think that through.
The most obvious change 5G might bring about isn’t to cell phones but to local networks, whether at home or in the office. Back in the 1980s, Nicholas Negroponte predicted that everything wired would become wireless, and everything wireless would become wired. What happens to “last mile” connectivity, which seems to be stuck somewhere around 50 Mbps for homes and several times that for business service? It would be great to have an alternative to the local cable monopoly for high-bandwidth connectivity. We were supposed to have fiber to the home by now. I don’t, do you? High-speed 5G networks may represent the next generation of cord cutting. Can 5G replace wired broadband, allowing one wireless service for home and mobile connectivity? I don’t need more bandwidth for video conferences or movies, but I would like to be able to download operating system updates and other large items in seconds rather than minutes. Anyone who has ever built a Docker container has experienced “now we wait for some giant things to download and be decompressed.” Those waits can be significant, even if you’re on a corporate network. They could disappear.
Rural connectivity is a persistent problem; many rural users (and some urban users) are still limited to dial-up speeds. Although the industry claims that 5G will provide better connectivity for rural areas, I’m skeptical. Because 5G uses higher frequencies than 4G, and higher frequencies are more subject to path loss, 5G cells have to be smaller than 4G/LTE cells. If carriers won’t build cell towers for current technology, they aren’t likely to build even more towers for 5G. I suspect rural communities will be left in the dark–again.
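The frequency penalty behind those smaller cells is easy to quantify with the free-space path loss formula, FSPL = 20·log10(d) + 20·log10(f) + 20·log10(4π/c). A minimal sketch follows; the carrier frequencies are illustrative examples of LTE mid-band and 5G mmWave spectrum, and real-world losses (buildings, rain, foliage) only make the gap worse:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for a given distance and carrier frequency."""
    c = 299_792_458  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Compare a mid-band LTE carrier with a millimeter-wave 5G carrier at 1 km.
lte = fspl_db(1000, 2.6e9)     # ~2.6 GHz, a typical LTE band
mmwave = fspl_db(1000, 28e9)   # ~28 GHz, a proposed 5G mmWave band
print(f"Extra loss at 28 GHz: {mmwave - lte:.1f} dB")  # ~20.6 dB, i.e., >100x weaker
```

A 20 dB penalty means the received signal is more than a hundred times weaker at the same distance, which is why high-frequency 5G cells have to be packed so much more densely than LTE towers.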
As far as mobile and embedded devices go, I don’t see why I need a gigabit on my phone, except perhaps to serve as a Wi-Fi hub when traveling. Phones are a painful way to watch movies–more about that later. 5G enthusiasts frequently say it’s an enabling technology for autonomous vehicles (AVs), which will need high bandwidth to download maps and images, and perhaps even to communicate with each other: AV heaven is a world in which all vehicles are autonomous and can therefore collaboratively plan traffic. That may well require 5G–though again, I wonder who is going to make the investment in building out rural networks. Autonomous vehicles that only work in urban or suburban areas are less useful. For applications like communication between AVs, latency–how long it takes to get a response–is likely to be a bigger limitation than raw bandwidth, and it is subject to limits imposed by physics. There are impressive latency estimates for 5G, but reality has a tendency to be harsh on such predictions. Reliability will be an even bigger problem than latency. Remember your last trip to New York or San Francisco? Cell service in major cities is often poor because signals are reflected by buildings and attenuated (weakened) as they pass through them. Those problems get worse as you go higher in frequency, as 5G does. Whether you’re interested in AVs or some other application, making mobile connections more reliable is more important than making them faster. 5G intends to do so by trading off congestion against signal quality. That’s a plausible tradeoff, but it remains to be seen whether it works.
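The physics floor on latency is simple to estimate. The distances below are illustrative, and real round trips add queuing, retransmission, and processing time on top of pure propagation:

```python
C = 299_792_458  # speed of light in vacuum, m/s (radio in air is very close)

def round_trip_ms(one_way_m: float) -> float:
    """Minimum round-trip time over a one-way path, in milliseconds."""
    return 2 * one_way_m / C * 1000

# Two cars relaying through a tower 1 km away: roughly a 2 km one-way path.
print(round_trip_ms(2_000))      # ~0.013 ms: propagation isn't the local bottleneck
# A detour through a data center 1,000 km away:
print(round_trip_ms(1_000_000))  # ~6.7 ms before any queuing or processing
```

Propagation only becomes significant when traffic detours through distant servers; for nearby links, the hard latency problems are congestion, retries, and handoffs, which is exactly where reliability comes in.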
Pete Warden, who is working on machine learning for very low power devices, says 5G is only marginally useful for the applications he cares about. When you’re trying to build a device that will run for months on a coin battery, you realize that the radio takes much more power than the CPU. You have to keep the radio off as much as possible, transmitting data in short bursts. So what about industrial IoT (IIoT), and sensors that can be built into a sticker and slapped onto machinery? That might be a 5G application–but as Warden has said, the real win here is eliminating batteries and power cords, which in turn requires careful use of low-power networking. 5G isn’t ideal for that, and the first indications are that it will require more power than current technologies.
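Warden’s point about the radio dominating the power budget shows up in a back-of-envelope model. Every figure below is an illustrative assumption, not a measured spec for any particular radio:

```python
def battery_life_days(capacity_mah: float, sleep_ua: float,
                      radio_ma: float, radio_s_per_hour: float) -> float:
    """Rough battery life for a duty-cycled radio (all figures illustrative)."""
    # Average current: the sleep-mode floor plus the radio's share of each hour.
    avg_ma = sleep_ua / 1000 + radio_ma * radio_s_per_hour / 3600
    return capacity_mah / avg_ma / 24

CR2032 = 225  # mAh, a common coin cell

# Assuming a 5 uA sleep current and a 15 mA radio, on 1 s vs. 60 s per hour:
print(f"{battery_life_days(CR2032, 5, 15, 1):.0f} days")   # ~1023 days
print(f"{battery_life_days(CR2032, 5, 15, 60):.0f} days")  # ~37 days
```

In this sketch, a radio that talks for a minute per hour instead of a second cuts battery life by nearly 30x, which is why a power-hungrier 5G radio is a hard sell for coin-cell sensors.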
Regardless of power consumption, I’m not convinced we’ll have lots of IoT devices shipping data back to their respective motherships. We’ve seen the reaction to news that Amazon’s Echo and Google Home send recordings of conversations back to the server. And we’re already seeing devices like smart thermostats and light bulbs being used for harassment. As privacy regulation takes hold and techniques like federated learning become more widespread, the need–and desire–for shipping our data far and wide will inevitably decrease.
So where is 5G useful? Let’s get back to home networking. I’d gladly give up my 50 Mbps wired connection for gigabit wireless. Again, that’s the ultimate cord cutting, and it creates significant new possibilities. I might not want to watch 4K video on my phone (given current screen technology, to say nothing of our eyes’ angular resolution, high-resolution video on a phone is meaningless), but I might want to send video from my phone to my television using Chromecast.
I’m satisfied with my current Wi-Fi deployment, but I wonder whether I’d even need Wi-Fi in a 5G world. Perhaps, for security and privacy reasons, it makes sense to separate a local network from the rest of the world. But that’s also a problem that 5G vendors could solve; virtual LANs (VLANs) are hardly a new concept. Gigabit connectivity to laptops, with the cell network providing a VLAN, could also replace office networks. In either case, some hard guarantees about privacy and security would be needed. Given service providers’ records on user tracking, that may be too much to ask.
If we can get some enforceable guarantees about privacy and security on ISP-provided VLANs, I can imagine bigger changes. I’ve long thought it makes little sense to maintain disk drives (whether rust-based or solid-state) that periodically fail and need to be backed up. I do regular backups, but I know I’m the exception. What would the world look like if all of our storage was in the cloud, and access to that storage was so fast we didn’t care? What if all of your documents were in Google Docs and all of your music in your favorite streaming service? That vision isn’t entirely new; Sun Microsystems had the idea back in the 1990s, and it’s essentially the vision behind Google’s Chromebooks.
How would our usage patterns change with 5G? I have 30 or 40 GB of photos. I could upload them all to Google Photos or some other service, but at 50 Mbps down and 10 Mbps up, that’s not something you want to think about. At a gigabit, you don’t have to think twice. I’ve always been unimpressed by streaming services for music and video, at least partly because they’re least available when you most want them: when you’re flying, on a train, or at a technical conference with 3,000 attendees maxing out the hotel’s network. (Someone once told me “so download everything you’re likely to want to listen to before leaving.” Really.) But with gigabit microcells, this suddenly makes sense. Maybe not on flights, which are out of range of cell towers and where WoeFi will remain the order of the day, and maybe not when you’re driving through rural areas, but if I can get a gigabit network to my phone, why should I care about Amtrak’s slow Wi-Fi or network congestion in my hotel? If an office can get that kind of bandwidth to my laptop, with adequate guarantees for cloud security, why should we worry about office LANs?
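The arithmetic behind that difference, using the sizes and rates mentioned above (and assuming ideal throughput, which no real link delivers):

```python
def transfer_seconds(size_gb: float, rate_mbps: float) -> float:
    """Seconds to move size_gb gigabytes at rate_mbps megabits/s over an ideal link."""
    return size_gb * 8000 / rate_mbps  # 1 GB = 8,000 megabits (decimal units)

photos_gb = 40
print(f"At 10 Mbps up: {transfer_seconds(photos_gb, 10) / 3600:.1f} hours")    # 8.9 hours
print(f"At 1 Gbps:     {transfer_seconds(photos_gb, 1000) / 60:.1f} minutes")  # 5.3 minutes
```

Nine hours is an overnight project you keep putting off; five minutes is something you do without thinking.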
Whether that excites you or not, that strikes me as a significantly new pattern: we won’t care where our data is. We won’t need to worry about backups. We won’t need to worry (as much) about outages. We can bring our networks with us. We won’t even need to worry as much about security; Google, Amazon, and Microsoft all do better backups than I ever will, are much better at surviving network disruption, and know an awful lot more about how to protect my data. If Google can push their users to two-factor authentication (2FA) or the use of a security dongle, that’s a huge step toward safe computing. Those cloud providers will, of course, have to guarantee this data remains private–as private as it is when it lives on a personal disk drive or an office fileserver. That’s a problem that’s eminently solvable.
The implications for business are even more important. Home users think in gigabytes; businesses are increasingly involved with tera- or petabytes. It’s a lot easier to move large datasets when you have ubiquitous gigabit networks. Whether that’s training data for AI applications or just lots of transaction records, businesses move data, and lots of it. With our current technology, the best way to move a huge amount of data is, all too often, to put disk drives on a truck. 5G brings us a lot closer to solving that problem–if we can get hard guarantees about security and privacy. Businesses are even less likely than users to appreciate some third party using their data for their own purposes.
I’m sure that 5G will also lead to a new generation of smart devices that can use the bandwidth–devices we haven’t imagined yet. But I’m more interested in something I can imagine: decoupling myself from my data, and having access to it anytime, anywhere, without carrying it around or stashing it on some machine in the closet. That’s the real promise of 5G. — Mike Loukides
Radar data points: Recent research and analysis
We recently conducted a survey on serverless architecture adoption. We were pleasantly surprised at the high level of response: more than 1,500 respondents from a wide range of locations, companies, and industries participated. The high response rate tells us that serverless is garnering significant mindshare in the community.
Key findings from the serverless survey include:
- 40% of respondents work at organizations that have adopted serverless architecture in some form or another. Reduced operational costs and automatic scaling are the top serverless benefits cited by this group.
- Of the 60% of respondents whose companies haven’t adopted serverless, the leading concerns about the paradigm are security and fear of the unknown.
- About 50% of respondents who adopted serverless three-plus years ago consider their implementations successful or extremely successful, compared with 35% of those who adopted serverless a year or less ago; the gap suggests that serverless experience pays off.
- Respondents who have implemented serverless made custom tooling the top tool choice—implying that vendors’ tools may not fully address what organizations need to deploy and manage a serverless infrastructure.
Read “O’Reilly serverless survey 2019: Concerns, what works, and what to expect” for full results. Also be sure to check out our archive of Radar research and analysis.
O’Reilly conferences combine expert insights from industry leaders with hands-on guidance about today’s most important technology topics.
We hope you’ll join us at our upcoming events:
O’Reilly Software Architecture Conference in New York, February 23-26, 2020
Strata Data Conference in San Jose, March 15-18, 2020
O’Reilly Artificial Intelligence Conference in San Jose, March 15-18, 2020