In this best of 2016 episode, I revisit a conversation from earlier this year with Cory Doctorow, a journalist, activist, and science fiction writer. We discuss the unexpected places where digital rights management (DRM) pops up, how it hinders artistic expression and legitimate security research, and the unanticipated (and often dangerous) consequences of copyright exemptions.
Early in 2016, Cory and the Electronic Frontier Foundation (EFF) launched a lawsuit against the U.S. government. They are representing two plaintiffs—Matthew Green and Bunnie Huang—in a case that challenges the constitutionality of Section 1201 of the Digital Millennium Copyright Act (DMCA). The DMCA is a notoriously complicated copyright law that was passed in 1998. Section 1201 is the part that relates to bypassing DRM. The law says that it's against the rules to bypass DRM, even for lawful purposes, and it imposes very severe civil and criminal penalties: the statute provides for a $500,000 fine and a five-year prison sentence for a first offense. Here, Cory explains some of the more subtle consequences that arise from DRM in unexpected places.
An urgent need to protect individual rights and freedoms
Everything has software. Therefore, manufacturers can invoke the DMCA to defend anything they’ve stuck a thin scrim of DRM around, and that defense includes the ability to prevent people from making parts. All they need to do is add a little integrity check, like the ones that have been in printers for forever, that asks, ‘Is this part an original manufacturer's part, or is it a third-party part?’ Original manufacturer's parts get used; third-party parts get refused. Because that check restricts access to a copyrighted work, bypassing it is potentially a felony. Car manufacturers use it to lock you into buying original parts.
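The integrity check Cory describes can be sketched in a few lines. This is a hypothetical illustration, not any real vendor's protocol: the assumption is a challenge-response scheme in which genuine parts hold a manufacturer-provisioned secret key and third-party parts do not. All names (`OEM_SECRET`, `device_accepts_part`) are invented for the sketch.

```python
import hashlib
import hmac
import os

# Hypothetical secret burned into genuine (original manufacturer) parts.
OEM_SECRET = b"manufacturer-provisioned-key"


def part_response(challenge: bytes, part_key: bytes) -> bytes:
    """What a part's chip computes when the device challenges it."""
    return hmac.new(part_key, challenge, hashlib.sha256).digest()


def device_accepts_part(part_key: bytes) -> bool:
    """Device side: issue a random challenge and verify the answer.

    Only a part holding OEM_SECRET can produce the matching HMAC,
    so third-party parts get refused.
    """
    challenge = os.urandom(16)
    expected = hmac.new(OEM_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, part_response(challenge, part_key))


print(device_accepts_part(OEM_SECRET))          # genuine part: True
print(device_accepts_part(b"third-party-key"))  # third-party part: False
```

Note that the check does nothing to protect the copyrighted firmware itself; its only job is to distinguish parts. Yet because bypassing it means bypassing an access control on a copyrighted work, Section 1201 can be invoked against anyone who defeats it.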
This is a live issue. Apple has deprecated the 3.5-millimeter audio jack on their phones in favor of using a digital interface. If they put DRM on that digital audio interface, they can specify at a minute level—and even invent laws about—how customers and plug-in product manufacturers can engage with it. Congress has never said, ‘You're not allowed to record anything coming off your iPhone,’ but Apple could set a “no record” flag on audio coming out of that digital interface. Then they could refuse to license users to decrypt the audio, making any tool that records it illegal. Simply by using the device, users would be agreeing to accept and honor that no-record stipulation, and bypassing it would be illegal.
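A "no record" flag of the sort described above might look like the following. This is a minimal sketch under stated assumptions—nothing here reflects Apple's actual interface, which the interview does not detail; the `AudioStream` type and its `no_record` field are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AudioStream:
    """Hypothetical decrypted audio plus its usage-policy metadata."""
    samples: bytes
    no_record: bool  # usage flag set by the sender


def try_record(stream: AudioStream) -> Optional[bytes]:
    """A compliant recorder: honor the flag or capture nothing.

    The refusal is pure policy, not physics--the samples are right
    there--which is why ignoring the flag requires bypassing DRM,
    and why that bypass is what Section 1201 makes illegal.
    """
    if stream.no_record:
        return None
    return stream.samples


print(try_record(AudioStream(b"\x00\x01", no_record=True)))   # None
print(try_record(AudioStream(b"\x00\x01", no_record=False)))  # b'\x00\x01'
```

The key point the sketch makes concrete: the restriction lives entirely in software that the user is forbidden to modify, which lets the manufacturer impose rules no legislature ever passed.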
DRM hinders legitimate research and artistic expression
Matthew Green [one of the plaintiffs in the EFF lawsuit] has a National Science Foundation grant to study a bunch of technologies with DRM on them, and the Copyright Office explicitly said he is not allowed to do research on those technologies. The Copyright Office did grant a limited exemption to the DMCA to research consumer products, but it excludes things like aviation systems or payment systems like Green wants to research. Bunnie Huang [the other plaintiff] is running up against similar limitations on bypassing DRM to make narrative films with extracts from movies.
We have one branch of the government refusing to grant these exemptions. We have the highest court in the land saying that without fair use, copyright is not constitutional. And we have two plaintiffs who could be criminal defendants in the future if they continue to engage in the same conduct they've engaged in in the past. This gives us standing to now ask the courts whether it’s constitutional for the DMCA to apply to technologies that enable fair use, and whether the Copyright Office really does have the power to determine what they grant exemptions for. Our winning this case would effectively gut Section 1201 of the DMCA for all of the anticompetitive and the security-limiting applications that it's found so far.
DMCA exemptions can have serious consequences
The Copyright Office granted an exemption for tablets and phones so people could jailbreak them and use alternate stores. This exemption allows individuals to write the necessary software to jailbreak their own personal devices but does not allow individuals to share that tool with anyone else, or publish information about how it works or information that would help someone else make that tool. So, now we have this weird situation where people have to engage in illegal activity (trafficking in a tool by sharing information about how to jailbreak a phone) to allow the average user to engage in a legal activity (jailbreaking their device). This is hugely problematic from a security perspective. Anyone can see the danger of seeking out randos to provide binaries that root a mobile device. To avoid prosecution, those randos are anonymous. And because it’s illegal to give advice about how the tool works, people have no recourse if it turns out that the advice they follow is horribly wrong or ends up poisoning their device with malware.
This is a disaster from stem to stern—we're talking about the supercomputer in your pocket with a camera and a microphone that knows who all your friends are. It's like Canada’s recent legalization of heroin use without legalizing heroin sales. A whole bunch of people died of an overdose because they got either adulterated heroin or heroin that was more pure than they were used to. If the harm reduction you’re aiming for demands that an activity be legal, then the laws should support safe engagement in that activity. Instead, in both the heroin and device jailbreak examples, we have made these activities as unsafe as possible. It's really terrible. The security implications really matter, because we hear about vulnerabilities and zero-days and breaks against IoT devices every day in ways that are really, frankly, terrifying. Last winter, it was people accessing baby monitors; this week, it was ransomware for IoT thermostats and breaks against closed-circuit televisions in homes.