“Without addressing privacy and trust, the Internet of Things will not reach its full potential.”
This refrain can be heard at IoT conferences, in opinion pieces in the press and in normative academic literature. If we don’t “get it right,” then consumers won’t embrace the IoT and all of the wonderful commercial and societal benefits it portends.
This is false.
It’s a nice idea, imagining that concern for privacy and security will curtail or slow technological growth. But don’t believe it: the Internet of Things will develop whether or not privacy and security are addressed. Economic imperative and technological evolution will propel the IoT, with its tremendous potential for increased monitoring, forward; citizen concern plays only a minor role in operationalizing privacy. Certainly, popular discourse on the subject is important, but developers, designers, policy-makers and manufacturers are the key actors in embedding privacy architectures within new connected devices.
Unsurprisingly, much current research shows that people are still uncomfortable and feel overexposed regarding their privacy. The prolific Pew Research Center tells us:
“…Americans feel privacy is important in their daily lives in a number of essential ways. Yet, they have a pervasive sense that they are under surveillance when in public and very few feel they have a great deal of control over the data that is collected about them and how it is used.”
Other survey research similarly reports a “rather high general perception of risks related to the disclosure of personal information online” and “very strong expectations that personal information is used by the website owners / shared with third parties without the users’ knowledge and consent.”
However, it’s very hard to prove that people decline to buy IoT products now, or will in the future, because of these concerns. Such proof would be difficult to obtain: it would require large surveys that not only laid out a common definition of the IoT (something experts have yet to agree on) but also showed a strong relationship between consumers declining to purchase IoT-like products and concerns over their privacy and security risks. Alternatively, specific vendors could disclose lackluster sales of their latest devices, ask consumers why they aren’t buying, and find that the answer is poor privacy characteristics. (Such disclosures are not in vendors’ interests.) The absence of this proof, however, does not stop professionals and the commentariat from proclaiming this constraint on the growth of connected devices. I call this ‘The Orthodoxy of Chilled Innovation.’
We’ve seen this before
The IoT is only the latest market domain in which we hear this orthodoxy. To wit:
1995: The global marketplace is doomed!
“Unless … adequate protection for copyrighted works is ensured, the vast communications network will not reach its full potential as a true, global marketplace.” (Copyright violation was and is a rampant problem.)
2000: Electronic commerce is doomed!
“The [Federal Trade] Commission believes that its proposed legislation, in conjunction with self-regulation, will ensure important protections for consumer privacy at a critical time in the development of the online marketplace. Without such protections, electronic commerce will not reach its full potential and consumers will not gain the confidence they need in order to participate fully in the electronic marketplace.” (The proposed legislation never came to pass.)
2000: The national information infrastructure is doomed!
“Unless security and privacy are protected, the [national information infrastructure] won’t reach its full potential.” (Seems to be healthy and evolving.)
The networked economy is doomed!
“A networked economy will only reach its full potential if sectoral boundaries are dismantled and an even take-up of ICT in society is ensured.” (There are plenty of uneven socio-economic qualities to the Internet and related technologies.)
Online business is doomed!
“If we don’t get privacy right then the online consumer will revolt, which will negatively impact everyone involved in online businesses.” (A consumer revolt is a fantasy.)
Time proved these vague assertions hollow: the Internet, the US national information infrastructure and e-commerce are doing just fine. The privacy and security risks have not been addressed in any radical or comprehensive way, and people are still communicating, buying and surfing.
The warm fuzzies
So, what accounts for this orthodoxy? My theory is that it’s an attractive, intuitive argument influenced by the collective vulnerability people feel. Starting from the research that says people are worried about the intrusiveness of technology, one can imagine a desire to believe that our worries will translate into a will to slow things down, or a wariness on the part of IoT vendors. The argument that privacy and security must be addressed for the IoT to blossom, then, can be met with head nodding and warm feelings because it assuages fears.
It is, however, an empty sentiment. The Internet of Things, whatever it is, will happily march along with lousy privacy and security, and we will be the poorer for it. A collective sense of privacy loss is only a small part of what drives improvements in privacy preservation. Certainly, businesses large and small do think about what the populace might find “creepy,” but there is a wide gulf between considering opinions that might affect sales and actually baking privacy into devices. One should not confuse marketing with engineering or business practice. Regarding the IoT, the Orthodoxy of Chilled Innovation ignores recent history and economic logic: businesses seek frictionless transactions, privacy is rarely a differentiator, security and privacy become more opaque topics over time, and businesses behave according to their (absence of) regulatory regimes. The danger of the Orthodoxy is that it may lull people into believing that something will address their sense of privacy loss before the IoT remakes our world into a digital utopia: a false sense of security.
Privacy does not protect itself, nor do markets arc toward the social goals of privacy and consumer protection on their own. Privacy is a technocratic pursuit: designers, engineers, product managers, risk and compliance managers, and company leaders are ultimately the ones who can actively improve the privacy posture of their devices. Complementing this are technology-neutral information policies that require privacy impact assessments, security assessments and consumer protection. Admittedly, this is an unpopular view: privacy protection as, in part, a paternalistic pursuit by the state. It flies in the face of the economically driven belief that self-regulation is the main force by which we should engender privacy, or that “[e]ducating and empowering citizens is the better way” to address privacy failures – two more orthodoxies. Markets in liberal democracies cannot exist without regulation, and regulation itself is not sufficient to effect the protections we seek. Privacy protection occurs through a plurality of necessary but insufficient steps. Wishing is not one of them.