Chapter Twenty-One. Five Pitfalls in the Design for Privacy
Scott Lederer, Jason I. Hong, Anind K. Dey, and James A. Landay
To participate in meaningful privacy practice in the context of technical systems, people require opportunities to understand the extent of the systems’ alignment with relevant practice and to conduct discernible social action through intuitive or sensible engagement with the system. To help designers support these processes, this chapter describes five pitfalls to beware when designing interactive systems—on or off the desktop—with personal privacy implications. These are based on a review of the literature, on analyses of existing privacy-affecting systems, and on our own experiences designing a prototypical user interface for managing privacy in ubiquitous computing (ubicomp).[1]
Introduction
One possible reason why designing privacy-sensitive systems is so difficult is that, by refusing to render its meaning plain and knowable, privacy simply lives up to its name. Instead of exposing an unambiguous public representation for all to see and comprehend, it cloaks itself behind an assortment of meanings, presenting different interpretations to different people. When sociologists look at privacy, they see social nuance that engineers overlook. When cryptologists consider privacy, they see technical mechanisms that everyday people ignore. When the European Union looks at privacy, it sees moral expectations that American policymakers do not. Amid this fog of heterogeneous ...