Chapter 15. Invisible Tools or Emotionally Supportive Pals?
Try juxtaposing these two thoughts:
Researchers are telling us that, emotionally and intellectually, we respond more and more to digital machines as if they were people. Such machines, they say, ought to be designed so as to be emotionally supportive. (“Good morning, John. You seem a little down today. Bummer.”) Stanford social researchers B. J. Fogg and Clifford Nass propose this rule for designers of human-machine interfaces: “Simply put, computers should praise people frequently—even when there may be little basis for the evaluation” (Fogg and Nass 1997). Leaving questions of sincerity and ethics aside, this is thought to be quite reasonable, since machines are obviously becoming ever more human-like in their capabilities.
The common advice from other human-computer interface experts is that we should design computers and applications so that they become invisible—transparent to the various aims and activities we are pursuing. They shouldn’t get in our way. For example, if we are writing, we should be able to concentrate on our writing without having to divert attention to the separate requirements of the word processor.
The two pieces of advice may not strictly contradict each other, but their conjunction is nevertheless slightly odd. Treat machines like people, but make them invisible if possible? Combining the two ideals wouldn’t say much for our view of people. It sounds as though we’re traveling down two rather ...