Chapter 1. Introducing Interactive Gestures

"One of the things our grandchildren will find quaintest about us is that we distinguish the digital from the real."

William Gibson in a Rolling Stone interview, November 7, 2007

A man wearing special gloves stands in front of a large, translucent screen. He waves his hand in front of it, and objects on the screen move. It's as though he's conducting an orchestra or is some sort of high-tech sorcerer's apprentice, making objects fly about with just a sweep of his arm. He makes another gesture, and a video begins to play. With both hands, he stretches the video to a larger size, filling more of the screen. It's like magic.

Another place, another time: a different man stands in front of an audience. He's running his fingers over a table-size touchscreen before him as though he is a keyboard player in a rock band, his fingers rapidly manipulating images on the screen by dragging them around. He's making lines appear on-screen with his fingers and turning them into silky, ink-like paintings. He's playing, really—showing off. He drags his fingers across the surface and leaves a trail of bubbles. It's also like magic.

The first man doesn't really exist, although you'd probably recognize the actor playing him: Tom Cruise. The scene is from the movie Minority Report (2002), and it gave the general public its first look at a computer that responds to gestures instead of to speech, a keyboard, or a mouse. It was an impressive feat of visual effects, and it made a huge impression on people everywhere, especially interaction designers, some of whom had been working on or thinking about similar systems for years.

The second man does exist, and his name is Jeff Han. Not only did his jumbo touchscreen devices influence Minority Report, but his live demonstrations—first privately and then publicly at the 2006 TED conference[1]—will likely go down in computer history alongside the "Mother of All Demos" presentation that Doug Engelbart made in 1968, in which he showed now-familiar idioms such as email, hypertext, and the mouse. Han's demos sparked thousands of conversations, blog posts, emails, and commentary.

Figure 1-1. Jeff Han demos a multitouch touchscreen at the 2006 TED conference. Since then, Han has created Perceptive Pixel, a company that produces these devices for high-end clients. Courtesy TED Conferences, LLC.

Since then, consumer electronics manufacturers such as Nintendo, Apple, Nokia, Sony Ericsson, LG, and Microsoft have all released products that are controlled using interactive gestures. It's not an exaggeration to say that within the next several years, hundreds of millions of devices will have gestural interfaces. A gesture, for the purposes of this book, is any physical movement that a digital system can sense and respond to without the aid of a traditional pointing device such as a mouse or stylus. A wave, a head nod, a touch, a toe tap, and even a raised eyebrow can be a gesture.

In addition to the touchscreen kiosks that populate our airports and the ATMs that handle our banking, the most famous of the recent products that use gestures are Nintendo's Wii and Apple's iPhone and iPod Touch. The Wii has a set of wireless controllers that users hold to play its games. Players make movements in space that are then reflected in some way on-screen. The iPhone and iPod Touch are devices that users control by touching the screen, manipulating digital objects with a tap of a fingertip.

Figure 1-2. Rather than focusing on the technical specs of the gaming console like their competitors, Nintendo designers and engineers focused on the controllers and the gaming experience, creating the Wii, a compelling system that uses gestures to control on-screen avatars. Courtesy Nintendo.

TAP IS THE NEW CLICK

We've entered a new era of interaction design. For the past 40 years, we have been using the same human-computer interaction paradigms that were designed by the likes of Doug Engelbart, Alan Kay, Tim Mott, Larry Tesler, and others at SRI and Xerox PARC in the 1960s and 1970s. Cut and paste. Save. Windows. The desktop metaphor. And so many others that we now don't even think about when working on our digital devices. These interaction conventions will continue, of course, but they will also be supplemented by many others that take advantage of the whole human body, of sensors, of new input devices, and of increased processing power.

We've entered the era of interactive gestures.

The next several years will be seminal for the interaction designers and engineers who will create the next generation of interaction design inputs, possibly defining them for decades to come. We will design new ways of interacting with our devices, environment, and even each other. We have an opportunity that comes along only once in a generation, and we should seize it. How we can create this new era of interactive gestures is what this book is about.

Currently, most gestural interfaces can be categorized as either touchscreen or free-form. Touchscreen gestural interfaces—or, as some call them, touch user interfaces (TUIs)—require the user to be touching the device directly. This puts a constraint on the types of gestures that can be used to control it. Free-form gestural interfaces don't require the user to touch or handle them directly. Sometimes a controller or glove is used as an input device, but even more often (and increasingly so) the body is the only input device for free-form gestural interfaces.

Our relationship to our digital technology is only going to get more complicated as time goes on. Users, especially sophisticated users, are slowly being trained to expect that devices and appliances will have touchscreens and/or will be manipulated by gestures. But it's not just early adopters: even the general public is being exposed to more and more touchscreens via airport and retail kiosks and voting machines, and these users are discovering how easy and enjoyable they are to use.



[1] Watch the demo yourself at http://www.ted.com/index.php/talks/view/id/65.
