Chapter 1. A Brief History of Industrial and Interaction Design

THIS CHAPTER PROVIDES A brief grounding in the history of industrial and interaction design. It covers key moments and people in each discipline, highlighting pivotal events and noting points of convergence and divergence. The history of personal computing is used to trace advances in interaction design, with particular attention given to the physical or virtual nature of different computing platforms.

Even as these two disciplines find new ways to overlap, it is important to understand their individual histories. Just as empathy with users is the foundation of human-centered design, empathy for the context of other design disciplines is what allows us to productively collaborate. Additional background on industrial design is interspersed throughout the book in conjunction with the examples that illuminate each principle.

Industrial Revolution

For most of history, when people needed a particular object, they either created it themselves or found someone to make it for them. Individuals may have specialized in a craft, as shoemakers and carpenters did, but their output still consisted largely of unique, one-off creations.

There is evidence that standardized fabrication was used to produce uniform crossbows and other weaponry as early as the 4th century BC in China.[1] However, it was the rapid improvement of manufacturing capabilities during the Industrial Revolution of the 18th and 19th centuries that signaled the radical shift to mass production of identical goods. For the first time, the act of design became separated from the act of making.

Driven by this change in technology, the field of industrial design emerged to specialize in the design of commercial products that appealed to a broad audience and could be manufactured at scale. In contrast to the craftsmen of the past, these designers were challenged with meeting the needs of a large population, balancing functionality, aesthetics, ergonomics, durability, cost, manufacturability, and marketability.

The Industrial Designers Society of America (IDSA) describes industrial design as a professional service that optimizes “function, value, and appearance for the mutual benefit of both user and manufacturer.”[2] It is the study of form and function, designing the relationship between objects, humans, and spaces. Most commonly, industrial designers work on smaller-scale physical products, the kind you buy and use every day, rather than larger-scale complex environments like buildings or ships.

Whether you realize it or not, industrial design is all around you, supporting and shaping your everyday life. Consider the mobile phone in your pocket, the clock on your wall, the coffeemaker in your kitchen, and the chair you are sitting on. Everything you see, touch, and are surrounded by was designed by someone, and thus influenced by industrial design.

Throughout the 20th century, along with balancing the needs of the user and manufacturer, differences in politics and culture were evident in the design of objects. A rising consumer culture in the post-WWII period meant that manufactured goods doubled as a cultural proxy, intertwining national pride and economic reinvention. Along with regional differences, numerous philosophical and stylistic periods created distinct and recognizable eras within industrial design, including the Bauhaus school, Art Deco, Modernism, and Postmodernism.

Design for Business

On a more individual level, there are many famous industrial designers who have had an outsized influence on the history of the discipline. Raymond Loewy, a French-born American, is often referred to as the “Father of Industrial Design.”[3] Loewy is widely considered to have revolutionized the field by pioneering the role of designer as consultant, working within a wide variety of industries and mediums.

Loewy designed everything from streamlined pencil sharpeners to Coca-Cola vending machines, Studebaker automobiles, and NASA spacecraft interiors. He brought design into the mainstream business spotlight, gracing the cover of Time magazine in October 1949, where it was noted that he “made products irresistible at a time when nobody really wanted to pay for anything.”[4] Loewy intertwined culture, capitalism, and style, establishing a template for how design and business could be mutually beneficial.

Design for People

Henry Dreyfuss is another famous American industrial designer whose work and influence from the mid-20th century are still felt today. Among his iconic designs are the Honeywell T87 thermostat, the Big Ben alarm clock, the Western Electric 500 desk telephone, and the Polaroid SX-70 camera (see Figure 1-1).[5]

Figure 1-1. Honeywell T87 thermostat and Polaroid SX-70 camera, designed by Henry Dreyfuss (photo credit: Kuen Chang)

Dreyfuss was renowned not only for his attention to formal details but also for his focus on the user’s needs. He contributed significantly to the field of ergonomics, pioneering research into how human factors should be considered and incorporated into industrial design. This focus on anthropometry and usability led him to author two seminal books: Designing for People in 1955 and The Measure of Man in 1960. His interest in universal accessibility extended to graphics as well, as evidenced by his 1972 book, Symbol Sourcebook: An Authoritative Guide to International Graphic Symbols, in which Dreyfuss catalogs and promotes the use of internationally recognizable symbols over written words.

Dreyfuss felt that “well-designed, mass-produced goods constitute a new American art form and are responsible for the creation of a new American culture.”[6] But he emphasized that good design was for everyone, that “these products of the applied arts are a part of everyday American living and working, not merely museum pieces to be seen on a Sunday afternoon.”[7] He promoted this approach through his own work, but also more broadly in his role as a founding member of the American Society of Industrial Designers (ASID). When the ASID merged with the Industrial Designers Institute and the Industrial Design Education Association in 1965 to form the IDSA, Dreyfuss became the first president of the new association.

Design for Technology

Along with the needs of business and users, the history of industrial design has been strongly shaped by the introduction of new technologies, which present an opportunity to redesign and improve products. Industrial design has always been a conduit for innovation, translating the latest discoveries of science to meet the needs of everyday people.

Take, for example, the humble chair, a ubiquitous object that has become a laboratory for variation in form and materials. Figure 1-2 shows four chairs, each highlighting a shift in the possibilities of material use and manufacturing capability.

Figure 1-2. Clockwise from upper left: No.18 Thonet chair, Eames Molded Fiberglass Armchair with rocking base, Chair_One, Air-Chair (photo credit: Thonet, Herman Miller Inc.)

The No.18 Thonet chair (1876) was an evolution of the bentwood experimentation begun by Michael Thonet; this variation was released after his death in 1871.[8] Thonet pioneered a new process of bending beech wood to reduce the number of parts involved, simplifying and strengthening the chair while increasing efficiency in shipping and assembly. The aesthetic was influenced by the technology, with generous curves honestly reflecting the bent wood process.

The Eames Molded Fiberglass Chair (1950) features a smooth and continuous organic form, unique in appearance and extremely comfortable. It was originally designed in stamped metal, which proved too costly and prone to rust. Instead, a new manufacturing technique was utilized that allowed fiberglass to cure at room temperature. A boat builder who was familiar with fiberglass helped build early prototypes to prove out the concept.[9]

Jasper Morrison’s Air-Chair (1999) takes reduction of parts to the extreme, as it is constructed from a single piece of injection-molded polypropylene. Inert gas is pumped into the center of the molten plastic, resulting in a sturdy, light, and economical product that comes off the assembly line fully formed.

Konstantin Grcic’s Chair_One (2004) uses a die-cast aluminum process to achieve an original form that is at once full of voids, yet very solid; angular and sculptural at a glance, yet surprisingly more comfortable than it looks. Grcic says that “a bad chair is one that performs all of the requirements, but remains just a chair. One that I use to sit on, but then I get up and it didn’t mean anything to me.”[10] He believes that what makes good design is something hidden in the relationship you have with the object.

Design for Context

Of the chairs discussed in the previous section, the fiberglass model by the husband-and-wife design team of Charles and Ray Eames deserves further attention. The Eameses are known for their enduringly popular classic furniture designs, most of which are still being manufactured by Herman Miller. Their work often utilized new materials such as molded plywood, wire mesh, and the aforementioned fiberglass.

The Eames Molded Fiberglass Chair won second prize in the 1949 International Low-Cost Furniture Competition, primarily for its innovative base that allows it to adapt to different uses and environments such as nursery, office, home, or school. This notion of adaptability to context is a theme that runs through much of the Eameses’ multidisciplinary work, which spanned products, photography, film, and architecture.

In 1977, Charles and Ray made Powers of Ten, a short documentary film that explores context by examining the effect of scale. The film begins at the level of human perception, with a couple having a picnic on the Chicago lakeshore, and then zooms out by consecutive factors of ten to reveal the entire universe before zooming inward to the scale of a single atom. The film has been influential in encouraging designers to consider adjacent levels of context—the details of how a design relates to the next level of scale, whether that’s a room or a body part. These details are often overlooked, but as Charles once explained, “The details are not the details. They make the product.”[11]

Design for Behavior

Continuous evolution of manufacturing capabilities, business needs, human factors, materials, and contexts created a wide spectrum of ways in which industrial designers could express a particular product. However, it was the embedding of electronics into products that resulted in the most radical shift in both design possibilities and people’s relationships with objects. For the first time, the potential behavior and functionality of a product was disconnected from its physical form.

Consider the difference between a chair and a radio. Although chairs vary widely in form and materials, the way that a person uses them is largely self-evident, without instruction or confusion. With a radio, the functionality is more abstract. The shape of a knob may communicate its ability to turn, but not necessarily what it controls.

A designer of electronic products uses a mix of different controls, displays, colors, and words to communicate the purpose of various components and provide clarity in how they work together. When this is done poorly, users can be overwhelmed and confused by the possibilities and interrelationships, requiring them to read a manual before operating the product.

German industrial designer Dieter Rams is a master at simplifying these complex electronic products to their essential form (Figure 1-3). Rams designed simple, iconic products for household appliance company Braun for over 40 years, serving as its chief design officer until his retirement in 1995. His understated approach and principle of “less but better” resulted in products with a timeless and universal nature. He was restrained in the amount of language used to label knobs and switches, relying on color and information graphics to communicate a product’s underlying behavior in an intuitive manner.

Figure 1-3. Braun SK 2 Radio, designed by Dieter Rams (photo credit: Kuen Chang)

Part of Rams’s enduring legacy is his ten principles for good design,[12] which are rooted in his deep industrial design experience and remain relevant decades later to a broad range of designers. The principles we chose for this book overlap with his list, emphasizing those that relate best to UX and interaction design challenges. Much has been written about Rams’s ten principles, and we encourage you to review his list as a jumping-off point for further learning and inspiration.

Rams has influenced many contemporary designers, and between 2008 and 2012 the Less and More retrospective of his work traveled around the world, showcasing over 200 examples of his landmark designs for Braun.[13] During an interview with Gary Hustwit for his 2009 film Objectified, Dieter Rams said that Apple is one of the few companies today that consistently create products in accordance with his principles of good design.

It’s no surprise that Jonathan Ive, Apple’s Chief Design Officer, is a fan of Rams’s work and ethos. Since joining Apple in the early 1990s, the British industrial designer has overseen the launch of radical new product lines with unique and groundbreaking designs, including the iMac, iPhone, iPad, and Apple Watch (Figure 1-4). Regarding these innovations, he emphasizes that being different does not equate to being better. In reference to the first iMac design, Ive has said that “the goal wasn’t to look different, but to build the best integrated consumer computer we could. If as a consequence the shape is different, then that’s how it is.”[14]

Figure 1-4. Apple Watch, iPad, and iPhone, designed by Jonathan Ive (photo credit: Kuen Chang)

Ive’s approach seems to echo and build upon Rams’s motto of “less but better,” although the products that Apple makes are significantly more complex than the ones that Rams designed for Braun. The physical enclosure and input controls of a computing device are similar to legacy electronics, but the mutable functionality of software on a screen is its own world of complexity. The introduction of the personal computer significantly widened the separation of form and function.

In 2012, Ive was knighted by Queen Elizabeth II for his landmark achievements. In the same year, Sir Jonathan Ive’s role at Apple expanded, from leading industrial design to providing direction for all human interface design across the company.[15] This consolidation of design leadership across physical and digital products speaks to the increasing overlap between these two mediums. The best user experience relies on a harmonious integration of hardware and software, an ongoing challenge throughout the history of computing.

Computing Revolution

Interaction with the first personal computers was entirely text-based. Users typed commands and the computer displayed the result, acting as little more than an advanced calculator. Computers had shrunk in size, but this direct input and output echoed the older mainframe technology. Even the common screen width of 80 characters per line was a holdover from the 80 columns of the standard punch card. In the relationship between people and technology, these early computers favored the machine, prioritizing efficient use of the small amount of available processing power.

This early personal computing era can be likened to the time before the Industrial Revolution, with digital craftsmen making machines primarily for themselves or their friends. These computers were the domain of hobbyists, built from kits or custom assembled by enthusiasts who shared their knowledge in local computer clubs.

In 1968, at the Fall Joint Computer Conference in San Francisco, Douglas Engelbart held what became known as “The Mother of All Demos,” in which he introduced the oN-Line System, or NLS. This 90-minute demonstration was a shockingly prescient display of computing innovation, introducing for the first time modern staples such as real-time manipulation of a graphical user interface, hypertext, and the computer mouse.

Early computing pioneer David Liddle talks about the three stages of technology adoption: by enthusiasts, professionals, and consumers. It was the introduction of the graphical user interface, or GUI, that allowed the personal computer to begin its advancement through these phases.

The GUI was the key catalyst in bringing design to software. Even in its earliest incarnations, it signaled what computers could be if they prioritized people, increasing usability and accessibility despite the incredible amount of processing power required. But making software visual did not automatically make computers usable by ordinary people. That would require designers to focus their efforts on the world behind the screen.

In his book Designing Interactions, IDEO cofounder Bill Moggridge relates a story about designing the first laptop computer, the GRiD Compass, in 1979.[16] The industrial design of the Compass had numerous innovations, including the first clamshell case, in which the screen folds flat over the keyboard. It ran a custom operating system called GRiD-OS, which featured an early graphical user interface, but with no pointing device. Using this GUI prompted Moggridge to realize for the first time that his role as a designer shouldn’t stop at the physical form—it needed to include the experiences that people have with software as well.

Years later, Bill Moggridge, along with Bill Verplank, would coin the term “interaction design” as a way of distinguishing design that focuses on digital and interactive experiences from traditional industrial design.

Pioneering computer scientist and HCI researcher Terry Winograd has said that he thinks “Interaction design overlaps with [industrial design], because they both take a very strong user-oriented view. Both are concerned with finding a user group, understanding their needs, then using that understanding to come up with new ideas.”[17] Today we take for granted this approach of designing software by focusing on people, but in the Silicon Valley of the 1980s the seeds of human-centered computing were only just being planted.

The Split Between Physical and Digital

In the 1970s, influenced by Douglas Engelbart’s NLS demonstration, numerous research projects at Xerox PARC explored similar topics. The Xerox Star, released in 1981, was the first commercially available computer with a GUI that utilized the now familiar desktop metaphor. This structure of a virtual office correlated well with the transition that computing was attempting to make from enthusiasts to professional users.

The graphical desktop of the Star featured windows, folders, and icons, along with a “What You See Is What You Get” (WYSIWYG) approach that allowed users to view and manipulate text and images in a manner that represented how they would be printed. These features, among others, were a direct influence on both Apple and Microsoft as they developed their own GUI-based operating systems.

In 1983, Apple released the Lisa, its first computer to utilize a GUI. A year later, it launched the Mac, which became the first GUI-based computer to gain wide commercial success. Microsoft debuted Windows 1.0 in 1985 as a GUI overlay on its DOS operating system, but adoption was slow until 1990, with the release of the much improved Windows 3.0.

Although their operating systems had many similarities, the business models of Apple and Microsoft could not have been more different. Apple was a product company, and made money by selling computers as complete packages of hardware and software. Microsoft made no hardware at all. Instead, it licensed Windows to run on compatible computers made by third-party hardware manufacturers that competed on both features and price.

As businesses embraced computers in every office, they overwhelmingly chose Windows as a more cost-effective and flexible option than the Mac. This majority market share in turn created an incentive for software developers to write programs for Windows. Bill Gates had found a way to create a business model for software that was completely disconnected from the hardware it ran on. In the mid-1990s, even Apple briefly succumbed to pressure and licensed its Mac OS to officially run on Macintosh “clones.”

The potential for design integration that Bill Moggridge had seen between hardware and software was difficult to achieve within this business reality. The platform approach of the Windows operating system had separated the physical and digital parts of the personal computer. Companies tended to focus on hardware or software exclusively, and designers could make few assumptions about how they were combined by end users.

Although the GUI used a spatial metaphor, the variety of monitor sizes and resolutions made it difficult to know how the on-screen graphics would be physically represented. The mouse and the standard 102-key keyboard acted as a generic duo of input devices, dependable but limited. Software emerged as a distinct and autonomous market, which contributed to the largely separate evolution of interaction and industrial design.

As software took on new and varied tasks, interaction designers sought inspiration and expertise not only from traditional design fields but from psychology, sociology, communication studies, and computer science. Meanwhile, industrial designers continued to focus primarily on the physical enclosures of computers and input devices. After all, computing was only one of a vast range of industries that industrial designers worked within.

Information Revolution

In 1982, the Association for Computing Machinery (ACM) recognized the growing need to consider users in the design of software by creating the Special Interest Group on Computer-Human Interaction (SIGCHI). Shortly after, the field of Human-Computer Interaction (HCI) emerged as a recognized subdiscipline of computer science.

Because designing how people use digital systems was so new, and because the task required integrating so many fields of knowledge, it became a vibrant research area within multiple fields of study (psychology, cognitive science, architecture, library science, etc.). In the early days, however, actually making software always required the skills of an engineer. That changed in 1993 with the launch of the Mosaic web browser, which brought to life Tim Berners-Lee’s vision for the World Wide Web. The Internet had been around for years, but the graphical nature of the Web made it much more approachable.

The Web was an entirely new medium, designed from the ground up around networks and virtuality. It presented a clean slate of possibility, open to new forms of interaction, new interface metaphors, and new possibilities for interactive visual expression. Most importantly, it was accessible to anyone who wanted to create their own corner of the Web, using nothing more than the simple HyperText Markup Language (HTML).

From the beginning, web browsers came with a “View Source” capability that allowed anyone to see how a page was constructed. This openness, combined with the low learning curve of HTML, meant that a flood of new people with no background in computer science or design began shaping how we interact with the Web.

The Web hastened the information revolution and accelerated the idea that “information wants to be free.” Free to share, free to copy, and free of physicality. Microsoft Windows had distanced software from the machines it ran on, but the Web pushed interactive environments into an entirely virtual realm. A website could be accessed from any computer, regardless of size, type, or brand.

By the mid-1990s, Wired had described web users as Netizens, socializing in virtual reality was an aspiration, and there was growing excitement that ecommerce could replace brick-and-mortar stores. The narrative of progress in the late 20th century was tied to this triumph of the virtual over the physical. The future of communication, culture, and economics increasingly looked like it would play out in front of a keyboard, in the world on the other side of the screen.

Standing on the shoulders of previous pioneers, the flood of designers native to the Web used the very medium they were building to define new interaction patterns and best practices. The Web had brought about the consumer phase of computing, expanding the scope and influence of interaction design to a level approaching that of its older, industrial cousin.

Smartphones

Early mobile phones had limited functionality, primarily centered on making voice calls and sending SMS messages. The introduction of the Wireless Application Protocol (WAP) brought a primitive browser to phones so they could access limited information services like stock prices, sports scores, and news headlines. But WAP was not a full web experience, and its limited capabilities, combined with high usage charges, led to low adoption.

Even as mobile phones began accumulating additional features such as color screens and high-quality ringtones, their software interactions remained primitive. One contributing factor was the restrictive environment imposed by the carriers. The dominant wireless carriers (AT&T, Sprint, T-Mobile, and Verizon) didn’t make the operating systems that powered their phones, but they controlled how those systems were configured and dictated what software was preinstalled.

Decisions about which applications to include were often tied to business deals and marketing packages, not consumer need or desire. The limited capabilities and difficult installation process for third-party apps meant that they were not widely used. This restrictive environment was the opposite of the openness on the Web—a discrepancy that was strikingly clear by 2007, when Apple launched the iPhone and disrupted the mobile phone market.

Just as Microsoft’s Windows OS had created a platform for desktop software to evolve, it was Apple’s turn to wield a new business model that would dramatically shift the landscape of software and interaction.

Although the original iPhone was restricted to the AT&T network, the design of the hardware and software was entirely controlled by Apple. This freedom from the shackles of the carrier’s business decisions gave the iPhone an unprecedented possibility for a unified experience.

For the original release, that unified experience centered on the Web. Mobile Safari was the first web browser on a phone to render the full Web, not a limited WAP experience. A year later, a software update introduced the App Store, allowing third-party applications to be installed. This was the beginning of yet another new era for interaction design, as the focus shifted not only to a mobile context but to the reintroduction of physicality as an important constraint and design opportunity.

The interaction paradigm of the iPhone, and of the wave of smartphones that followed, is direct touch manipulation: selecting, swiping, and pinching to navigate between and within apps. Touchscreens had existed for decades, but this mass standardization on one particular screen size awoke interaction designers to the physical world in a way that desktop software and the Web never had. Respecting the physical dimensions of the screen became critically important to ensure that on-screen elements were large enough for the range of hands that would interact with them.
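Because the early iPhone offered a single screen size and pixel density, this mapping between physical and on-screen dimensions comes down to simple arithmetic. The sketch below illustrates the conversion; the commonly quoted 163 PPI density of the original iPhone and the roughly 7 mm fingertip-sized target are reference figures assumed for illustration, not values taken from this chapter.

```python
# A minimal sketch of the physical-to-pixel arithmetic described above.
# Assumptions for illustration: the commonly quoted 163 PPI density of the
# original iPhone, and a rough ~7 mm fingertip-sized touch target.

MM_PER_INCH = 25.4

def mm_to_pixels(size_mm: float, ppi: float) -> float:
    """Convert a physical size in millimeters to pixels at a given density."""
    return size_mm / MM_PER_INCH * ppi

def pixels_to_mm(size_px: float, ppi: float) -> float:
    """Convert a size in pixels back to physical millimeters."""
    return size_px / ppi * MM_PER_INCH

ORIGINAL_IPHONE_PPI = 163  # 320 x 480 pixels on a 3.5-inch display

# A ~7 mm target works out to roughly 45 pixels on the original iPhone,
# close to Apple's well-known 44-point minimum tap-target guideline.
print(round(mm_to_pixels(7, ORIGINAL_IPHONE_PPI)))      # -> 45
print(round(pixels_to_mm(44, ORIGINAL_IPHONE_PPI), 1))  # -> 6.9
```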

Knowing the physical dimensions of the touchscreen also led to new opportunities, allowing designers to craft pixel-perfect interface layouts with confidence in how they would be displayed to the end user. This ability to map screen graphics to physical dimensions was concurrent with the rise of a new graphical interface style that directly mimicked the physical world. This visual style, often called skeuomorphism, presents software interfaces as imitations of physical objects, using simulated textures and shadows to invoke rich materials such as leather and metal.

Although often heavy-handed and occasionally in bad taste, these graphical references to physical objects, combined with direct touch manipulation, reduced the learning curve for this new platform. Katherine Hayles, in her book How We Became Posthuman, describes skeuomorphs as “threshold devices, smoothing the transition between one conceptual constellation and another.”[18] The skeuomorphic user interface helped smartphones become the most rapidly adopted new computing platform ever.[19]

Today, skeuomorphic interface styles have fallen out of favor. One reason is that we no longer need their strong metaphors to understand how touchscreens work; we have become comfortable with the medium. Another factor is that touchscreen devices now come in such a wide variety of sizes that designers can no longer rely on their designs rendering with the kind of physical exactness that the early years of the iPhone afforded.

The iPhone was also a bellwether of change for industrial design. Smartphones are convergence devices, embedding disparate functions that render a variety of single-purpose devices redundant. Examples of separate, physical devices that are commonly replaced with apps include the calculator, alarm clock, audio recorder, and camera. Products that traditionally relied on industrial designers to provide a unique physical form were being dematerialized—a phenomenon that investor Marc Andreessen refers to as “software eating the world.”[20]

At the same time, the physical form of the smartphone was very neutral, designed to disappear as much as possible, with a full-screen app providing the device’s momentary purpose and identity. This was a shift from the earlier mobile phones, where the carriers differentiated their models primarily through physical innovation such as the way a phone flipped open or slid out to reveal the keypad.

Even as interaction designers introduced physical constraints and metaphors into their work, industrial designers saw their expertise underutilized. The rise of the smartphone made inventor and entrepreneur Benny Landa’s prediction that “everything that can become digital, will become digital” seem truer than ever. For industrial design, which throughout the 20th century had always defined the latest product innovations, this was a moment of potential identity crisis.

Smart Everything

The general-purpose smartphone continues to thrive, but today these convergence devices are being complemented by an array of single-use “smart” devices. Sometimes referred to collectively as the Internet of Things, these devices use embedded sensors and network connectivity to enhance and profoundly change our interactions with the physical world.

This introduces design challenges and possibilities well beyond a new screen size. Smart devices can augment our natural interactions that are already happening in the world, recording them as data or interpreting them as input and taking action. For example:

  • The Fitbit activity tracker is worn on your wrist, turning every step into data.

  • The Nest Thermostat can detect that you’ve left the house and turn down the temperature.

  • The August Smart Lock can sense your approach and automatically unlock the door.

  • The Apple Watch lets you pay for goods by simply raising your wrist to a checkout reader.

The smartphone required designers to consider the physicality of users in terms of their fingertips. These new connected devices require a broader consideration of a person’s full body and presence in space.

Over the last few decades, opinions have oscillated on the superiority of general-purpose technology platforms versus self-contained “information appliances.” Today’s “smart devices” represent a middle ground, as these highly specialized objects often work in conjunction with a smartphone or web server that provides access to configuration, information display, and remote interactions.

Open APIs allow devices to connect to and affect each other, using output from one as the input to another. Services such as IFTTT (IF This Then That) make automating tasks between connected devices trivial. For example, one IFTTT recipe turns on a Philips Hue light bulb in the morning when your Jawbone UP wristband detects that you have woken up.
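To make this trigger-and-action pattern concrete, here is a minimal sketch of an IFTTT-style recipe in Python. The Recipe, Wristband, and SmartBulb classes are hypothetical stand-ins invented for illustration; they do not represent the actual IFTTT service or the Jawbone and Philips Hue APIs.

```python
# A minimal sketch of the IF-this-THEN-that pattern described above.
# All classes and methods here are hypothetical stand-ins, not real APIs.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    """Pairs a trigger condition with an action, like an IFTTT recipe."""
    trigger: Callable[[], bool]  # the "IF this" condition to check
    action: Callable[[], None]   # the "THEN that" action to run

class Wristband:
    """Hypothetical sleep tracker, standing in for a Jawbone UP."""
    def __init__(self) -> None:
        self.wearer_awake = False

class SmartBulb:
    """Hypothetical connected bulb, standing in for a Philips Hue."""
    def __init__(self) -> None:
        self.on = False

    def turn_on(self) -> None:
        self.on = True
        print("Bedroom light on")

band = Wristband()
bulb = SmartBulb()

# IF the wristband detects that you have woken up,
# THEN turn on the bedroom light.
wake_up_light = Recipe(trigger=lambda: band.wearer_awake, action=bulb.turn_on)

band.wearer_awake = True  # simulate the morning wake-up event
if wake_up_light.trigger():
    wake_up_light.action()
```

The value of an open, composable model like this is that the output of any device can become the input of any other, without either manufacturer having anticipated the pairing.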

Unfortunately, not all connected devices play nice with others, and too often the smartphone is treated as the primary point of interaction. This makes sense when you want to change your home’s temperature while at the office, or check the status of your garage door while on vacation. But if adjusting your bedroom lighting requires opening an app, it certainly doesn’t deserve the label “smart.”

We find ourselves in yet another transitional technology period, where the physical and digital blur together in compelling but incomplete ways. There is potential for connected devices to enhance our lives, giving us greater control, flexibility, and security in our interactions with everyday objects and environments. There is promise that we can interlace our digital and physical interactions, reducing the need for constant engagement with a glowing screen in favor of more ambient and natural interactions within our surroundings. But there is also a danger that connecting all of our things simply amplifies and extends the complexity, frustration, and security concerns of the digital world.

The technical hurdles facing the Internet of Things are rapidly being overcome. The primary challenge today lies in designing a great user experience, at the level of both an individual device and how it works within a unified system. This will require designers who can extend beyond their disciplinary silos, who understand the constraints and possibilities at the intersection of digital and physical. The future of user experience isn’t constrained to a screen, which is why interaction designers today need to better understand industrial design.



[1] Joseph Needham, Science and Civilisation in China, Volume 1: Introductory Orientations (Cambridge, UK: Cambridge University Press, 1954).

[2] “What Is Industrial Design?” Industrial Designers Society of America, accessed January 22, 2015, http://www.idsa.org/education/what-is-id.

[3] “The Father of Industrial Design: Raymond Loewy,” The Official Site of Raymond Loewy, accessed January 22, 2015, http://www.raymondloewy.com.

[4] Olivia B. Waxman, “Google Doodle Honors Raymond Loewy, the ‘Father of Industrial Design,’” Time, November 5, 2013, accessed January 22, 2015, http://ti.me/1Nuu0h9.

[5] “Henry Dreyfuss, FIDSA,” Industrial Designers Society of America, accessed January 22, 2015, http://www.idsa.org/content/henry-dreyfuss-fidsa.

[6] Henry Dreyfuss, Designing for People (New York: Simon and Schuster, 1955), 82–83.

[7] Ibid.

[8] “History,” Thonet, accessed January 22, 2015, http://www.thonet.com.au/history/.

[9] Kaitlin Handler, “The History of the Eames Molded Plastic Chairs,” Eames Official Site, May 4, 2014, accessed December 5, 2015, http://bit.ly/1UbYw0l.

[10] “On Design: Konstantin Grcic,” NOWNESS, accessed January 22, 2015, http://bit.ly/1HYjFd9.

[11] Daniel Ostroff, “The Details Are Not the Details...” Eames Office, September 8, 2014, accessed December 5, 2015, http://bit.ly/1MlcCWP.

[12] “Dieter Rams: Ten Principles for Good Design,” Vitsœ, accessed January 22, 2015, https://www.vitsoe.com/gb/about/good-design.

[13] “Less and More: The Design Ethos of Dieter Rams,” San Francisco Museum of Modern Art, accessed January 22, 2015, archived here: http://archv.sfmoma.org/exhib_events/exhibitions/434.

[14] Leander Kahney, Jony Ive: The Genius Behind Apple’s Greatest Products (New York: Penguin Putnam Inc., 2013), 125.

[15] “Apple Announces Changes to Increase Collaboration Across Hardware, Software & Services,” Apple Inc., October 29, 2012, accessed January 22, 2015, http://apple.co/1mrHjox.

[16] Bill Moggridge, Designing Interactions (Cambridge, MA: MIT, 2007).

[17] Jenny Preece, Yvonne Rogers, and Helen Sharp, Interaction Design: Beyond Human-Computer Interaction (Hoboken, NJ: Wiley, 2002), 70.

[18] Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago, IL: University of Chicago Press, 1999), 17.

[19] Michael DeGusta, “Are Smart Phones Spreading Faster than Any Technology in Human History?” Technology Review, May 9, 2012, accessed January 20, 2015, http://bit.ly/1fEBvj0.

[20] Chris Anderson, “The Man Who Makes the Future: Wired Icon Marc Andreessen,” Wired, April 24, 2012, accessed December 17, 2014, http://bit.ly/1IoZe92.
