User testing Hackaball with children (source: Made by Many)

User experience (UX) design and human–computer interaction (HCI) emerged in a world of desktop computers. But our experience of computing has changed radically in the past 10–15 years. Many of our interactions now take place on mobile phones, tablets, ereaders, and smart TVs. And it’s common to use one service across multiple devices with different form factors (Figure 1-1).

Figure 1-1. BBC iPlayer can be used on connected TVs, smartphones, tablets, PCs, game consoles, and set-top boxes (image: BBC)

We’re still figuring out the best ways to design for new devices and experiences. Interactions can happen in a wide variety of contexts, especially for mobile devices. They can happen on a variety of scales, from tiny wrist-tops, to smartphones, to TV user interfaces (UIs) viewed from 10 feet away. Even academic researchers in HCI have published relatively few papers on cross-platform design.

The “Internet of Things” (IoT) refers to the growing range of everyday objects acquiring connectivity, sensing abilities, and increased computing power. In consumer terms, some common categories currently include:

  • Connected home technology (such as thermostats, lighting, and energy monitoring)
  • Wearables (such as activity/fitness trackers and “smart” watches)
  • Medical/wellness devices (such as bathroom scales and blood pressure monitors)
  • Connected cars (which may provide access to smartphone apps via dashboard controls, engine diagnostics, and automatic alerting of authorities in case of a crash)
  • Urban systems (such as air quality sensors, city rental bikes, and parking meters/sensors)

Designing for the Internet of Things raises all the challenges of cross-platform design, and more.

An obvious difference is the much wider variety of device form factors, many without screens (see Figure 1-2).

Less obvious differences include the effects of many IoT devices being only intermittently connected. And even a simple task, like unlocking a door, can quickly become complex when it forms part of a system spanning many interconnected devices, services, and users.

IoT is still a technically driven field. At the time of writing, the UX of many IoT products is some way off the level expected of mature consumer products. For example, the UK government commissioned a study on the usability of connected heating systems in late 2013. It found that none of the five major connected heating devices on the market in the UK offered a good UX.1

Figure 1-2. The Lockitron connected door lock is one of a huge number of connected devices with no screen (image: Lockitron)

Before we start, we should explain what we mean by “UX” and “user experience design.” Many people equate the term with “UI” or “user interface design,” but they are not the same. UX is a holistic term referring to a wide range of design disciplines involved in creating systems that are useful, usable, and pleasurable to use. UI design is just one of those. As the UX consultant Elisabeth Hubert explains it:

The user interface is not the (design) solution, but instead is the medium through which users interact with the solution.2

(In “A Design Model for IoT,” we’ll look in more detail at the facets of design that comprise UX for an IoT system.)

In this chapter, we begin by introducing the differentiators that make UX design for IoT a new and challenging domain.

This chapter introduces:

  • What’s different about UX for IoT
  • A design model for IoT

It considers the following issues:

  • The challenges of distributing functionality across multiple devices
  • How the focus of the UX is increasingly in the service
  • Whether we are ready for the real world to start behaving like the Internet
  • How the ways devices connect to the network affects the UX
  • How multiple devices create more complexity for the user to understand
  • How controlling distributed devices is similar to programming
  • How what seem like simple systems can rapidly become complex
  • The problems of having many different technical standards
  • How data is at the core of many IoT services
  • The layers of UX thinking required to create a successful IoT product: from UI and interaction design all the way down to the platform

How Is UX for IoT Different?

Designing for IoT comes with a bunch of challenges that will be new to designers accustomed to pure digital services. How tricky these challenges prove will depend on:

  • The maturity of the technology you’re working with
  • The context of use or expectations your users have of the system
  • The complexity of your service (e.g., how many devices the user has to interact with)

The following sections summarize the key differences between UX for IoT and UX for digital services. Some of these are a direct result of the technology of embedded devices and networking. We’ll explain the technology issues in more detail in Chapters 2 and 3. But even if you are already familiar with embedded device and networking technology, you might not have considered the way it shapes the UX.

Functionality May Be Distributed Across Multiple Devices

IoT devices come in a wide variety of form factors with varying input and output capabilities. Some may have screens, such as heating controllers or washing machines (see Figure 1-3). Some may have other ways of communicating with us, such as flashing LEDs or sounds (see Figure 1-4).

Figure 1-3. The Honeywell evohome connected radiator valve has a basic LCD screen (image: Honeywell Environmental Controls)

Some may have no input or output capabilities at all and are unable to tell us directly what they are doing. Interactions may be handled by web or smartphone apps. Despite the differences in form factors, users need to feel as if they are using a coherent service rather than a set of disjointed UIs. It’s important to consider not just the usability of individual UIs but interusability: distributed user experience across multiple devices (see Figure 1-5). This is explained further in Chapter 9.

Figure 1-4. The GlowCaps connected pill bottle lid uses light and sound notifications to remind the user to take medication (image: GlowCaps)
Figure 1-5. The Nest Learning Thermostat can be controlled by the on-device UI, a smartphone app, or a web app (image: Nest)

The Focus of the UX Is Increasingly in the Service

When we talk about IoT, we tend to focus on the devices, particularly those with striking or novel forms. But the behavior of the device might be generated by a program that lives on another device on the network (i.e., a server). We call this the Internet (or “cloud”) service.

This means that the service around a connected device is often just as critical to the user experience as the device itself, if not more so. For example, smart travelcards such as London’s Oyster and Hong Kong’s Octopus are often thought of as the focus of the payment service. But the services can be used without a card at all via an NFC-enabled smartphone or bank card (Figure 1-6). The card is just an “avatar” for the service (to borrow a phrase from the UX expert Mike Kuniavsky).3 For more on service business models and product–service ecosystems, see Chapter 4.

Figure 1-6. Hong Kong’s Octopus payment service can be used with an NFC phone as well as a smart card (Octopus reader on KMB bus by Ka890, CC licence via Wikimedia Commons; Octopus app: Octopus Cards Ltd.)

Are We Ready for the Real World to Behave Like the Internet?

It’s frustrating when a web page is slow to download or a Skype call fails. But we accept that these irritations are just part of using the Internet. By contrast, real-world objects respond to us immediately and reliably.

When we interact with a physical device over the Internet, that interaction is subject to the same latency and reliability issues as any other Internet communication. So there’s the potential for delays in response and for our requests and commands to go missing altogether. This could make the real world start to feel very broken. Imagine if you turned your lights on and they took two minutes to respond, or failed to come on at all.
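The shift described above can be sketched in code. A hypothetical proxy for a remote lamp (the class and transport names are invented for illustration) has to treat “turn on” as a request that may be delayed or lost over the network, rather than as an instantaneous physical action:

```python
class LampProxy:
    """Local stand-in for a remote connected lamp (hypothetical API)."""
    def __init__(self, transport):
        self.transport = transport  # callable: command string -> ack or None

    def turn_on(self, retries=2):
        # Unlike flipping a physical switch, we can only *request* the
        # change, then wait for the device to confirm it actually happened.
        for _attempt in range(retries + 1):
            ack = self.transport("ON")
            if ack == "ACK:ON":
                return True   # device confirmed the new state
        return False          # command may be lost: never assume success

# A flaky transport that silently drops the first request.
calls = []
def flaky_transport(cmd):
    calls.append(cmd)
    return None if len(calls) == 1 else "ACK:" + cmd

lamp = LampProxy(flaky_transport)
print(lamp.turn_on())  # True: the retry got through
```

The point of the sketch is the last line of `turn_on`: a design that assumes success when no acknowledgment arrives is exactly what makes the real world “feel broken.”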

In theory, there could be other unexpected consequences of things adopting Internet-like behaviors. In the Warren Ellis story “The Lich House”4 a woman is unable to shoot an intruder in her home: her gun cannot contact the Internet for the authentication that would allow her to fire it. This might seem far-fetched, but we already have objects that require authentication, such as Zipcars (Figure 1-7).

Figure 1-7. When you book a Zipcar online, the service sends details of the reservation to the car; swiping a smart card authenticates you as the person who made the booking (image: Zipcar)

For more information on networking and its impact on UX, see Chapter 3.

How Devices Connect to the Network Affects the UX

When we design for desktops, mobiles, and tablets, we tend to assume that they will have constant connectivity. Well-designed mobile apps handle network outages gracefully, but tend to treat them as exceptions to normal functioning. We assume that the flow of interactions will be reasonably smooth, even across devices. If we make a change on one device (such as deleting an email), it will quickly propagate across any other devices we use with the same service.

That will not always happen in IoT systems. Many connected devices run on batteries, and need to conserve electricity. Maintaining network connections uses a lot of power, so they only connect intermittently. This means that parts of the system can be out of sync with one another, creating discontinuities in the user experience. For example, if your heating is set to 19° C, and you use the heating app on your phone to turn it up to 21° C, it will take a couple of minutes for your battery-powered heating controller to go online to check for new instructions. During this time, the phone says 21° C, and the controller says 19° C (Figure 1-8).

Figure 1-8. Schematic of a heating system in which the app and controller give different status information
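The heating example can be reduced to a toy model (the class names are invented): the phone app writes the new setpoint to the Internet service immediately, but the battery-powered controller only catches up when it next wakes and polls.

```python
class CloudService:
    """Holds the latest target temperature set from the phone app."""
    def __init__(self, setpoint):
        self.setpoint = setpoint

class BatteryController:
    """Battery-powered controller that only syncs when it polls."""
    def __init__(self, cloud):
        self.cloud = cloud
        self.displayed = cloud.setpoint  # last value it saw

    def poll(self):
        # Only at this moment does the controller catch up with the cloud.
        self.displayed = self.cloud.setpoint

cloud = CloudService(19)
controller = BatteryController(cloud)

cloud.setpoint = 21                  # user turns heating up on the phone app
assert controller.displayed == 19    # controller still shows the old value

controller.poll()                    # next scheduled wake-up, minutes later
assert controller.displayed == 21    # now the two UIs agree again
```

The window between the two assertions is the discontinuity the user may notice: two parts of one service showing different truths.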

These discontinuities won’t always be noticed: sometimes the delays will be very short, and sometimes users won’t be around to notice them—for example, when they are turning on a remote device. So the UX may feel synchronous, even if the service technically isn’t. But when we do notice, the UX may feel quite disjointed.

For more on the technical background see Chapters 2 and 3. For more on the design impact, see Chapter 9.

Multiple Devices Create More Complexity for the User to Understand

The configuration of devices and code that makes a system work is called the system model. In an ideal world, users should not have to care about this. We don’t need to understand the technical architecture of a conventional Internet service, like Amazon, in order to use it successfully. Instead, we form a conceptual model of what Amazon does and how it works that’s good enough to help us understand what to do. We know we can search or browse products, that we need to add them to a basket, set up or log into a user account, pay, and then we’ll get a delivery. We don’t need to understand the different machines involved in making this system function.

But as a consumer of an IoT service right now, you can’t always get away from some of this technical detail.

A typical IoT service is composed of:

  • One or more embedded devices (the things in IoT, described in Chapter 2)
  • An Internet service
  • Perhaps a gateway device (a separate device needed to connect some embedded devices to the Internet, described in Chapter 3)
  • One or more mobile or web apps for the user to interact with the service via a mobile, tablet, or PC

Compared to a conventional web service, there are more places where code can run. There are more parts of the system that can, at any point, be offline. Depending on what code is running on which device, some functionality may at any point be unavailable.

For example, imagine you have a connected lighting system in your home. It has controllable bulbs or fittings, perhaps a gateway that these connect to, an Internet service, and a smartphone app to control them all (see Figure 1-9). You have an automated rule set up to turn on some of your lights at dusk if there’s no one home. If your home Internet connection goes down, does that rule still work? If the rule runs on the Internet service or your smartphone, it won’t. If it runs on the gateway, it will. As a user, you want to know whether your security lights are running or not. You need to understand a little about the system model to understand which devices are responsible for which functionality, and how the system may fail.
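A sketch, with invented names, of why the rule’s location matters: during a broadband outage, the same dusk rule either fires or silently fails depending on whether it runs locally on the gateway or remotely on the cloud service.

```python
def dusk_rule(lights, occupied):
    """Turn on security lights at dusk when nobody is home."""
    if not occupied:
        for light in lights:
            light["on"] = True

def run_rule(rule, runs_on, internet_up, lights, occupied):
    # Where the rule lives decides whether it survives an outage:
    # a rule on the local gateway needs no Internet connection;
    # a rule hosted on the cloud service (or a phone app) does.
    if runs_on == "gateway" or internet_up:
        rule(lights, occupied)
        return True
    return False  # rule never fired; lights stay off

lights = [{"on": False}, {"on": False}]

# Home broadband is down. The same rule behaves differently
# depending on where it executes.
assert run_rule(dusk_rule, "cloud", internet_up=False,
                lights=lights, occupied=False) is False
assert lights[0]["on"] is False

assert run_rule(dusk_rule, "gateway", internet_up=False,
                lights=lights, occupied=False) is True
assert lights[0]["on"] is True
```

To predict the outcome, the user has to know (at least roughly) which box the rule runs on, which is exactly the system-model knowledge the text describes.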

It would be nice if we could guarantee no devices would ever lose connectivity, but that’s not realistic. And IoT is not yet a mature set of technologies in the way that ecommerce is, so failures are likely to be more frequent. System designers have to ensure that important functions (such as home security alarms) continue to work as well as possible when parts go offline, and to make it as easy as possible for users to understand what’s happening and recover from any problems.

For more on matching system models to users’ conceptual models, see Chapter 6.

Figure 1-9. The Philips Hue system consists of connected bulbs, a gateway, an Internet service, and a smartphone app (image: Philips)

More Varied Contexts of Use

The shift from desktop to mobile computing means that we now use computers in a wide variety of situations. Hence, mobile design requires a far greater emphasis on understanding the user’s needs in a particular context of use. IoT pushes this even further: computing power and networking is embedded in more and more of the objects and environments around us. For example, a connected security system can track not just whether the home is occupied, but who is inside, and potentially video record them. Hence, the social and physical contexts in which connected devices and services can be used are even more complex and varied.

Controlling Distributed Devices Is Similar to Programming

In 1982, the HCI researcher Ben Shneiderman defined the concept of direct manipulation. User interfaces based on direct manipulation “depend on visual representation of the objects and actions of interest, physical actions or pointing instead of complex syntax, and rapid incremental reversible operations whose effect on the object of interest is immediately visible. This strategy can lead to user interfaces that are comprehensible, predictable and controllable.”5 Ever since, this has been the prevailing trend in consumer UX design (see Figure 1-10). Direct manipulation is successful because interface actions are aligned with the user’s understanding of the task. They receive immediate feedback on the consequences of their actions, which can be undone.

Figure 1-10. An image by the designer Susan Kare, created in MacPaint 1.0: an early example of a popular direct manipulation interface (image: Wikipedia)

IoT creates the potential for interactions that are displaced in time and space: configuring things to happen in the future, or remotely. For example, you might set up a home automation rule to turn on a video camera and raise the alarm when the house is unoccupied and a motion sensor is disturbed. Or you might unlock your porch door from your work computer to allow a courier to drop off a parcel.

Both of these break the principles of direct manipulation. To control things that happen in the future, you must anticipate your future needs and abstract the desired behavior into a set of logical conditions and actions. As the HCI researcher Alan Blackwell points out, this is basically programming.6 It is a much harder cognitive task than a simple, direct interaction. That’s not necessarily a bad thing, but it may not be appropriate for all users or all situations. It impacts usability and accessibility.
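Blackwell’s point can be made concrete: before anything happens, a future-displaced behavior must be abstracted into a trigger, conditions, and actions. A minimal sketch (the event and action names are invented):

```python
# A security rule expressed as data: trigger, conditions, actions.
rule = {
    "trigger": "motion_detected",
    "conditions": [lambda state: not state["occupied"]],
    "actions": ["turn_on_camera", "raise_alarm"],
}

def handle_event(event, state, rule):
    """Fire the rule's actions when its trigger and conditions match."""
    if event != rule["trigger"]:
        return []
    if all(cond(state) for cond in rule["conditions"]):
        return rule["actions"]
    return []

# Nobody home + motion -> the rule fires.
assert handle_event("motion_detected", {"occupied": False}, rule) == \
    ["turn_on_camera", "raise_alarm"]
# Someone is home -> same event, no actions.
assert handle_event("motion_detected", {"occupied": True}, rule) == []
```

Even this trivial rule forces the user to reason about boolean conditions and anticipated states, which is the cognitive load of programming rather than of direct manipulation.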

Unlocking the door remotely is an easier action to comprehend. But we are distanced from the consequences of our actions, and this poses other challenges. Can we be sure the door was locked again once the parcel had been left? A good system should send a confirmation, but if our smartphone (or the lock) lost connectivity, we might not receive this.

For a deeper discussion of programming-like experiences, see Chapter 15.

Seemingly Simple Systems Can Rapidly Become Complex

A simple IoT service might serve only one or two devices (e.g., a couple of connected lights). You could control these with a very simple app. But as you add more devices, there are more ways for them to coordinate with one another. If you add a security system with motion sensors and a camera, you may wish to turn on one of your lights when the alarm goes off. So the light effectively belongs to two functions or services: security and lighting. Then add in a connected heating system that uses information from the security system to know when the house is empty. And assume that there are several people in the house with slightly different access privileges to each system. For example, some can change the heating schedule, while others can only adjust the current temperature. Some have admin rights to the security system, while others can only set and unset the alarm. What started out as a straightforward system has become a complex web of interrelationships.7
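The web of interrelationships described above can be sketched as data (the device, service, and user names are invented). Note how one device appears in two services, and how answering “who may do what” already needs real logic:

```python
# One household: a light shared by two services, and per-user,
# per-system access rights.
services = {
    "lighting": {"hall_light", "lounge_light"},
    "security": {"hall_light", "motion_sensor", "camera"},  # shares hall_light
}

permissions = {
    "alice": {"heating": {"edit_schedule", "adjust_temp"}, "security": {"admin"}},
    "bob":   {"heating": {"adjust_temp"},                  "security": {"arm", "disarm"}},
}

def can(user, system, action):
    """Check whether a user may perform an action on a given system."""
    grants = permissions.get(user, {}).get(system, set())
    return "admin" in grants or action in grants

# The hall light belongs to both lighting and security...
assert "hall_light" in services["lighting"]
assert "hall_light" in services["security"]
# ...and the same action is allowed for one user but not another.
assert can("alice", "security", "add_user") is True    # admin implies everything
assert can("bob", "security", "add_user") is False
assert can("bob", "heating", "edit_schedule") is False
```

Every new device, service, or user multiplies the entries in structures like these, which is what makes the system progressively harder to understand and manage.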

For a user, understanding how this system works will become more challenging as more devices and services are added. It will also become more time consuming to manage.

For more information, see Chapter 15.

The Problems of Many Different Technical Standards

The Internet is an amazing feat of open operating standards, but, before embedded devices were connected, there was no need for appliance manufacturers to share common standards. As we begin to connect these devices together, this lack of common technology standards is causing headaches. Just getting devices talking to one another is a big enough challenge, as there are many different network standards. Being able to get them to coordinate in sensible ways is vastly more complicated still.

The consumer experience right now is of a selection of mostly closed, manufacturer-specific ecosystems. Devices within the same manufacturer’s ecosystem, such as Withings (see Figure 1-11), will work together. But this is the only given. In the case of Withings, this means that devices share data with a common Internet service, which the user accesses via a smartphone app. Apple’s AirPlay is an example of a proprietary ecosystem in which devices talk directly to one another.

We’re starting to see manufacturers collaborating with other manufacturers too. So your Nest Protect smoke detector can tell your LIFX lightbulbs to flash red when smoke is detected. (This is done by connecting the two manufacturers’ Internet services rather than connecting the devices.)

There are also some emerging platforms that seek to aggregate devices from a number of manufacturers and enable them to interoperate. The connected home platform SmartThings supports a range of network types and devices from manufacturers such as Schlage and Kwikset (door locks); GE and Honeywell (lighting and power sockets); Sonos (home audio); and Philips Hue, Belkin, and Withings (connected home products); see Figure 1-12. But the platform has been specifically configured to work with each of these. You cannot yet buy any device and assume it will work well with SmartThings.

Figure 1-11. The Withings ecosystem of devices (images: Withings)
Figure 1-12. The SmartThings gateway and some compatible devices (image: SmartThings)

For the near future, the onus will be largely on the consumer to research which devices work with their existing devices before purchasing them. Options may be limited. In addition, aggregating different types of device across different types of network tends to result in a lowest common denominator set of basic features. The service that promises to unify all your connected devices may not support some of their more advanced or unique functions: you might be able to turn all the lights on and off but only dim some of them, for example. It will be a while before consumers can trust that things will work together with minimal hassle.

For more information, see Chapter 15.

Data Is at the Core of Many IoT Services

Networked, embedded devices allow us to capture data from the world that we didn’t have before, and use it to deliver better services to users. For example, drivers looking for parking spaces cause an estimated 30% of traffic congestion in US cities. Smart parking applications such as Streetline’s Parker use sensors in parking spaces to track where spaces are free, which drivers can find via a mobile app (see Figure 1-13). The software company Opower analyzes data from smart meters to suggest ways in which utility customers could save energy and money (see Figure 1-14).

Figure 1-13. Streetline’s Parker app (image: Streetline)
Figure 1-14. Sample Opower energy report (image: Opower)

Networked devices with onboard computation are also able to use data, and in some cases act on it autonomously. For example, a smart energy meter can easily detect when electrical activity is being used above baseload. This is a good indicator that someone is in the house and up and about. This data could be used by a heating system to adjust the temperature or schedule timing.
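The baseload inference can be sketched in a few lines. The thresholds and sample data here are invented for illustration; a real system would be far more careful about noise and edge cases:

```python
def probably_occupied(readings_watts, baseload_watts=200, margin=1.5):
    """Guess occupancy from smart-meter data: sustained draw well above
    the always-on baseload (fridge, router, ...) suggests someone is up.
    Thresholds are illustrative, not from any real product."""
    recent = readings_watts[-3:]  # look at the last few samples
    return all(w > baseload_watts * margin for w in recent)

def target_temperature(readings_watts):
    # A heating system could use the inference to pick a setpoint.
    return 21 if probably_occupied(readings_watts) else 16

night = [210, 190, 205, 195]        # hovering around baseload
morning = [220, 850, 1400, 1250]    # kettle and lights switched on

assert probably_occupied(night) is False
assert probably_occupied(morning) is True
assert target_temperature(morning) == 21
```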

To paraphrase another quote from Mike Kuniavsky, “information is now a design material.”8

For more information, see Chapter 13.

A Design Model for IoT

As shown in the preceding sections of this chapter, designing for IoT will confront you with some extra challenges and complexity that you wouldn’t encounter on a “conventional” (software only) web service. You’ll need to think about some different and perhaps new areas of design that all serve to shape the UX.

The two most visible and tangible forms of design for IoT are:

The UI/visual design

For example, the screen layout and look and feel of the web or mobile apps, or devices themselves (see Figure 1-15). (UIs don’t have to be visual; they can use audio, haptics, and other channels, as discussed in Chapter 8. But it’s rare for a service to have no screen-based UI at all.)

The industrial design of the physical hardware

The form factor, styling, and capabilities of the connected devices themselves (see Figure 1-16).

Figure 1-15. The Philips Hue UI allows users to change the color of light emitted by an LED bulb (image: Philips Communications)

UI and industrial design are important but not the whole picture. The UX is not just shaped by what the user can see or encounter directly. To create a valuable, appealing, usable, and coherent IoT service, we have to consider design on many different layers.

Figure 1-16. The Nest Learning Thermostat has a striking industrial design (image: Nest)

In 2000, Jesse James Garrett produced his “Elements of User Experience” diagram (and subsequent book) to explain how different design specialties fit together in web UX design.9 The uppermost layers (i.e., visual design, information, interface, and navigation design) are the most visible to the user, but depend on the structure provided by the lower layers (i.e., site objectives, content requirements, etc.), which are dealt with earlier in the project. Designing for connected products has a similar flow: work that is not directly apparent to the user determines aspects that they can directly experience, and critical decisions affecting the UX are made in the early stages of design.

If you’re more familiar with engineering models, you might think of this as being a little like a technology stack (as in the Internet network stack in Chapter 3), in which each layer of technology is dependent on the lower levels functioning well. It’s not an exact analogy, as the dependencies between different layers of design are much more fluid. But it’s a useful comparison to make the point that good design at the higher, more visible layers requires a clear framework at the lower layers.

There are many different facets of design involved in delivering a good UX for an IoT service, set out in Figure 1-17.

Figure 1-17. Facets of design in IoT—a good product requires integrated thinking across all of these

This isn’t a set of discrete activities required in your project plan. It says nothing of user research, for example. Nor is it a set of job roles you need on your project. You might, for example, need data analytics, and you’ll certainly need engineers.

Rather, these are aspects of the user experience that need to be considered. Some of them will evolve in tandem. For example, UI, interaction design, and interusability need to be thought about together. UX design at the platform layer will emerge as a need once you start adding multiple devices to a service.

A good overall product requires integrated thinking across all these layers. A stunning UI means nothing if your product concept makes no sense. A beautiful industrial design may sell products in the short term but can’t mask terrible service.

Depending on the type and complexity of your service, some layers will demand more of your time than others. IoT services aspire to extend over more devices and become more complex over time, so the parts you need less now may become more relevant to you in the future.

UI/Visual Design

UI/visual design refers to screen layout, visual styling, and look and feel on a device. This is the form that a device interface takes. The outputs of UI/visual design are typically high-fidelity screen mockups (Figure 1-18). Not all UIs are visual, of course: for a gestural or audio interface, the equivalent function might be defining the aesthetics of the gestures or voice.

Figure 1-18. Style tiles (visual concepts) and UI modules for a redesign of the IDA Institute’s website (images: Halei Liu and Hans Pelle Jart)

See Chapter 8 for more details on designing UIs for embedded devices.

Interaction Design

Interaction design is the design of device behaviors. Interaction designers shape the sequences of actions between the user and the device needed to achieve particular goals or activity. They also determine how to organize the user-facing functions of the device. For example, a heating controller might have several modes, such as schedule on/off or frost protection, and some hierarchical functions, such as schedule setting. The organization of these functions defines how easy or otherwise it may be for users to find their way around them.

Interaction design is closely aligned to UI design in the sense that the two are usually done in tandem and often by the same people. But interaction design is primarily concerned with behaviors and actions, whereas UI/visual design is concerned with layout and aesthetics. (Just to confuse matters, some people use UI design as a shorthand term to include both interaction design and visual design.) Typical outputs for interaction design might include user flows, low-medium fidelity interactive prototypes, and for a visual UI, screen wireframes (Figure 1-19).

Figure 1-19. Device and app interaction design concepts for Hackaball, a programmable ball for children (images: Map and Made by Many)

You sometimes hear the term “information architecture” used to describe organization schemes for functionality, but technically this refers to the equivalent activity for content-driven systems, such as content-based websites.

See Chapter 8 for more details.

Interusability

Interusability is a relatively new term. It refers to the additional considerations of designing interactions that span multiple devices. The goal is to make the overall experience feel like a coherent service, even when the devices involved may have quite different form factors and input/output capabilities.

Interusability isn’t a separate set of design activities. It’s an extra set of considerations to be addressed in tandem with interaction and UI design. The key differences to a single device UX design process would typically be:

  • Specifying which functionality belongs on each device
  • Creating design guidelines that span multiple device types
  • Designing cross-device user flows for key interactions
  • Designing multiple device UIs in parallel

See Chapter 9 for more details.

Industrial Design

Industrial design refers to the aesthetic and functional design of the physical hardware in the service: the choice of form, materials, and capabilities it may have (see Figure 1-20). Connected devices contain electronic circuitry and radio antennae, which impose particular requirements on the industrial design. Devices can also have input and output capabilities, which require collaboration between industrial designers and UI/interaction design/interusability.

See Chapter 7 for more details.

Figure 1-20. Hardware prototype and color/material/form explorations for Hackaball, a programmable ball for children (images: Map and Made by Many)

Service Design

A connected device is rarely a one-off purchase. It comes with the expectation of an ongoing service, at the very least the provision of the Internet service that keeps it running, and customer support. All of this forms part of the user’s overall experience with the product.

Service design is an emerging discipline that addresses this holistic view of user experience. It looks at the whole lifespan of a user’s experience with a service, provides a view of all the components of the user experience, and specifies how these function together as a coherent whole (see Figure 1-21).

In addition to device interactions, it might include:

  • Customer support interactions
  • Instructional guides
  • Marketing or sales materials
  • In-store experiences
  • Email communications and notifications
  • The UX of software updates and rolling out new functionality
Figure 1-21. Experience mapping and prototyping service interactions for obstetric services in Kenya, Uganda, and India (images: M4ID)

The service experience is critical to the success of an IoT product, but in this book, we have not addressed service design as a separate activity or chapter. We encourage you to take a holistic approach to designing the user experience that includes interactions not just with devices but also the wider service context. Service design methods may be extremely useful here, but you would not apply them differently on an IoT project than you would on any other project, so we have not discussed them separately.

See Chapters 4 and 6 for guidance on scoping the service experience. The suggested methods in Chapter 14 include guidance on designing the service experience as part of overall UX design.

Conceptual Model

The conceptual model is the understanding and expectations you want the user to have of the system. What components does it have, how does it work, and how can they interact with it? It’s the mental scaffolding that enables users to figure out how to interact with your service. Regardless of whether a conceptual model is designed or not, users will form one. If they get it wrong, they’ll struggle to use the system. IoT services are often inherently complex systems. You can create a clear conceptual model through careful system and interaction design and supporting documentation. You want users to feel in control from the start—they should feel confident that they will be able to use the system, even if they don’t understand all the details yet.

See Chapters 6 and 9 for more details.

Productization

Productization is the activity of defining a compelling product proposition. It addresses the audience, proposition, objectives, and overall functionality of service (and often its business model). Does your product solve a real problem for a real audience? Is it presented so that they understand that? Does it appeal to them? This isn’t always the domain of the UX designer on a project, but it’s the underpinnings of good UX. All the frontend design in the world won’t make a killer product unless it does something of value for people, in a way that appeals to them and makes sense.

See Chapter 4 for more details.

Platform Design

A platform is a software framework. It takes care of low-level details to help developers build applications more easily. At their most basic, IoT platforms make it easy to put data from connected devices onto the Internet. Slightly more advanced platforms may provide frameworks to enable different types of devices to interoperate.

A software platform aims to solve many technical issues, most of which have no direct impact on the UX. But some more advanced platform functionality is very much bound up with UX.

For example, a platform like Hue or Withings may provide standard ways to:

  • Discover new devices and applications
  • Add devices and applications onto the system
  • Manage devices and users
  • Manage how devices share data

These are basic building blocks for the UX. If they don’t work well for your users, your UI and interaction design will be full of awkward workarounds.

A more complex platform might also provide ways of organizing and coordinating multiple devices. For example, if a user adds a light to an existing home system, they might reasonably expect the system to know that the new light should be controlled along with their other lights, and perhaps offered as part of the security system, but that there's no need for it to talk to the toaster. That may be common sense to a human, but the system won't know it unless this kind of logic is encoded in the platform.
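To make this concrete, here is a minimal sketch of that kind of platform logic, written in Python with entirely hypothetical names (no real IoT platform exposes this API): a newly added device automatically joins any existing coordination group whose required capability it declares.

```python
# Hypothetical sketch: platform logic that decides which coordination
# groups a new device should join, based on its declared capabilities.
# All names and structures here are illustrative, not from a real platform.

class Device:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

class Group:
    """A coordination group, e.g. 'all lights' or 'the security system'."""
    def __init__(self, name, required_capability):
        self.name = name
        self.required_capability = required_capability
        self.devices = []

class HomePlatform:
    def __init__(self, groups):
        self.groups = groups
        self.devices = []

    def add_device(self, device):
        """Register a device and auto-join it to every matching group.

        Encoding this logic in the platform is what lets a new light be
        controlled with the other lights without the user wiring it up.
        Returns the names of the groups the device joined.
        """
        self.devices.append(device)
        joined = []
        for group in self.groups:
            if group.required_capability in device.capabilities:
                group.devices.append(device)
                joined.append(group.name)
        return joined

platform = HomePlatform([
    Group("lights", required_capability="dimmable-light"),
    Group("security", required_capability="presence-sensing"),
])

hall_light = Device("hall light", ["dimmable-light", "presence-sensing"])
toaster = Device("toaster", ["heating"])

print(platform.add_device(hall_light))  # joins both lights and security
print(platform.add_device(toaster))     # joins no groups at all
```

The point is not the code itself but where the logic lives: if the platform doesn't encode which devices belong together, every application built on top of it has to reinvent that "common sense" for itself.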

If you’re building a very simple system around a single device, you might not need to do much platform-level UX to start with. You will still need to consider how the user gets the device onto the network, but you don’t necessarily need to design a standardized approach that would work for other types of device as well. Once your system has multiple, interconnected devices, however, there will be design challenges that require platform logic to solve. Designers should get involved in shaping platforms to ensure they support good higher-level UX.

There is no commonly understood set of activities for this yet. Chapter 15 discusses what these challenges may look like and some of the approaches we might take to solving them.


UX for connected devices is not just about UI and interaction design. It also requires designers to think about interusability, industrial design, service design, conceptual models, productization, and platform design.

In summary, it differs from “conventional” UX in the following ways:

  • Embedded devices often save power by connecting only intermittently, which means parts of the system can be out of sync, creating discontinuities in the UX.

  • Latency on the Internet is out of your control (and reliability is not 100%), which means that although we expect physical things to respond immediately and reliably, this might not happen.

  • Code can run in many more places, which means users have to engage with the system model to predict how it will work if parts are offline.

  • Devices are distributed in the real world, which means the social and physical context of use is complex and varied.

  • Functionality can be distributed across multiple UIs, which means designers need to consider not just usability but interusability.

  • Much of the information processing happens in the Internet service, which means the service experience is often as important as, or more important than, the single-device UX.

  • Remote control and automation are programming-like activities, which means IoT breaks direct manipulation, the basis of most successful consumer UXes.

  • There are many differing technical standards, which means getting things to work together is hard.

  • Complex services can have many users, many UIs, many devices, and many rules and applications, which means understanding and managing how they all interrelate can be extremely difficult; users will disengage if administration becomes too onerous.

  • IoT enables us to capture and act on data we didn’t have before, which means designers need to understand how to use information as a design material.

1. “Usability Testing of Smarter Heating Controls,” Amberlight Partners for the Department of Energy and Climate Change.

2. “Interaction Design Beyond the Interface.”

3. Mike Kuniavsky, Smart Things: Ubiquitous Computing User Experience Design (Burlington, MA: Morgan Kaufmann, 2010).

4. “Lich House.”

5. Ben Shneiderman, “Direct Manipulation for Comprehensible, Predictable and Controllable User Interfaces,” Proceedings of the ACM International Workshop on Intelligent User Interfaces 1997: 33–39.

6. In conversation. See Alan F. Blackwell, “What Is Programming?” Proceedings of PPIG 2002: 204–218.

7. Access control is a feature of many home automation systems. It has not previously been seen as a need for most household appliances or systems, aside from home security. Some people might argue that it still mostly isn’t necessary, and that it’s a result of an overly technical mindset that treats homes like computer operating systems (in which everyone is an identified user, with an account and access privileges). In Chapters 5 and 6, we discuss issues such as this in the context of appropriate design for user needs and social context. But for systems that do go down this route, it does add significant complexity.

8. Kuniavsky, Smart Things, 43.

9. Jesse James Garrett, The Elements of User Experience: User-Centered Design for the Web and Beyond, 2nd Edition (Berkeley, CA: New Riders, 2010).
