The importance of emotion in AI systems

The O’Reilly Data Show Podcast: Rana el Kaliouby on deep learning, emotion detection, and user engagement in an attention economy.

By Ben Lorica
September 8, 2016
Nature-kaleidoscope (source: Reinhard Klar on Flickr)

While I was in Beijing for Strata + Hadoop World, several people reminded me of the chatbot Xiaoice—one of the most popular accounts on the Chinese social media site Weibo. Developed by Microsoft researchers, Xiaoice comes with a personality and is able to engage users in extended conversations on Weibo. These types of capabilities highlight that in an attention economy, systems that are able to forge an emotional connection will garner more loyalty and engagement from users.

In this episode of the O’Reilly Data Show, I sat down with Rana el Kaliouby, co-founder and CEO of Affectiva and one of the leading experts in emotion-sensing systems. We talked about the impact of deep learning and computer vision, Affectiva’s large facial expression database, and privacy and ethics in an era of multimodal systems.


Here are some highlights from our conversation:

Emotion artificial intelligence

We call it ‘emotion artificial intelligence.’ If you think of humans as having multiple intelligences, one is your IQ, and one that is equally important, if not more so, is your EQ. We’re building the digital equivalent of human emotional intelligence, or human EQ.

Let’s start with why it’s important for humans, because I do think it’s analogous. Our emotions influence every aspect of our lives, from how we connect with each other, to how we make decisions, to our health and well-being. Your emotional state can affect everything from a very simple decision, like what you’re having for breakfast, to very big decisions, like what house you’re going to buy or who you’re going to marry. People who have high EQ, who are able to translate their emotion reading and sensing skills into their day-to-day behavior, tend to be more likable as human beings. They tend to be more successful in their professional lives and they actually tend to be healthier. They live longer and happier lives in general.

As more and more of our world migrates online and becomes digital, and as we become more intimate with our devices, we’re spending a lot of time interacting with these technologies; a big piece of it is whether they have a level of emotional intelligence. If we expect our devices to persuade us to change our behavior, like exercise more or lead a healthier, more productive lifestyle, then these devices and these digital experiences and apps really need to know how to persuade and motivate. That all boils down to the technologies utilizing more emotional intelligence.

User engagement

We see that, whichever company we engage with, their ultimate goal is to build an emotional connection with their consumers. They want that because it drives usage, loyalty, word of mouth, and trust. They want to do that with their products, their advertising, and their user experience, so that when you check into a hotel, for example, they want to be able to quantify how they engaged you—was it in a positive way, and did they build that deep connection?

As consumers now, we have a lot of options, and I think we’re way more in control of the brands and products we use. Part of how we choose is the functionality, but a big part is also the brand and our connection with the brand. We see that across the board, and being able to not only accurately measure and quantify it, but also elicit it, is key, and our technology plays on both sides of that.

Emotion detection research

The analogy I use is that our emotion AI system is like a toddler right now, in terms of its capabilities. It can read a bunch of emotional states, and it can read expressions pretty accurately, for example. But the area where we’re spending a lot of our research effort—and I see the world moving in that direction—is moving from the simpler emotions to some of the more complex emotions.

What do you look like when you’re worried, inspired, or jealous? These are some of the more complex emotional states, and I don’t think anybody’s built a robust model of that. We’re looking into health-related states—so, longitudinal mood tracking or pain. What does pain look like on the face? What does depression look like? Then I’d say the next frontier, if you like, is multimodal. Combining physiological tracking with facial coding, emotion analytics, and maybe even voice, combining the tone of your voice with your facial expressions.

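El Kaliouby doesn’t spell out how the modalities would be combined, but one common approach to this kind of multimodal sensing is late fusion: run a separate model on each signal (face, voice) and merge their per-emotion scores. The sketch below is a minimal, hypothetical illustration of that idea; the emotion labels, weights, and function names are assumptions for illustration, not Affectiva’s actual system.

```python
import numpy as np

# Hypothetical emotion probabilities from two independent models:
# one trained on facial expressions, one on vocal tone. Both score
# the same set of emotion labels.
EMOTIONS = ["joy", "anger", "sadness", "surprise", "neutral"]

def late_fusion(face_probs: np.ndarray, voice_probs: np.ndarray,
                face_weight: float = 0.6) -> np.ndarray:
    """Weighted average of the two modalities' probability vectors."""
    fused = face_weight * face_probs + (1.0 - face_weight) * voice_probs
    return fused / fused.sum()  # renormalize to a valid distribution

# Example: the face model is fairly sure the person is joyful,
# while the voice model hears mostly neutral tone.
face = np.array([0.70, 0.05, 0.05, 0.10, 0.10])
voice = np.array([0.30, 0.05, 0.10, 0.05, 0.50])

fused = late_fusion(face, voice)
print(EMOTIONS[int(np.argmax(fused))], fused.round(3))
```

In practice, the weighting between modalities (or a learned fusion layer in place of a fixed weight) is itself a research question, which is part of why el Kaliouby calls multimodal sensing the next frontier.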

Post topics: AI & ML, Data, O'Reilly Data Show Podcast
Post tags: Podcast
Share:

Get the O’Reilly Radar Trends to Watch newsletter