Language understanding remains one of AI’s grand challenges

The O’Reilly Data Show Podcast: David Ferrucci on the evolution of AI systems for language understanding.

By Ben Lorica
May 11, 2017


In this episode of the Data Show, I spoke with David Ferrucci, founder of Elemental Cognition and senior technologist at Bridgewater Associates. Ferrucci served as principal investigator of IBM’s DeepQA project and led the Watson team that became champion of the Jeopardy! quiz show. Elemental Cognition (EC) is a research group focused on building an AI system that will be equipped with state-of-the-art natural language understanding technologies. Ferrucci envisions that EC will ship with foundational knowledge in many subject areas, but will be able to very quickly acquire knowledge in other (specialized) domains with the help of “human mentors.”

Because Ferrucci has built and deployed several prominent AI systems over the years, I also wanted to get his perspective on the evolution of AI technologies and on how enterprises can take advantage of all the exciting recent developments.


Here are some highlights from our conversation:

Watson

Language is a much harder problem. It’s very ambiguous, very uncertain—exactly how to determine whether two words or two phrases might mean the same thing under different contexts is a hard problem. There were a few key things that happened between Deep Blue and Watson. One is huge amounts of information became readily accessible. Giant corpora of information from the web, for example, Wikipedia, became available. Secondly, machines actually got a lot faster, so you’re able to process much more content.

… I could also process tons of language. I can look through these patterns much more efficiently because machines are a lot faster, and I can generalize those patterns using machine learning techniques. So, machines got a lot faster, huge volumes of data became available, and machine learning techniques allowed me to discover patterns in that language data more rapidly and more effectively than ever before. All these things came together so we were able to do something like Watson.

Understanding language

Elemental Cognition is doing something very different—Watson did not actually try to build an understanding. Watson analyzed how words occur together in questions and passages, and it came up with an approximation: this phrase might mean that phrase, and if I see this phrase as part of the question and this phrase as part of a potential answer passage, since they might mean the same thing, then such connections might help formulate an answer. There was no actual understanding.

Elemental Cognition is actually trying to get at the logical model, the mechanistic model, that causes the words as opposed to the patterns in the words. When I say something, there’s a model in my head about how the world works that causes me to say what I’m about to say. What caused that was an understanding. In this specific conversation we’re having right now, I’m using certain words because of my understanding of AI, my understanding of deep learning, of reasoning mechanisms, and so forth.

That’s very different than if I just looked at the surface of the words and said: ‘I’ve analyzed all of Dave Ferrucci’s prior speeches, and he’s about to say these words.’ That’s how chatbots work. There’s no understanding. They look at all of my prior conversations.
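As a toy illustration of the surface-level matching Ferrucci is contrasting with real understanding, the short Python sketch below scores candidate answer passages purely by word overlap with the question. This is not Watson's DeepQA pipeline (which combined many scorers with machine-learned ranking); the overlap_score function and the example question are hypothetical, included only to show how far lexical overlap can get without any model of meaning.

```python
# Toy illustration (not IBM DeepQA): score candidate answer passages by
# surface-level word overlap with the question, the kind of
# "this phrase might mean that phrase" approximation Ferrucci describes.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "on", "is", "was", "this", "that", "to", "and"}

def tokens(text: str) -> Counter:
    """Lowercase word counts with common stopwords removed."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS)

def overlap_score(question: str, passage: str) -> float:
    """Fraction of question terms that also appear in the passage."""
    q, p = tokens(question), tokens(passage)
    if not q:
        return 0.0
    shared = sum(min(q[w], p[w]) for w in q)
    return shared / sum(q.values())

question = "Which U.S. president delivered the Gettysburg Address?"
passages = [
    "Abraham Lincoln delivered the Gettysburg Address in November 1863.",
    "The address of the Gettysburg visitor center is on Baltimore Pike.",
]

# The passage sharing the most surface terms with the question wins,
# with no model of what an "address" or a "president" actually is.
best = max(passages, key=lambda p: overlap_score(question, p))
print(best)
```

The sketch picks the right passage here only because its words happen to co-occur with the question's; it has no representation of the world that caused those words, which is exactly the gap Elemental Cognition is trying to close.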

… Elemental Cognition’s model for learning is different. The model for learning is more the way you might teach a human. For example, you might sit there and say: ‘We’re going to team you up with an expert, and you’re going to read the material and start building a model and asking questions until you, too, can answer the questions that customers pose to you.’ So, it’s very different—you train it much more the way you would train a human. It builds and compounds understanding through reading and dialog.

