Chapter 14. Distillation

Vittorio Castelli and Radu Florian

14.1. Introduction

Distillation is an emerging branch of natural language processing (NLP) that lies between the now-classical fields of information retrieval (IR) and question answering (QA).

Unlike IR, distillation aims to provide answers supported by one or more passages in the searched collection, rather than to retrieve documents or passages that are merely relevant to a user’s query. Distillation answers can be excerpts from the supporting passages or can be synthesized. Distillation queries can be quite complex and can require correspondingly complex answers. For example, consider

Describe the reactions of <COUNTRY> to <EVENT>,

where <EVENT> could be specified via one or more natural language descriptions.
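To make the template concrete, the following is a minimal sketch, in Python, of how such a slotted query might be represented and instantiated. The class name, slot values, and representation are all illustrative assumptions; the chapter does not prescribe any particular data structure.

from dataclasses import dataclass, field

@dataclass
class DistillationQuery:
    """A templated distillation query with open slots (hypothetical representation)."""
    template: str                      # e.g., "Describe the reactions of <COUNTRY> to <EVENT>"
    slots: dict = field(default_factory=dict)

    def instantiate(self) -> str:
        """Fill each <SLOT> placeholder with its supplied value."""
        query = self.template
        for name, value in self.slots.items():
            query = query.replace(f"<{name}>", value)
        return query

# Usage with the template from the text; the slot values are invented for illustration.
q = DistillationQuery(
    template="Describe the reactions of <COUNTRY> to <EVENT>",
    slots={"COUNTRY": "France", "EVENT": "the treaty signing"},
)
print(q.instantiate())
# -> Describe the reactions of France to the treaty signing

Note that a single template can generate a whole family of distillation queries, one per slot assignment, which is one reason distillation queries are harder to serve than keyword IR queries.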
