Chapter 14. Distillation

Vittorio Castelli and Radu Florian

14.1. Introduction

Distillation is an emerging nontraditional branch of natural language processing (NLP), which lies between the now classical fields of information retrieval (IR) and question answering (QA).

Unlike IR, distillation aims to provide answers supported by one or more passages in the searched collection, rather than to retrieve documents or passages that are relevant to a user’s query. Distillation answers may be excerpts from those passages or may be synthesized. Distillation queries can be very complex and can require equally complex answers. For example, consider

Describe the reactions of <COUNTRY> to <EVENT>,

where <EVENT> could be specified via one or more natural ...
