Chapter 14. Distillation
Vittorio Castelli and Radu Florian
Distillation is an emerging nontraditional branch of natural language processing (NLP), which lies between the now classical fields of information retrieval (IR) and question answering (QA).
Unlike IR, the aim of distillation is to provide answers supported by one or more passages in the searched collection, rather than to retrieve documents or passages relevant to a user’s query. Distillation answers can be excerpts from the supporting passages or can be synthesized. Distillation queries can be highly complex and can require correspondingly complex answers. For example, consider
Describe the reactions of <COUNTRY> to <EVENT>,
where <EVENT> could be specified via one or more natural ...