Ingest Node

Traditionally, Logstash is used to preprocess your data before indexing it into Elasticsearch. With Logstash, you define pipelines that extract, transform, and index your data into Elasticsearch.

Elasticsearch 5.0 introduced the ingest node. Using the ingest node, you can define pipelines that modify documents before they are indexed. A pipeline is a series of processors, each processor working on one or more fields in the document. The most commonly used Logstash filters are available as processors: for example, a grok processor to extract data from an Apache log line into document fields, processors to extract fields from JSON, change the date format, calculate the geo-distance from a location, and so on. The possibilities are endless.
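As a sketch of what such a pipeline looks like, the following request defines a pipeline (the name apache_log is just an illustration) with a grok processor that parses an Apache access-log line using the built-in COMBINEDAPACHELOG pattern, followed by a date processor that parses the extracted timestamp:

```
PUT _ingest/pipeline/apache_log
{
  "description": "Parse Apache access logs",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{COMBINEDAPACHELOG}"]
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["dd/MMM/yyyy:HH:mm:ss Z"]
      }
    }
  ]
}
```

A document is then run through the pipeline at index time by passing its name in the request, for example `PUT my_index/my_type/1?pipeline=apache_log`. You can also test a pipeline without indexing anything via the `_ingest/pipeline/_simulate` API.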
