Data Lake for Enterprises
by Vivek Mishra, Tomcy John, Pankaj Misra
Packt Publishing, May 2017
596 pages
ISBN: 9781787281349

Data acquisition via Flume into Kafka channel

Now let's use Flume to acquire the address data from the database and the contacts data from the spool file (just as we did in the previous chapter). To achieve this, we will define a single Flume configuration file, ${FLUME_HOME}/conf/customer-address-contact-conf.properties, with a dedicated Kafka channel for each source to convert the data from both sources into events.

The complete Flume configuration is shown here:

agent.sources = sql-source spool-source
agent.sources.spool-source.type=spooldir
agent.sources.spool-source.spoolDir=<spool-file-data-dir>
agent.sources.spool-source.inputCharset=ASCII
agent.sources.sql-source.type=org.keedio.flume.source.SQLSource
agent.sources.sql-source.hibernate.connection.url=jdbc:postgresql://localhost/sourcedb?schema=public
...
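
The preview cuts the configuration off at this point. As a rough sketch of how the dedicated Kafka channels and the source-to-channel bindings typically look (the channel names, topic names, and broker address below are illustrative assumptions, not taken from the book):

agent.channels = address-channel contact-channel

# Kafka-backed channels, one per source (names and topics assumed for illustration)
agent.channels.address-channel.type = org.apache.flume.channel.kafka.KafkaChannel
agent.channels.address-channel.kafka.bootstrap.servers = localhost:9092
agent.channels.address-channel.kafka.topic = customer-address
agent.channels.contact-channel.type = org.apache.flume.channel.kafka.KafkaChannel
agent.channels.contact-channel.kafka.bootstrap.servers = localhost:9092
agent.channels.contact-channel.kafka.topic = customer-contact

# Bind each source to its dedicated Kafka channel
agent.sources.sql-source.channels = address-channel
agent.sources.spool-source.channels = contact-channel

Once the configuration file is complete, the agent can be started with the standard flume-ng launcher; the --name argument must match the agent prefix used in the properties file:

${FLUME_HOME}/bin/flume-ng agent \
  --conf ${FLUME_HOME}/conf \
  --conf-file ${FLUME_HOME}/conf/customer-address-contact-conf.properties \
  --name agent \
  -Dflume.root.logger=INFO,console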