May 2017
As discussed in the earlier chapter, this class is used to convert a Kafka event into a Tuple2 so that it can be written to HDFS. Once the event is converted into a Tuple2, it can also be written to Elasticsearch (ES). Since we want both the HDFS and ES sinks to be part of the same transaction, this conversion helps.
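The conversion step above can be sketched as follows. This is a minimal, self-contained illustration, not the book's exact class: `Tuple2` here is a stand-in for the pair type used by the pipeline (for example, Flink's `Tuple2`), and the class and method names are assumptions.

```java
import java.nio.charset.StandardCharsets;

public class EventToTuple {

    // A minimal pair type so the sketch is self-contained; a real pipeline
    // would use the framework's own Tuple2.
    public static final class Tuple2<A, B> {
        public final A f0;
        public final B f1;
        public Tuple2(A f0, B f1) { this.f0 = f0; this.f1 = f1; }
    }

    // Convert a raw Kafka record (key bytes, value bytes) into a Tuple2 of
    // strings. The same tuple can then be handed to both the HDFS sink and
    // the Elasticsearch sink, keeping the two writes in one transaction.
    public static Tuple2<String, String> convert(byte[] key, byte[] value) {
        String k = key == null ? "" : new String(key, StandardCharsets.UTF_8);
        String v = value == null ? "" : new String(value, StandardCharsets.UTF_8);
        return new Tuple2<>(k, v);
    }

    public static void main(String[] args) {
        Tuple2<String, String> t = convert("id-1".getBytes(StandardCharsets.UTF_8),
                                           "payload".getBytes(StandardCharsets.UTF_8));
        System.out.println(t.f0 + "=" + t.f1); // prints id-1=payload
    }
}
```

Because the tuple is produced once and shared by both sinks, a failure in either write can roll back the whole transaction rather than leaving HDFS and ES out of sync.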
This deserialization differs slightly between address and contacts, since the two are streamed from different sources. As the Flume configuration shows, address is sourced from the database via sql-source, which prepends a timestamp and produces comma-separated elements, while contacts is sourced from a spool file, which adds a 2-byte character before every Flume event and contains ...
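The two deserialization paths can be sketched like this. The field layout, the exact prefix width handling, and the method names are illustrative assumptions based on the description above, not the book's code.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class RecordDeserializer {

    // Address events come from the Flume sql-source: the source prepends a
    // timestamp, and the record body is a run of comma-separated elements.
    public static String[] parseAddress(String event) {
        return event.split(",");
    }

    // Contact events come from a spool-file source that adds a 2-byte
    // character before every Flume event body; strip it before decoding.
    public static String parseContact(byte[] body) {
        byte[] trimmed = Arrays.copyOfRange(body, 2, body.length);
        return new String(trimmed, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String[] address = parseAddress("2017-05-01 10:00:00,42,Main St");
        System.out.println(address.length);        // prints 3
        byte[] contact = {0, 0, 'h', 'i'};
        System.out.println(parseContact(contact)); // prints hi
    }
}
```

Keeping the two parsers separate makes it easy to route each Flume source to the matching deserializer while still emitting the same tuple shape downstream.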