Hadoop HDFS for analysis. In this way, a company can track the product areas its users search most, for example, mobiles in the electronics category, or sports shoes and gym equipment in the sports category.
Flume is used to move log data generated by application servers into HDFS at high speed.
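As a concrete illustration, the following is a minimal sketch of a Flume agent configuration that tails an application server's log file and writes each entry into HDFS. The agent name (agent1), the log file path, and the HDFS URL are assumptions chosen for the example, not values from the text.

# Name the components of this agent (the JVM process that runs Flume)
agent1.sources  = appLogs
agent1.channels = memCh
agent1.sinks    = hdfsSink

# Source: tail the application server's log file (path is an assumption)
agent1.sources.appLogs.type     = exec
agent1.sources.appLogs.command  = tail -F /var/log/appserver/access.log
agent1.sources.appLogs.channels = memCh

# Channel: buffer events in memory between the source and the sink
agent1.channels.memCh.type     = memory
agent1.channels.memCh.capacity = 10000

# Sink: write events into date-partitioned HDFS directories for analysis
agent1.sinks.hdfsSink.type                  = hdfs
agent1.sinks.hdfsSink.hdfs.path             = hdfs://namenode:8020/flume/applogs/%Y-%m-%d
agent1.sinks.hdfsSink.hdfs.fileType         = DataStream
agent1.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfsSink.channel               = memCh

Started with flume-ng agent --conf-file <config file> --name agent1, this agent continuously ships each new log line into HDFS, where the terms used below (event, source, channel, sink, agent) each correspond to one part of the configuration.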
7.4.4 Components of Flume
• Event: An event is a single log entry or unit of data that Flume transports.
• Source: The source is the component through which data enters a Flume workflow.
• Sink: The sink is responsible for delivering data to the desired destination.
• Channel: The channel is a conduit that buffers events between the source and the sink.
• Agent: An agent is any JVM process that runs Flume.
• Client: The client operates at the point where events originate and delivers them to a Flume agent's source (see the Java sketch after this list).
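To make the client and event terms concrete, here is a minimal sketch using Flume's Java client SDK. The host localhost and port 41414 are assumptions; they must match an Avro source configured on the receiving agent.

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;
import java.nio.charset.StandardCharsets;

public class FlumeClientExample {
    public static void main(String[] args) throws EventDeliveryException {
        // Connect to a Flume agent whose Avro source listens on localhost:41414 (assumed)
        RpcClient client = RpcClientFactory.getDefaultInstance("localhost", 41414);
        try {
            // Build a single event (one log entry) and hand it to the agent's source
            Event event = EventBuilder.withBody("sample log line", StandardCharsets.UTF_8);
            client.append(event);
        } finally {
            client.close();
        }
    }
}

On the agent side, the matching source would be declared with type avro and bound to the same port, after which the event travels through the channel to the sink exactly as in the configuration sketch above.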