Apache Flume Tutorial
Introduction -
Apache Flume is an open-source tool for collecting large amounts of streaming data, such as logs and events, and reliably transporting it into a centralized store like HDFS or HBase.
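To give a concrete feel for how Flume is set up, here is a minimal sketch of an agent configuration in Flume's properties-file format (the agent and component names a1, r1, c1 and k1 are arbitrary placeholders). It wires a netcat source to a logger sink through an in-memory channel; a production setup would typically replace the logger sink with an HDFS or HBase sink.

    # example.conf - defines one agent (a1) with one source, one channel, and one sink
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # Source: listen for newline-terminated events on a local TCP port
    a1.sources.r1.type = netcat
    a1.sources.r1.bind = localhost
    a1.sources.r1.port = 44444

    # Channel: buffer events in memory between the source and the sink
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000
    a1.channels.c1.transactionCapacity = 100

    # Sink: write events to the agent's log (swap in an hdfs sink for real ingestion)
    a1.sinks.k1.type = logger

    # Bind the source and sink to the channel
    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1

An agent defined this way can be started with the flume-ng command, for example: bin/flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console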
Audience & Objective -
This tutorial covers all the topics with in-depth information that anyone can follow. It is intended for Hadoop developers who have at least basic Hadoop and Java knowledge, though any reader with a basic knowledge of organizational controlling activities can also benefit.
Prerequisites -
Readers who already have Hadoop and Java knowledge will find the concepts easier to grasp. Readers without that background may need to go through a topic more than once to understand it clearly.