Apache Flume: Distributed Log Collection for Hadoop
Stream data to Hadoop using Apache Flume with this book and ebook
In this article by Steve Hoffman, author of Apache Flume: Distributed Log Collection for Hadoop, we'll put Avro to use in communication between Flume agents.
A typical tiered configuration pairs Avro Sinks on the sending agents with an Avro Source on a collector agent.
To use the Avro Source, you specify the type property with a value of avro. You need to provide a bind address and port number to listen on:
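The book shows the collector's configuration here. A minimal sketch of such a collector agent might look like the following; the agent name, component names, and HDFS path are illustrative assumptions, not taken from the original:

```properties
# Hypothetical collector agent: Avro Source -> memory channel -> HDFS sink
collector.sources = av1
collector.channels = ch1
collector.sinks = k1

# Avro Source listening on all interfaces, port 42424
collector.sources.av1.type = avro
collector.sources.av1.bind = 0.0.0.0
collector.sources.av1.port = 42424
collector.sources.av1.channels = ch1

# Memory channel (used here for brevity; not durable across restarts)
collector.channels.ch1.type = memory

# HDFS sink writing received events out to Hadoop
collector.sinks.k1.type = hdfs
collector.sinks.k1.hdfs.path = /flume/events/%Y/%m/%d
collector.sinks.k1.channel = ch1
```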
Here we have configured the agent on the right to listen on port 42424, use a memory channel, and write to HDFS. I've used the memory channel to keep this example configuration brief. Also, note that I've given this agent a different name, collector, just to avoid confusion.
The agents on the left—feeding the collector tier—might have a configuration similar to this. I have left the sources off this configuration for brevity:
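The client-side configuration from the book is not reproduced here; a minimal sketch, assuming an agent named client and a memory channel, might look like this (sources omitted, as in the original):

```properties
# Hypothetical client agent: memory channel -> Avro Sink pointed at the collector
client.channels = ch1
client.sinks = av1

client.channels.ch1.type = memory

# Avro Sink forwarding events to the collector tier's Avro Source
client.sinks.av1.type = avro
client.sinks.av1.hostname = collector.example.com
client.sinks.av1.port = 42424
client.sinks.av1.channel = ch1
```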
The hostname value, collector.example.com, has nothing to do with the agent name on that machine; it is the hostname (or you can use an IP) of the target machine running the receiving Avro Source. This configuration, named client, would be applied to both agents on the left, assuming both had similar source configurations.
Since I don't like single points of failure, I would configure two collector agents with the preceding configuration and instead set each client agent to round-robin between the two using a sink group. Again, I've left off the sources for brevity:
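A sketch of that sink-group configuration follows, using Flume's load-balancing sink processor with a round-robin selector; the two collector hostnames and component names are assumptions for illustration:

```properties
# Hypothetical client agent with two Avro Sinks in a load-balancing sink group
client.channels = ch1
client.sinks = av1 av2
client.sinkgroups = g1

client.channels.ch1.type = memory

# First collector
client.sinks.av1.type = avro
client.sinks.av1.hostname = collector1.example.com
client.sinks.av1.port = 42424
client.sinks.av1.channel = ch1

# Second collector
client.sinks.av2.type = avro
client.sinks.av2.hostname = collector2.example.com
client.sinks.av2.port = 42424
client.sinks.av2.channel = ch1

# Round-robin between the two collectors; if one is down,
# the load_balance processor tries the other
client.sinkgroups.g1.sinks = av1 av2
client.sinkgroups.g1.processor.type = load_balance
client.sinkgroups.g1.processor.selector = round_robin
```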
In this article, we covered tiering data flows using the Avro Source and Sink. More information on this topic can be found in the book Apache Flume: Distributed Log Collection for Hadoop.
About the Author:
Steve Hoffman has 30 years of software development experience and holds a B.S. in computer engineering from the University of Illinois Urbana-Champaign and an M.S. in computer science from DePaul University. He is currently a Principal Engineer at Orbitz Worldwide.
More information on Steve can be found at http://bit.ly/bacoboy or on Twitter @bacoboy.
This is Steve's first book.