Transfer data efficiently between RDBMS and Hadoop ecosystems using Apache Sqoop with Packt’s new eBook
Packt is pleased to announce the release of Instant Apache Sqoop, a practical guide that will help the reader harness the full power of Apache Sqoop to transfer data between RDBMS and the Hadoop ecosystem. This fast and focused guide is packed with practical examples and challenges to test and improve readers' knowledge of Apache Sqoop. This 58-page Instant eBook is available in PDF, ePub, and Kindle formats for $15.99.
About the author:
Ankit Jain is an experienced software developer with over two years of expertise in implementing, designing, and managing big data solutions for industry leaders. His core skills include Hadoop, HBase, Hive, Sqoop, Flume, Elasticsearch, Machine Learning, Kafka, Storm, Java, and J2EE. He is currently employed by Impetus Infotech Pvt Ltd.
Instant Apache Sqoop gets the reader started with the import/export process required in data transfer and shows them examples of each process. It then introduces the reader to HBase and Hive table structures, and shows them how to populate HBase and Hive tables. Step-by-step instructions teach the reader about important topics such as moving data between RDBMS and the Hadoop ecosystem, third-party Sqoop connectors, and Sqoop’s incremental import feature, to name but a few.
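As a taste of the incremental import feature mentioned above, a typical invocation might look like the following sketch. The connection string, credentials, table name, and check column here are illustrative placeholders, not examples taken from the book:

```shell
# Sketch: import only rows added since the last run from a
# hypothetical MySQL table "orders" with an auto-increment key "id".
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username dbuser -P \
  --table orders \
  --incremental append \
  --check-column id \
  --last-value 10000 \
  --target-dir /user/hadoop/orders
```

On completion, Sqoop reports the new last value of the check column, which can be supplied as `--last-value` on the next run (or managed automatically with a saved Sqoop job).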
This is a great resource for developers with some experience in Hadoop, HBase, and Hive who are looking to harness the power of Apache Sqoop. For more details, please visit the book page at http://www.packtpub.com/apache-sqoop-transfer-data-between-rdbms-hadoop/book
Instant Apache Sqoop
Learn how to transfer data between RDBMS and Hadoop using Sqoop
For more information, please visit: http://www.packtpub.com/apache-sqoop-transfer-data-between-rdbms-hadoop/book