
Instant Apache Sqoop

Ankit Jain

Transfer data efficiently between RDBMS and the Hadoop ecosystem using the robust Apache Sqoop
RRP $14.99


Book Details

ISBN 13: 9781782165767
Paperback: 58 pages

About This Book

  • Learn something new in an Instant! A short, fast, focused guide delivering immediate results
  • Learn how to transfer data between RDBMS and Hadoop using Sqoop
  • Add a third-party connector into Sqoop
  • Export data from Hadoop and Hive to RDBMS
  • Get an overview of the available third-party Sqoop connectors

Who This Book Is For

This book is great for developers looking to get a good grounding in how to move data effectively and efficiently between an RDBMS and the Hadoop ecosystem. It’s assumed that you already have some experience with Hadoop, as well as some familiarity with HBase and Hive.

Table of Contents

Chapter 1: Instant Apache Sqoop
Working with the import process (Intermediate)
Incremental import (Simple)
Populating the HBase table (Simple)
Importing data into HBase (Intermediate)
Populating the Hive table (Simple)
Importing data into Hive (Simple)
The exporting process (Intermediate)
Exporting data from Hive (Simple)
Using Sqoop connectors (Advanced)

What You Will Learn

  • Understand the Sqoop import arguments and work through examples of moving data from an RDBMS to Hadoop
  • Get to know the Sqoop incremental import feature
  • Understand the HBase table structure and basic HBase commands, and learn how to move data from an RDBMS to HBase
  • Learn about the Hive table structure and basic Hive commands, and work through examples of moving data from an RDBMS to Hive
  • Explore the Sqoop export arguments and learn how to move processed data from Hadoop back to an RDBMS
  • Learn how to move data from Hive to an RDBMS
  • Discover third-party Sqoop connectors
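
As a taste of what the book covers, here is a minimal sketch of an incremental Sqoop import. The database host, credentials, table, and column names are placeholders, and since running Sqoop requires a live Hadoop cluster and a reachable RDBMS, the command is only assembled and printed here rather than executed:

```shell
# Hypothetical connection details -- replace with your own database and cluster.
IMPORT_CMD=(sqoop import
  --connect jdbc:mysql://dbhost/sales       # JDBC URL of the source RDBMS
  --username sqoop_user
  --password-file /user/sqoop/pwd           # safer than --password on the command line
  --table orders                            # source table to import
  --target-dir /data/orders                 # HDFS directory for the imported files
  --incremental append                      # only fetch rows added since the last run
  --check-column order_id                   # column Sqoop inspects for new rows
  --last-value 1000)                        # highest order_id seen in the previous import

# Print the assembled command instead of running it.
echo "${IMPORT_CMD[@]}"
```

On a real cluster you would run the command directly; Sqoop records the new `--last-value` in its output so the next incremental run can pick up where this one left off.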

In Detail

In today’s world, data volumes are growing rapidly, and people want to perform analytics by combining data from different sources (RDBMS, text files, and so on). Using Hadoop for analytics requires you to load data from an RDBMS into Hadoop, perform the analysis there, and then load the processed data back into the RDBMS to generate business reports.

Instant Apache Sqoop is a practical, hands-on guide that provides you with a number of clear, step-by-step exercises that will help you to take advantage of the real power of Apache Sqoop and give you a good grounding in the knowledge required to transfer data between RDBMS and the Hadoop ecosystem.

Instant Apache Sqoop looks at the import/export process required in data transfer and discusses examples of each process. It will also give you an overview of HBase and Hive table structures and how you can populate HBase and Hive tables. The book will finish by taking you through a number of third-party Sqoop connectors.
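
To illustrate the HBase side of this, here is a minimal sketch of importing an RDBMS table directly into an HBase table; the database, table, column family, and row-key column are placeholder names, and the command is only assembled and printed since it needs a running cluster:

```shell
# Hypothetical names -- adjust to your own database and HBase setup.
HBASE_CMD=(sqoop import
  --connect jdbc:mysql://dbhost/sales
  --username sqoop_user
  --password-file /user/sqoop/pwd
  --table customers                # source RDBMS table
  --hbase-table customers          # target HBase table (add --hbase-create-table to create it)
  --column-family info             # all imported columns land in this column family
  --hbase-row-key customer_id)     # source column used as the HBase row key

# Print the assembled command instead of running it.
echo "${HBASE_CMD[@]}"
```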

You will also learn about the various import and export arguments and how to use them to move data between an RDBMS and the Hadoop ecosystem, and you will see how the import and export processes are architected. If you want to move data between an RDBMS and the Hadoop ecosystem, then this is the book for you.
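
The export direction looks much the same. Here is a minimal sketch of pushing processed data from HDFS back into an RDBMS table; the target table, HDFS directory, and delimiter are placeholder assumptions, and the command is only assembled and printed:

```shell
# Hypothetical names -- the target RDBMS table must already exist.
EXPORT_CMD=(sqoop export
  --connect jdbc:mysql://dbhost/reports
  --username sqoop_user
  --password-file /user/sqoop/pwd
  --table order_summary              # target RDBMS table
  --export-dir /data/order_summary   # HDFS directory holding the processed data
  --input-fields-terminated-by ',')  # field delimiter used in the HDFS files

# Print the assembled command instead of running it.
echo "${EXPORT_CMD[@]}"
```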

You will learn everything that you need to know to transfer data between RDBMS and the Hadoop ecosystem as well as how you can add new connectors into Sqoop.

