HBase High Performance Cookbook by Ruchir Choudhry


Using Sqoop

Sqoop provides an excellent way to import data in parallel from existing RDBMSs into HDFS. Each table is imported with its structure preserved, and the parallel import splits the work across multiple map tasks. The resulting files are delimited text, with fields separated by ',' or '|', and so on. After the imported records are manipulated with MapReduce or Hive, the output result set can be exported back to the RDBMS. Imports can be run on demand or as a batch process (for example, scheduled with a cron job).
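The round trip described above can be sketched with Sqoop's import and export tools. The host dbhost, database salesdb, tables, directories, and user below are hypothetical placeholders, not values from this recipe:

```shell
# Import the 'orders' table into HDFS as comma-delimited text,
# using 4 parallel map tasks (-m 4); -P prompts for the password.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/salesdb \
  --username sqoop_user -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --fields-terminated-by ',' \
  -m 4

# After processing the data with MapReduce or Hive, export the
# result set from HDFS back into an RDBMS table.
sqoop export \
  --connect jdbc:mysql://dbhost:3306/salesdb \
  --username sqoop_user -P \
  --table orders_summary \
  --export-dir /user/hadoop/orders_summary \
  --fields-terminated-by ','
```

For scheduled batch imports, such a command can be placed in a script that a cron job invokes.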

Getting ready

Prerequisites:

The HBase and Hadoop clusters must be up and running.

You can download the tarball with wget from http://mirrors.gigenet.com/apache/sqoop/1.4.6/sqoop-1.4.6.tar.gz

Untar it to /u/HbaseB using tar -zxvf sqoop-1.4.6.tar.gz

It will create a /u/HbaseB/sqoop-1.4.6 folder.
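The download and extraction steps above can be combined into one small setup script; the environment-variable lines are an assumption about a typical Sqoop setup, not part of this recipe:

```shell
# Download and extract Sqoop 1.4.6 into /u/HbaseB (paths from the steps above).
cd /u/HbaseB
wget http://mirrors.gigenet.com/apache/sqoop/1.4.6/sqoop-1.4.6.tar.gz
tar -zxvf sqoop-1.4.6.tar.gz

# Point SQOOP_HOME at the extracted folder and put its bin/ on the PATH
# so the sqoop command is available from any directory.
export SQOOP_HOME=/u/HbaseB/sqoop-1.4.6
export PATH=$PATH:$SQOOP_HOME/bin

# Sanity check: prints the Sqoop version banner.
sqoop version
```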

A Sqoop user is created ...
