Chapter 5. Export

The previous three chapters had one thing in common: they described various use cases of transferring data from a database server to the Hadoop ecosystem. What if you have the opposite scenario and need to transfer generated, processed, or backed-up data from Hadoop to your database? Sqoop also provides facilities for this use case, and the following recipes in this chapter will help you understand how to take advantage of this feature.

Transferring Data from Hadoop

Problem

You have a workflow of various Hive and MapReduce jobs that are generating data on a Hadoop cluster. You need to transfer this data to your relational database for easy querying.

Solution

You can use Sqoop’s export feature, which allows you to transfer data from the Hadoop ecosystem to relational databases. For example, to export data from the HDFS directory cities (the directory containing the source data, specified with --export-dir) into the database table cities (the table to populate, specified with --table), you would use the following Sqoop command:

sqoop export \
  --connect jdbc:mysql://mysql.example.com/sqoop \
  --username sqoop \
  --password sqoop \
  --table cities \
  --export-dir cities
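
Note that Sqoop export does not create the target table for you; it must already exist in the database before you run the command. Below is a minimal sketch of preparing that table with the mysql command-line client, assuming the same connection details as above and an illustrative three-column layout (id, country, city) -- adjust the schema to match the fields in your exported files.

# Sqoop export expects the target table to exist beforehand.
# The column layout (id, country, city) is only an assumed example;
# change it to match the records stored in the export directory.
mysql --host=mysql.example.com --user=sqoop --password=sqoop sqoop \
  -e "CREATE TABLE cities (id INT, country VARCHAR(50), city VARCHAR(50))"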

Discussion

Export works similarly to import, except that it transfers data in the opposite direction. Instead of pulling data out of the relational database with SELECT queries, Sqoop pushes data into the relational database with INSERT statements. Sqoop’s export workflow matches the import workflow, with slight differences. After you execute ...
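
Conceptually, each record read from the export directory is turned into an INSERT of the form INSERT INTO cities VALUES (...). A quick way to confirm that the export landed as expected is to query the target table directly; a minimal check, assuming the same MySQL connection details as in the example above:

# Verify the export by counting rows in the populated table.
mysql --host=mysql.example.com --user=sqoop --password=sqoop sqoop \
  -e "SELECT COUNT(*) FROM cities"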
