Import into HDFS

The following is a sample command to import data into HDFS:

$ sqoop import --connect jdbc:mysql://localhost/dbname --table <table_name> --username <username> --password <password> -m 4

The import is done in two steps:

  1. Sqoop scans the database and collects the metadata of the table to be imported.
  2. Sqoop submits a map-only job that transfers the actual data, using the collected metadata.
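Once the map-only job finishes, each mapper writes its own output file to HDFS. A quick way to inspect the result is shown below; the path assumes the default target directory for the table and the four mappers from -m 4 above:

```shell
# List the import output; Sqoop writes one output file per mapper.
hdfs dfs -ls /user/$(whoami)/<table_name>
# With -m 4, expect files part-m-00000 through part-m-00003
# plus the _SUCCESS marker written by the MapReduce job.
```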

The imported data is saved in a directory on HDFS named after the table being imported. As is the case with most aspects of Sqoop operation, the user can specify an alternative directory where the files should be written.
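Sqoop exposes this override through the --target-dir option (an explicit directory for a single table) and --warehouse-dir (a parent directory under which a per-table subdirectory is created). A sketch, reusing the connection details from the command above:

```shell
# Import into an explicit HDFS directory instead of the default
# /user/<username>/<table_name> location.
sqoop import \
  --connect jdbc:mysql://localhost/dbname \
  --table <table_name> \
  --username <username> --password <password> \
  --target-dir /data/imports/<table_name> \
  -m 4
```

Note that --target-dir must not already exist on HDFS, or the job will fail; remove the directory first when re-running an import.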
