Data loads

Now let's load data from our database, one table at a time, so that each table lands in its own RAW area:

  1. Loading customer data:
    1. Run a Sqoop job to import the customer profile table from the database with the following command:
      ${SQOOP_HOME}/bin/sqoop import --connect jdbc:postgresql://<DB-SERVER-ADDRESS>/sourcedb?schema=public --table customer -m 10 --username postgres --password <DB-PASSWORD> --as-avrodatafile --append --target-dir /datalake/raw/customer
      
    2. Once the Sqoop MapReduce jobs complete, the customer directory is populated with a number of Avro data files, as shown in the following figure:
Figure 24: Avro Data ...
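A quick way to sanity-check the imported files is to verify that each one begins with the Avro object container magic bytes (`Obj` followed by `0x01`), which every `--as-avrodatafile` output file carries. A minimal sketch in Python, assuming the files have been copied from HDFS to a local directory (the function name and paths here are illustrative, not part of any Sqoop tooling):

```python
from pathlib import Path

# Every Avro object container file starts with these four bytes.
AVRO_MAGIC = b"Obj\x01"

def is_avro_container(path):
    """Return True if the file at `path` begins with the Avro container magic."""
    with open(path, "rb") as f:
        return f.read(len(AVRO_MAGIC)) == AVRO_MAGIC

def check_directory(directory):
    """Report, per file, whether it looks like a valid Avro container."""
    return {p.name: is_avro_container(p) for p in sorted(Path(directory).glob("part-*"))}
```

For example, after `hdfs dfs -get /datalake/raw/customer ./customer`, calling `check_directory("./customer")` returns a mapping of each `part-*` file to a boolean, flagging any truncated or non-Avro file before downstream jobs consume it.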
