Now, let's load data from our database, one table at a time, storing each table in its own RAW area:
- Loading customer data:
- Run a Sqoop job to import the customer profiles from the database with the following command (note that the PostgreSQL JDBC driver selects the schema via the `currentSchema` URL parameter, and the URL is quoted so the shell does not interpret the `?`):

```shell
${SQOOP_HOME}/bin/sqoop import \
  --connect "jdbc:postgresql://<DB-SERVER-ADDRESS>/sourcedb?currentSchema=public" \
  --table customer \
  -m 10 \
  --username postgres \
  --password <DB-PASSWORD> \
  --as-avrodatafile \
  --append \
  --target-dir /datalake/raw/customer
```
- Once the Sqoop MapReduce job completes, the customer directory is populated with a number of Avro data files, as shown in the following figure:
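Before moving on, it is worth verifying what the import actually wrote to the RAW area. The commands below are a sketch, assuming a configured Hadoop client on the edge node and an `avro-tools` jar available locally; the jar version and the `part-m-00000.avro` file name are illustrative examples, not fixed names:

```shell
# List the Avro part files Sqoop wrote to the RAW area
hdfs dfs -ls /datalake/raw/customer

# Copy one part file locally and dump its records as JSON
# (jar path and part-file name below are assumptions for illustration)
hdfs dfs -get /datalake/raw/customer/part-m-00000.avro .
java -jar avro-tools-1.8.2.jar tojson part-m-00000.avro | head

# Print the Avro schema that Sqoop generated from the table definition
java -jar avro-tools-1.8.2.jar getschema part-m-00000.avro
```

Checking the embedded schema is a quick way to confirm that column names and types were mapped as expected before downstream jobs start reading from the RAW area.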