Time for action – using a type mapping
Let's use a type mapping to improve our data import.
- Delete any existing output directory:
$ hadoop fs -rmr employees
- Execute Sqoop with an explicit type mapping:
$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest \
    --username hadoopuser -P --table employees \
    --hive-import --hive-table employees \
    --map-column-hive start_date=timestamp
You will receive the following response:
12/05/23 14:53:38 INFO hive.HiveImport: Hive import complete.
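If more than one column needs an explicit type, `--map-column-hive` accepts a comma-separated list of `column=type` pairs. A minimal sketch of that usage, assuming the same table as above; the additional `salary=decimal` override is only an illustration, not part of this walkthrough:

```shell
# Override several Hive column types in one import; each entry is a
# column=type pair, comma-separated (salary=decimal is an assumed example)
$ sqoop import --connect jdbc:mysql://10.0.0.100/hadooptest \
    --username hadoopuser -P --table employees \
    --hive-import --hive-table employees \
    --map-column-hive start_date=timestamp,salary=decimal
```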
- Examine the created table definition:
$ hive -e "describe employees"
You will receive the following response:
OK
first_name	string
dept	string
salary	int
start_date	timestamp
Time taken: 2.547 seconds
- Examine the imported data:
$ hive -e "select * from employees"
...
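Because `start_date` is now a true TIMESTAMP rather than a string, Hive's built-in date functions can operate on it directly. A brief sketch, assuming the `employees` table imported above sits in the default database:

```shell
# Apply Hive date functions to the timestamp column created by the
# type mapping (requires the employees Hive table from the import above)
$ hive -e "select first_name, year(start_date), month(start_date) from employees"
```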