Data Lake for Enterprises
by Vivek Mishra, Tomcy John, Pankaj Misra
May 2017
Packt Publishing
596 pages
15h 2m
English
Beginner to intermediate
Content preview from Data Lake for Enterprises

Customer data import into Hive using Sqoop

The following configuration may need to be added to ${HADOOP_HOME}/etc/hadoop/core-site.xml so that Hue can impersonate the user creating the Parquet file:

<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>

Once the preceding configuration is added, we will need to restart the DFS service with the following command:

${HADOOP_HOME}/sbin/stop-dfs.sh && ${HADOOP_HOME}/sbin/start-dfs.sh
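
Optionally, before kicking off the import, a quick health check of HDFS can confirm that the NameNode and DataNodes came back up after the restart (one possible check, using the standard dfsadmin report):

${HADOOP_HOME}/bin/hdfs dfsadmin -report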

Import the customer records from the database into Hadoop RAW storage using a Sqoop job that writes the data in Parquet format, with the following command:

${SQOOP_HOME}/bin/sqoop import --connect jdbc:postgresql://<DB_SERVER_ADDRESS>/sourcedb?schema=public ...
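
The command above is truncated in this preview. As a rough sketch only, with the table name, credentials, and target directory as placeholder assumptions rather than the book's actual values, a complete Parquet import might look like this:

# Sketch only: table name, credentials, and target directory are placeholder assumptions
${SQOOP_HOME}/bin/sqoop import \
  --connect "jdbc:postgresql://<DB_SERVER_ADDRESS>/sourcedb?schema=public" \
  --username <DB_USERNAME> \
  --password <DB_PASSWORD> \
  --table customer \
  --as-parquetfile \
  --target-dir /data/raw/customer \
  -m 1

The --as-parquetfile switch is what makes Sqoop write the output in Parquet format; the remaining options follow standard Sqoop import usage.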


Publisher Resources

ISBN: 9781787281349