Video description
Effectively store, manage, and analyze large datasets with HDFS, Sqoop, YARN, and MapReduce
About This Video
- Handle big data with ease using Hadoop and its ecosystem
- Learn to store data with HDFS, transfer bulk data with Sqoop, and manage data efficiently with YARN
- Build a strong foundation in the basic concepts of Hadoop and big data analytics
In Detail
Do you struggle to store and handle big data sets? This course will teach you how to handle big data sets smoothly using Hadoop 3.
The course starts by covering the basic commands big data developers use on a daily basis. Then you'll focus on HDFS architecture and the command-line operations a developer uses most frequently. Next, you'll use Flume to ingest data from other systems into the Hadoop ecosystem, making it available for storage and analysis with MapReduce. You'll also learn to import and export data between an RDBMS and HDFS using Sqoop. Then you'll move on to Apache Pig, which works with the data brought in through Flume and Sqoop; here you'll learn to load, transform, and store data in Pig relations. Finally, you'll dive into Hive functionality and learn to load, update, and delete content in Hive tables.
By the end of the course, you'll have gained enough knowledge to work with big data using Hadoop. So, grab the course and handle big data sets with ease.
The code bundle for this course is available at https://github.com/PacktPublishing/Hands-On-Beginner-s-Guide-on-Big-Data-and-Hadoop-3-.
Table of contents
Chapter 1: Unix Operating System
- The Course Overview 00:02:30
- Introduction to Unix OS 00:04:39
- Unix Commands 00:07:18
- Unix Commands (Continued) 00:07:30
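For orientation, a minimal sketch of the kind of everyday Unix commands this chapter covers (the file and directory names are illustrative):

```sh
# List a directory in long format and print a small text file
ls -l /tmp
cat /etc/hostname

# Search a log file for a pattern and count the matching lines
grep -c "ERROR" application.log

# Create a working directory, copy the log into it, and set permissions
mkdir -p ~/data
cp application.log ~/data/
chmod 644 ~/data/application.log
```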
Chapter 2: Hadoop Distributed File System – HDFS
- HDFS Overview 00:03:30
- HDFS Architecture 00:06:32
- HDFS Commands 00:14:20
- HDFS Commands (Continued) 00:10:10
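A few representative HDFS shell commands of the kind this chapter walks through, assuming a running HDFS cluster (paths and file names are placeholders):

```sh
# Create a directory in HDFS and copy a local file into it
hdfs dfs -mkdir -p /user/hadoop/input
hdfs dfs -put localfile.txt /user/hadoop/input/

# List the directory and print the file's contents
hdfs dfs -ls /user/hadoop/input
hdfs dfs -cat /user/hadoop/input/localfile.txt

# Check space usage, then remove the file when finished
hdfs dfs -du -h /user/hadoop/input
hdfs dfs -rm /user/hadoop/input/localfile.txt
```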
Chapter 3: Apache Flume
- Introduction to Flume 00:06:31
- How to Start a Flume Agent 00:07:56
- How to Configure a Flume Memory Channel 00:05:13
- How to a Flume Agent 00:05:02
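A hedged sketch of starting a single-node Flume agent with a memory channel, in the spirit of this chapter; the agent name a1, the netcat source, and the file paths are illustrative choices, not taken from the course:

```sh
# Write a minimal agent definition: netcat source -> memory channel -> logger sink
cat > /tmp/netcat-agent.conf <<'EOF'
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

a1.sinks.k1.type = logger

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF

# Start the agent against that configuration
flume-ng agent --name a1 --conf "$FLUME_HOME/conf" \
  --conf-file /tmp/netcat-agent.conf -Dflume.root.logger=INFO,console
```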
Chapter 4: Apache Sqoop
- Introduction to Apache Sqoop 00:06:52
- Sqoop Import: RDBMS to HDFS 00:11:56
- Sqoop Import: Using SQL Query 00:07:32
- Sqoop Import: RDBMS to Hive 00:07:51
- Sqoop Export: HDFS to RDBMS 00:08:16
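Representative Sqoop commands along the lines of what this chapter demonstrates; the MySQL connection string, table, and directory names are illustrative, and adding --hive-import to an import command loads the data into a Hive table instead of plain HDFS files:

```sh
# Import a whole table from MySQL into HDFS
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail \
  --username dbuser -P \
  --table customers \
  --target-dir /user/hadoop/customers \
  --num-mappers 1

# Import the result of a free-form SQL query ($CONDITIONS is required by Sqoop)
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail \
  --username dbuser -P \
  --query 'SELECT id, name FROM customers WHERE id < 100 AND $CONDITIONS' \
  --split-by id \
  --target-dir /user/hadoop/customer_subset

# Export HDFS data back into an existing RDBMS table
sqoop export \
  --connect jdbc:mysql://dbhost:3306/retail \
  --username dbuser -P \
  --table customers_export \
  --export-dir /user/hadoop/customers
```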
Chapter 5: Apache Pig
- Introduction to Pig 00:05:14
- Load Data in Pig Relation 00:04:21
- Data Transformation Using Pig 00:06:38
- Export Data from Pig Relation 00:04:09
- JOIN Operation Using Pig 00:08:57
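A small Pig Latin sketch of loading, transforming, joining, and storing relations, run in local mode; the file names and schemas are made up for illustration:

```sh
# Write a short Pig Latin script, then run it in local mode
cat > join_orders.pig <<'EOF'
-- Load two comma-separated files into relations with declared schemas
users  = LOAD 'users.csv'  USING PigStorage(',') AS (id:int, name:chararray, city:chararray);
orders = LOAD 'orders.csv' USING PigStorage(',') AS (order_id:int, user_id:int, amount:double);

-- Transform: keep only large orders and project the fields we need
big_orders = FILTER orders BY amount > 100.0;
slim       = FOREACH big_orders GENERATE user_id, amount;

-- Join with the users relation and store the result back to the file system
joined = JOIN slim BY user_id, users BY id;
STORE joined INTO 'big_orders_by_user' USING PigStorage(',');
EOF

pig -x local join_orders.pig
```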
Chapter 6: Apache Hive
- Introduction to Hive 00:05:09
- Load Data into Hive Table 00:05:50
- Load Data into Hive Table (Continued) 00:06:34
- Update/Delete Contents of Hive Table 00:07:49
- Optimization in Hive 00:04:07
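A hedged HiveQL sketch of the load/update/delete flow this chapter covers; UPDATE and DELETE assume Hive is configured for ACID transactions and only work on transactional ORC tables, and the table and file names are illustrative:

```sh
hive -e "
CREATE TABLE IF NOT EXISTS employees (id INT, name STRING, salary DOUBLE)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  STORED AS TEXTFILE;

-- Load a local CSV file into the plain-text table
LOAD DATA LOCAL INPATH '/tmp/employees.csv' INTO TABLE employees;

-- UPDATE/DELETE require a transactional (ACID) ORC table
CREATE TABLE IF NOT EXISTS employees_txn (id INT, name STRING, salary DOUBLE)
  STORED AS ORC TBLPROPERTIES ('transactional'='true');

INSERT INTO employees_txn SELECT * FROM employees;
UPDATE employees_txn SET salary = salary * 1.1 WHERE id = 1;
DELETE FROM employees_txn WHERE id = 2;
"
```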
Product information
- Title: Hands-On Beginner’s Guide on Big Data and Hadoop 3
- Author(s):
- Release date: July 2018
- Publisher(s): Packt Publishing
- ISBN: 9781788996099