Chapter 11. Data Extraction and Client Access Security

One of the core philosophies of Hadoop is to bring processing to the data rather than the other way around, so our focus has been on how security works inside the cluster. In this chapter, we’ll cover the last mile of securing a Hadoop cluster, namely securing client access and data extraction. While most of the processing of Hadoop data is done on the cluster itself, users will access that data via external tools, and some use cases, such as an enterprise data warehouse, require extracting potentially large volumes of data for use in specialized tools.

The most basic form of client access is through command-line tools. As we described in “Edge Nodes”, it’s common for clusters to limit external access to a small set of edge nodes. Users log into an edge node over SSH and then run various command-line tools to interact with the cluster. A brief description of the most common commands is shown in Table 11-1, followed by a short example session.

Table 11-1. Common command-line tools for client access

    Command                        Description
    hdfs dfs -put <src> <dst>      Copy a local file into HDFS
    hdfs dfs -get <src> <dst>      Download a file from HDFS to the local filesystem
    hdfs dfs -cat <path>           Print the contents of a file to standard out
    hdfs dfs -ls <path>            List the files and directories in a path
    hdfs dfs -mkdir <path>         Make a directory in HDFS
    hdfs dfs -cp <src> <dst>       Copy an HDFS file to a new location
    hdfs dfs -mv <src> <dst>       Move an HDFS file to a new location
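
As a concrete illustration, a typical edge-node session might look like the following. This is a minimal sketch: the edge node hostname, username, Kerberos principal, and file paths are placeholders, and on a Kerberos-secured cluster the user must obtain a ticket with kinit before any of the HDFS commands will succeed.

    $ ssh alice@edge01.example.com                      # log into an edge node (hypothetical host)
    $ kinit alice@EXAMPLE.COM                           # obtain a Kerberos ticket on a secured cluster
    $ hdfs dfs -mkdir /user/alice/staging               # create a staging directory in HDFS
    $ hdfs dfs -put sales.csv /user/alice/staging/      # copy a local file into HDFS
    $ hdfs dfs -ls /user/alice/staging                  # verify the upload
    $ hdfs dfs -cat /user/alice/staging/sales.csv       # print the file to standard out
    $ hdfs dfs -get /user/alice/staging/sales.csv .     # download the file back to the local filesystem

Because these commands run on the edge node itself, the cluster can keep its NameNode and DataNode ports closed to the outside network while still giving users a controlled path to their data.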
