Chapter 11. Data Extraction and Client Access Security

One of the core philosophies of Hadoop is to bring processing to the data rather than the other way around, so our focus so far has been on how security works inside the cluster. In this chapter, we cover the last mile of Hadoop security: client access and data extraction. While most processing of Hadoop data is done on the cluster itself, users access that data via external tools, and some use cases, such as loading an enterprise data warehouse, require extracting potentially large volumes of data for use in specialized tools.

The most basic form of client access is through command-line tools. As we described in “Edge Nodes”, it’s common for clusters to limit external access to a small set of edge nodes. Users log in to an edge node over ssh and then use various command-line tools to interact with the cluster. A brief description of the most common commands is shown in Table 11-1, followed by an example session.

Table 11-1. Common command-line tools for client access
Command                        Description
hdfs dfs -put <src> <dst>      Copy a local file into HDFS
hdfs dfs -get <src> <dst>      Download a file from HDFS to the local filesystem
hdfs dfs -cat <path>           Print the contents of a file to standard output
hdfs dfs -ls <path>            List the files and directories in a path
hdfs dfs -mkdir <path>         Make a directory in HDFS
hdfs dfs -cp <src> <dst>       Copy an HDFS file to a new location
hdfs dfs -mv <src> <dst>       Move an HDFS file to a new location
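
For example, a typical session on a secured cluster might look like the following sketch. The edge node hostname, username, realm, and file paths are hypothetical, and the kinit step applies only when the cluster is Kerberized:

    # Log in to an edge node and obtain a Kerberos ticket before
    # issuing HDFS commands (hostname and principal are examples)
    $ ssh alice@edge01.example.com
    $ kinit alice@EXAMPLE.COM
    Password for alice@EXAMPLE.COM:

    # Copy a local file into HDFS and confirm it arrived
    $ hdfs dfs -put sales.csv /user/alice/sales.csv
    $ hdfs dfs -ls /user/alice

Without a valid ticket, each hdfs command on a Kerberized cluster fails with an authentication error, which is why obtaining credentials is the first step of any command-line session.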
