Adding commands that talk to HDFS for deployment in Karaf

Since HDFS is, at its core, a filesystem, let's see how we can access it with the standard tools and the bundle we've been building up so far.

We'll write a command that stores one level of configuration files from our running Karaf container into HDFS, and then a second command that reads those files back.
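The heart of the store command is ordinary Hadoop `FileSystem` API usage. The sketch below is a minimal, hypothetical illustration of that idea, not the book's actual command class: it copies the regular files from one local directory into a directory on HDFS. The NameNode address `hdfs://localhost:8020` and the target path `/karaf-config` are assumptions you would replace with your own values (or wire up as Karaf command options).

```java
import java.io.File;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConfigStore {

    // Copy every regular file in localDir into dfsDir on the given filesystem,
    // returning how many files were stored.
    public static int storeConfigs(FileSystem fs, File localDir, Path dfsDir)
            throws Exception {
        fs.mkdirs(dfsDir);
        int count = 0;
        File[] entries = localDir.listFiles();
        if (entries != null) {
            for (File f : entries) {
                if (f.isFile()) {
                    fs.copyFromLocalFile(new Path(f.getAbsolutePath()),
                                         new Path(dfsDir, f.getName()));
                    count++;
                }
            }
        }
        return count;
    }

    public static void main(String[] args) throws Exception {
        // "hdfs://localhost:8020" is an assumed NameNode address; in a real
        // Karaf command this would come from the command's options or config.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS",
                 args.length > 0 ? args[0] : "hdfs://localhost:8020");
        try (FileSystem fs = FileSystem.get(conf)) {
            // karaf.etc points at the container's etc/ configuration directory.
            File etc = new File(System.getProperty("karaf.etc", "etc"));
            int n = storeConfigs(fs, etc, new Path("/karaf-config"));
            System.out.println("Stored " + n + " configuration files");
        }
    }
}
```

Reading the files back is symmetrical: the second command would list `/karaf-config` with `fs.listStatus(...)` and use `fs.copyToLocalFile(...)` for each entry.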

We've learned how to build a Hadoop feature that takes care of the various dependencies needed to talk to HDFS, and we've also jumped ahead a little to discuss classloading, along with a few tricks to get the deployed Hadoop libraries to cooperate. We're now at a point where we can start writing code against Hadoop using those libraries.

Getting ready

The ingredients of this ...
