Adding commands that talk to HDFS for deployment in Karaf
Since HDFS is, at its core, a filesystem, let's see how we can access it with the standard tools and the bundle we've been building up so far.
What we'll do is store one level of configuration files from our running Karaf container into HDFS. Then, we'll provide a second command to read those files back.
We've learned how to build a feature for Hadoop that takes care of all the dependencies needed to talk to HDFS. We have also jumped slightly ahead to discuss classloading, along with a few tricks to get the deployed Hadoop libraries to cooperate. We are now at a point where we can start writing code against Hadoop using the libraries provided.
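As a rough sketch of the kind of command we are about to build, the following Karaf shell command copies the files in the container's `etc` directory (one level only, no subdirectories) into HDFS using the Hadoop `FileSystem` API. The command scope and name (`hdfs:put-config`), the default target path `/karaf/etc`, and the class name are illustrative assumptions, not fixed by the text; it also assumes `fs.defaultFS` is configured to point at a reachable NameNode.

```java
package com.example.hdfs.commands;

import java.io.File;

import org.apache.felix.gogo.commands.Argument;
import org.apache.felix.gogo.commands.Command;
import org.apache.karaf.shell.console.OsgiCommandSupport;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical sketch: scope, name, and target default are assumptions.
@Command(scope = "hdfs", name = "put-config",
         description = "Copies the Karaf etc configuration files into HDFS")
public class PutConfigCommand extends OsgiCommandSupport {

    @Argument(index = 0, name = "target", required = false,
              description = "Target directory in HDFS")
    String target = "/karaf/etc";

    @Override
    protected Object doExecute() throws Exception {
        // fs.defaultFS (e.g. hdfs://localhost:8020) must be set in the
        // Configuration, or loaded from core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        try {
            // karaf.base is set by the container; etc lives beneath it.
            File etc = new File(System.getProperty("karaf.base"), "etc");
            for (File file : etc.listFiles()) {
                if (file.isFile()) { // one level only: skip subdirectories
                    fs.copyFromLocalFile(new Path(file.getAbsolutePath()),
                                         new Path(target, file.getName()));
                }
            }
        } finally {
            fs.close();
        }
        return null;
    }
}
```

The companion read-back command would follow the same shape, using `FileSystem.listStatus` and `copyToLocalFile` in place of the copy loop above.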
The ingredients of this ...