Adding commands that talk to HDFS for deployment in Karaf
Since HDFS is, at its core, a filesystem, let's see how we can access it with the standard tools and the bundle we've been building up so far.
What we'll do is store one level of configuration files from our running Karaf container into HDFS, and then provide a second command to read the files back.
We've learned how to build a feature for Hadoop that takes care of all the dependencies needed to talk to HDFS. We have also jumped ahead a little and discussed classloading, along with a few tricks to get the deployed Hadoop libraries to cooperate. We are now at a point where we can start writing code against Hadoop using the libraries provided.
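As a rough sketch of the first of these commands, a Karaf shell command can use Hadoop's `FileSystem` API to copy local configuration files into HDFS. The command name `hdfs:put-config`, the class name, the NameNode URI, and the target path below are illustrative assumptions, not the book's actual code; the annotations shown are the Karaf 4 style (`org.apache.karaf.shell.api.action`), while Karaf 3 commands would instead extend `OsgiCommandSupport`:

```java
import java.io.File;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.karaf.shell.api.action.Action;
import org.apache.karaf.shell.api.action.Command;
import org.apache.karaf.shell.api.action.lifecycle.Service;

// Illustrative sketch: copies the files directly under Karaf's etc/
// directory (one level, no recursion) into a directory in HDFS.
@Command(scope = "hdfs", name = "put-config",
         description = "Copy Karaf configuration files into HDFS")
@Service
public class PutConfigCommand implements Action {

    @Override
    public Object execute() throws Exception {
        // karaf.etc points at the running container's configuration directory
        File etcDir = new File(System.getProperty("karaf.etc"));

        Configuration conf = new Configuration();
        // Assumed NameNode address; adjust to your cluster
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), conf)) {
            Path target = new Path("/karaf/etc");   // assumed HDFS target directory
            fs.mkdirs(target);
            for (File f : etcDir.listFiles(File::isFile)) {
                fs.copyFromLocalFile(new Path(f.getAbsolutePath()),
                                     new Path(target, f.getName()));
                System.out.println("Stored " + f.getName());
            }
        }
        return null;
    }
}
```

The companion read-back command would reverse the copy with `fs.copyToLocalFile(...)`, or stream a file's contents to the console via `fs.open(path)`.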
Getting ready
The ingredients of this ...