Appendix D HPC and Hadoop
Bridging HPC to Hadoop
Overview
The goal of this document is to demonstrate how legacy HPC tools and methodology can effectively lessen the learning curve for users who are familiar with HPC but new to Hadoop.
Environment Modules
In the HPC world, the use of environment modules is standard. An environment module is a TCL script that, when loaded, prepends specific software and library locations to the user’s path and other environment variables. For instance, if I logged into an HPC cluster where the default Python version was stock 2.6 but I wanted to use 2.7, I would be able to load the 2.7 module if it was available. After loading the module I would be able to simply type python --version and expect to see Python 2.7.x returned. For more information on environment modules, see: http://www.admin-magazine.com/HPC/Articles/Environment-Modules/.
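To make that workflow concrete, here is a rough sketch of such a session (the module name python/2.7 is an assumption; actual names depend on how a given site lays out its module tree):

    $ python --version           # stock interpreter on the default path
    Python 2.6.x
    $ module avail python        # list the Python modules installed on this cluster
    $ module load python/2.7     # prepend the 2.7 locations to PATH and friends
    $ python --version
    Python 2.7.x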
Apache Spark
As both an HPC admin and a Hadoop admin, I had the unique opportunity to see how multiple versions of a software package could coexist on a cluster of servers. In Hadoop land, the Apache Spark project was under heavy development, with new builds and features released on an aggressive schedule. Like most environments, we had production use cases for the older “stable” releases. (The term “stable” is used very loosely here.) As resources were limited, we didn’t have a spare 40-node Hadoop cluster for development, so a solution had to be found for multiple Spark versions to coexist along with the default configuration and specific build/integration ...
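One way to let such builds coexist, in keeping with the environment-module approach described earlier, is to install each Spark build under its own prefix and expose it through its own modulefile. As a rough sketch only (the module names spark/1.6 and spark/2.0 below are illustrative assumptions, not the actual builds on the cluster described here), switching versions might then look like this:

    $ module load spark/1.6      # older "stable" build used by production jobs (name is illustrative)
    $ spark-submit --version     # prints the Spark version banner
    $ module unload spark/1.6
    $ module load spark/2.0      # newer build for development and testing (name is illustrative)
    $ spark-submit --version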