
Extending OpenStack by Omar Khedher


Executing jobs

Sahara facilitates the execution of jobs and bursty workloads in big data clusters running any supported EDP (Elastic Data Processing) platform in OpenStack. Having rapidly deployed a Spark cluster in the previous section, we can now manage its associated jobs through Sahara with little effort.
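Under the hood, launching a job is a single EDP API call that ties a registered job to a cluster and to its input and output data sources. The sketch below assembles such a request by hand; it is a minimal illustration assuming the shape of the EDP v1.1 `job executions` call, and every ID in it is a placeholder rather than a value from this book:

```python
# Sketch of the request Sahara's EDP API expects when launching a job
# on an existing cluster (assumed v1.1 body shape; IDs are placeholders).

def build_job_execution(job_id, cluster_id, input_id, output_id, args=None):
    """Assemble the URL path and JSON body for POST /jobs/<job_id>/execute."""
    url = "/jobs/{}/execute".format(job_id)
    body = {
        "cluster_id": cluster_id,      # cluster the job runs on
        "input_id": input_id,          # registered input data source
        "output_id": output_id,        # registered output data source
        "job_configs": {
            "configs": {},             # engine-specific settings (e.g. Spark conf)
            "args": args or [],        # positional arguments passed to the job
        },
    }
    return url, body

url, body = build_job_execution(
    job_id="spark-wordcount",
    cluster_id="c1",
    input_id="ds-in",
    output_id="ds-out",
    args=["swift://demo.sahara/input.txt"],
)
print(url)
print(body["job_configs"]["args"])
```

In practice the same call is issued for you by the dashboard or by `python-saharaclient`; the point here is only that a job execution is the triple of cluster, input, and output references plus optional job arguments.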

Running jobs in Sahara requires specifying the locations of the input data source and the output destination: the Sahara engine fetches and analyzes data from the former and stores the results in the latter. Sahara mainly supports three types of input/output data storage:

  • Swift: This designates the OpenStack Object Storage service as the location of the input data and the destination of the output results
  • HDFS: This uses an HDFS filesystem, typically the one running on the cluster's own instances
  • Manila: This uses shares provided by Manila, the OpenStack Shared File Systems service
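For the Swift case, data sources are referenced with `swift://` URLs in which the container name carries a `.sahara` suffix, so the Hadoop Swift driver on the cluster can resolve which provider to use. The helper below is purely illustrative (the function name is ours, not Sahara's) and only encodes that URL convention:

```python
# Illustrative helper for Sahara-style Swift data source URLs.
# Convention: swift://<container>.sahara/<object path>

def swift_data_source_url(container, path):
    """Build a Swift URL such as swift://logs.sahara/in/data.csv."""
    if not container.endswith(".sahara"):
        container += ".sahara"          # append the provider suffix once
    return "swift://{}/{}".format(container, path.lstrip("/"))

print(swift_data_source_url("logs", "/in/data.csv"))
# swift://logs.sahara/in/data.csv
```

When registering a Swift data source, Sahara also needs Swift credentials alongside the URL, since the cluster instances authenticate to Object Storage on your behalf.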
