SBT is the build tool that compiles your Scala code into Java bytecode and packages it as a JAR file that can be distributed and executed efficiently across your Spark cluster. Amazon Web Services pro Frank Kane shows you how to write SBT build files in the correct format so you can deftly compile and package your script, then upload it to Amazon S3 in preparation for running it on an Amazon Elastic MapReduce (EMR) cluster.
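As a rough illustration of the kind of build file involved, here is a minimal build.sbt sketch; the project name, version numbers, and the "provided" scope for Spark are assumptions for illustration, not taken from the course:

```scala
// build.sbt — a minimal sketch for packaging a Spark job with SBT.
// Names and versions below are illustrative assumptions.
name := "MySparkJob"
version := "1.0"
scalaVersion := "2.12.18"

// Mark Spark as "provided" so it is not bundled into your JAR;
// the EMR cluster supplies Spark at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0" % "provided"
```

With a file like this in place, running `sbt package` produces a JAR under `target/`, which you can then copy to an S3 bucket (for example with the AWS CLI's `aws s3 cp` command) so EMR can pick it up.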

Learn more about running Spark for big data analysis on the Amazon Elastic MapReduce service (EMR) with video training from Frank Kane.

Article image: screenshot from "How do I package a Spark Scala script with SBT for use on an Amazon Elastic MapReduce (EMR) cluster?" (source: O'Reilly).