Applying user-defined functions in SparkR

In this recipe, we'll see how to apply user-defined functions such as dapply, gapply, and spark.lapply over a Spark DataFrame.

Getting ready

To step through this recipe, you will need a running Spark cluster, either in pseudo-distributed mode or in one of the distributed modes, that is, standalone, YARN, or Mesos. Also, install RStudio. Please refer to the Installing R recipe for details on the installation of R, and to the Creating SparkR DataFrames recipe to get acquainted with the creation of DataFrames from a variety of data sources.
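If a SparkR session is not already running in RStudio, it can be started along the following lines. This is a minimal sketch for a local setup; the SPARK_HOME path, master URL, and application name are assumptions and should be adjusted to your environment:

```r
# Point R at the Spark installation (path is an assumption for this sketch)
Sys.setenv(SPARK_HOME = "/usr/local/spark")

# Load the SparkR package bundled with the Spark distribution
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

# Start a session; replace master with your cluster URL (standalone, YARN, or Mesos)
sparkR.session(master = "local[*]", appName = "UDFRecipe")
```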

How to do it…

In this recipe, we'll see how to apply the user-defined functions available as of Spark 2.0.2.

  1. Here is the code that applies dapply on a Spark DataFrame:
     schema <- structType(structField("eruptions", ...
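The snippet above is truncated; a complete version along the same lines, using R's built-in faithful dataset (which has an eruptions column matching the structField shown), might look as follows. The derived waiting_secs column, the grouping key in the gapply call, and the spark.lapply inputs are illustrative assumptions, not taken from the original text:

```r
# Create a SparkR DataFrame from R's built-in faithful dataset
df <- createDataFrame(faithful)

# Output schema for dapply: the input columns plus one derived column
schema <- structType(structField("eruptions", "double"),
                     structField("waiting", "double"),
                     structField("waiting_secs", "double"))

# dapply: run an R function on each partition of the DataFrame;
# here we append the waiting time converted to seconds
df1 <- dapply(df, function(x) { cbind(x, x$waiting * 60) }, schema)
head(collect(df1))

# gapply: run an R function on each group of rows; here we group by
# waiting and compute the maximum eruption time per group
result <- gapply(df, "waiting",
                 function(key, x) { data.frame(key, max(x$eruptions)) },
                 structType(structField("waiting", "double"),
                            structField("max_eruption", "double")))
head(collect(result))

# spark.lapply: distribute a function over the elements of a local
# list, similar to base R's lapply but run on the cluster
squares <- spark.lapply(1:3, function(x) { x * x })
```

Note that dapply and gapply require an explicit output schema because Spark cannot infer the types returned by an arbitrary R function, whereas spark.lapply returns its results as an ordinary local R list.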
