Applying user-defined functions in SparkR

In this recipe, we'll see how to apply user-defined functions to a SparkR DataFrame using dapply and gapply, and to a local R list using spark.lapply.

Getting ready

To step through this recipe, you will need a running Spark cluster, either in pseudo-distributed mode or in one of the distributed modes (standalone, YARN, or Mesos). Also, install RStudio. Please refer to the Installing R recipe for details on the installation of R, and to the Creating SparkR DataFrames recipe to get acquainted with creating DataFrames from a variety of data sources.

How to do it…

In this recipe, we'll see how to apply the user-defined function APIs available as of Spark 2.0.2.

  1. Here is the code that applies dapply to a Spark DataFrame:
     schema <- structType(structField("eruptions", ...
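The excerpt is cut off above. As a sketch of how the complete example might look, the following uses R's built-in faithful dataset, following the standard examples in the SparkR documentation; the column names, schemas, and derived values here are illustrative assumptions, not the book's original code:

```r
library(SparkR)
sparkR.session(appName = "UDFRecipe")

# Create a SparkR DataFrame from R's built-in faithful dataset
df <- createDataFrame(faithful)

# dapply: apply a function to each partition of the DataFrame.
# The schema of the output must be declared up front.
schema <- structType(structField("eruptions", "double"),
                     structField("waiting", "double"),
                     structField("waiting_secs", "double"))
df1 <- dapply(df, function(x) {
  # x is a local R data.frame holding one partition;
  # add a column converting waiting time from minutes to seconds
  cbind(x, x$waiting * 60)
}, schema)
head(collect(df1))

# gapply: apply a function to each group of rows sharing a key
maxSchema <- structType(structField("waiting", "double"),
                        structField("max_eruption", "double"))
result <- gapply(df, "waiting", function(key, x) {
  # key is the grouping value; x is the group's rows as a data.frame
  data.frame(key, max(x$eruptions))
}, maxSchema)
head(collect(result))

# spark.lapply: run a function over the elements of a local list
# in parallel, like a distributed version of lapply()
squares <- spark.lapply(1:3, function(n) n * n)
print(squares)

sparkR.session.stop()
```

Note that dapply and gapply require an explicit output schema because Spark cannot infer the shape of an arbitrary R function's result, whereas spark.lapply simply collects the results back as a local list.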
