Appendix B. Installing PySpark

This appendix covers installing standalone Spark and PySpark on your own computer, whether it runs Windows, macOS, or Linux. I also briefly cover cloud offerings, should you want to take advantage of PySpark's distributed nature without managing your own hardware.

Having a local PySpark installation means that you'll be able to experiment with the syntax using smaller data sets. You don't have to acquire multiple computers or spend money on managed PySpark in the cloud until you're ready to scale your programs. Once you're ready to work on a larger data set, you can easily transfer your program to a cloud instance of Spark for additional power.

B.1 Installing PySpark on your local machine

This section covers installing Spark and Python ...
