The goal of this chapter is to get you set up quickly with the PySpark environment. Several options are discussed, so pick whichever suits you best. Readers who already have a working environment can skip ahead to the “Basic Operations” section later in this chapter. The options covered are:
- Local installation using Anaconda
- Docker-based installation
- Databricks Community Edition
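As a rough sketch, the first two options typically boil down to commands like the following. The package and image names shown (`pyspark` on PyPI, the `jupyter/pyspark-notebook` Docker image) are the commonly used ones, but check the current documentation for up-to-date versions:

```shell
# Option 1: local install into a fresh conda environment
# (requires Anaconda or Miniconda; Spark also needs a Java runtime available)
conda create -n pyspark-env python=3.10
conda activate pyspark-env
pip install pyspark        # installs Spark itself along with the Python bindings

# Option 2: Docker-based setup using the community Jupyter image
# (assumes Docker is installed; the image bundles Spark, PySpark, and Jupyter)
docker run -p 8888:8888 jupyter/pyspark-notebook
```

The Databricks Community Edition option requires no local installation at all; you sign up on the Databricks website and work in hosted notebooks.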