Parallel Computing Using Dask

Dask is one of the simplest ways to process data in parallel. It is aimed at pandas users who struggle with large datasets: it offers scalability similar to Hadoop and Spark, with the kind of flexibility provided by workflow tools such as Airflow and Luigi. Dask can operate on pandas DataFrames and NumPy arrays that do not fit into RAM; it splits these data structures into partitions and processes them in parallel, requiring only minimal code changes. Dask runs locally and makes full use of a laptop's cores, and it can also be deployed on large distributed clusters in the same way we deploy Python applications. By processing data in parallel, Dask reduces execution time and scales the computation power of your workstation.
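
As a minimal sketch of this workflow (the file name data.csv and the column names category and value are hypothetical), the dask.dataframe API mirrors pandas and dask.array mirrors NumPy; work is split into partitions or chunks and only executed in parallel when .compute() is called:

    import dask.dataframe as dd
    import dask.array as da

    # Load a CSV lazily as a Dask DataFrame split into partitions
    # (the file name and column names here are illustrative).
    df = dd.read_csv("data.csv")

    # Build a pandas-style computation; nothing runs until .compute()
    mean_by_group = df.groupby("category")["value"].mean()
    print(mean_by_group.compute())  # executes the task graph in parallel

    # The same idea works for NumPy-style arrays larger than RAM:
    x = da.random.random((100_000, 1_000), chunks=(10_000, 1_000))
    print(x.mean(axis=0).compute())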
