April 2026
Intermediate to advanced
412 pages
10h 17m
English
Getting notebooks into production is where most data work meets reality. Notebooks are great for exploration, but running them reliably at scale takes more than executing notebooks on a schedule. You're moving from experimentation to building workflows that won't break when real data hits them.
Lakeflow Jobs is the native orchestrator within Databricks, designed to handle everything from simple scheduled notebook runs to complex multi-task dependency graphs. Databricks recommends Lakeflow Jobs as your primary orchestrator for all task dependencies inside the platform. While you can integrate these encapsulated workflows into external orchestrators like Azure Data Factory ...
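To make the dependency-graph idea concrete, here is a minimal sketch using the Databricks SDK for Python (`databricks-sdk`), which wraps the Jobs API behind Lakeflow Jobs. The job name, notebook paths, and cron schedule are illustrative assumptions, not examples from the book, and compute configuration is omitted for brevity (on serverless workspaces, tasks without a cluster spec run on serverless compute).

```python
# A minimal sketch: a two-task Lakeflow job where "transform" depends on
# "ingest". Paths and names below are hypothetical placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up credentials from the environment or a config profile

created = w.jobs.create(
    name="nightly-ingest-and-transform",  # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(
                notebook_path="/Workspace/pipelines/ingest"  # hypothetical path
            ),
        ),
        jobs.Task(
            task_key="transform",
            # depends_on makes this task wait for "ingest" to succeed
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            notebook_task=jobs.NotebookTask(
                notebook_path="/Workspace/pipelines/transform"  # hypothetical path
            ),
        ),
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 2 * * ?",  # nightly at 02:00
        timezone_id="UTC",
    ),
)
print(f"Created job {created.job_id}")
```

The same dependency graph can equally be declared in YAML via Databricks Asset Bundles; the SDK form is shown here only because it keeps the example self-contained.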