Lakeflow Framework documentation

The Lakeflow Framework is a metadata-driven data engineering framework built for Databricks. It accelerates and simplifies the deployment of Spark Declarative Pipelines (SDP) while supporting your entire software development lifecycle.

Key Capabilities:

  • Build robust data pipelines using a configuration-driven, Lego-block approach (see the sketch after this list)

  • Support batch and streaming workloads across the medallion architecture (Bronze, Silver, Gold)

  • Deploy seamlessly with Databricks Asset Bundles (DABs): no wheel files or control tables required

  • Extend and maintain easily as your data platform evolves
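
To make the configuration-driven idea concrete, here is a minimal, hypothetical sketch; it is not the framework's actual API or configuration schema. It assumes table definitions (source paths, formats, target table names) live in metadata, and a short loop turns each entry into a Spark ingestion step for the Bronze layer.

```python
# Hypothetical illustration only: the real Lakeflow Framework config schema may differ.
# Shows the general metadata-driven pattern: pipeline steps are described as data,
# and a small driver loop turns each entry into a Spark read/write.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical metadata describing two Bronze ingestions; in practice this would
# typically live in a YAML/JSON file checked into the repo alongside the pipeline.
pipeline_config = [
    {"source_path": "/landing/orders", "format": "json", "target_table": "bronze.orders"},
    {"source_path": "/landing/customers", "format": "csv", "target_table": "bronze.customers"},
]

for entry in pipeline_config:
    # Each config entry becomes one ingestion step: read the raw files...
    df = spark.read.format(entry["format"]).load(entry["source_path"])
    # ...and append them to a Delta table in the Bronze layer.
    df.write.format("delta").mode("append").saveAsTable(entry["target_table"])
```

Adding a new source then means adding a configuration entry rather than writing new pipeline code, which is the "Lego-block" idea the framework builds on.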

This documentation covers everything from getting started to advanced orchestration patterns. Explore the sections below to begin building reliable, maintainable data pipelines.

Contents: