Getting Started
This section is a quick-start guide for data engineers getting started with the Lakeflow Framework.
Prerequisites
Databricks CLI installed and configured, if you plan to use Databricks Asset Bundles (DABs) to deploy the Lakeflow Framework and pipeline bundles from your local machine.
Access to a Databricks workspace.
VS Code installed.
Setup
Follow the steps below to set up your environment to learn and use the Lakeflow Framework:
Understanding the Framework
Step through and execute one of the basic samples, and inspect the create_dataflow_spec it works with.
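To build intuition for what you are inspecting, the sketch below models the kind of information a dataflow spec typically captures. This is a hypothetical illustration: the field names, values, and the helper function are placeholders, not the framework's actual create_dataflow_spec API.

```python
# Hypothetical sketch of a dataflow spec. Field names and values are
# illustrative placeholders, NOT the Lakeflow Framework's real schema.
from dataclasses import dataclass, field


@dataclass
class DataflowSpec:
    """Minimal model of what a dataflow spec might capture."""
    dataflow_id: str                # unique name for this flow
    source_format: str              # e.g. "cloudFiles" or "delta"
    source_path: str                # where the raw data lands
    target_table: str               # table the pipeline writes to
    quality_expectations: dict = field(default_factory=dict)


def example_dataflow_spec() -> DataflowSpec:
    # Illustrative only: a bronze ingestion flow reading raw JSON files.
    return DataflowSpec(
        dataflow_id="customers_bronze",
        source_format="cloudFiles",
        source_path="/Volumes/raw/customers/",
        target_table="bronze.customers",
        quality_expectations={"valid_id": "customer_id IS NOT NULL"},
    )
```

When you step through a basic sample, compare the spec it loads against a mental model like this: each spec ties a source, a target, and data-quality rules into one declarative unit that the framework turns into a pipeline.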
Developing your first Pipeline Bundle
Select the recommended pipeline pattern that best fits your use case, as documented in Data Flow and Pipeline Patterns.
Build and deploy a data pipeline bundle. Refer to Build and Deploy Pipelines.
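As a rough sketch of what a deployable bundle looks like, a Databricks Asset Bundle is defined by a databricks.yml file along the lines of the fragment below. The bundle name, notebook path, schema, and workspace host here are placeholders, and the exact resource settings your pipeline needs will come from the Build and Deploy Pipelines documentation.

```yaml
# Illustrative databricks.yml fragment; names, paths, and host are
# placeholders, not values from the Lakeflow Framework docs.
bundle:
  name: my_lakeflow_pipeline

resources:
  pipelines:
    my_pipeline:
      name: my_lakeflow_pipeline
      libraries:
        - notebook:
            path: ./src/pipeline_notebook.py
      target: my_schema

targets:
  dev:
    mode: development
    workspace:
      host: https://example.cloud.databricks.com
```

With a file like this in place, you would typically run `databricks bundle validate` to check the configuration and `databricks bundle deploy -t dev` to deploy it to the dev target.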