Deploying the Framework
Deploying From Your Local Machine
The steps below walk you through deploying the Lakeflow Framework to your Databricks workspace. They assume you have cloned the Lakeflow Framework repository and are working from its root directory.
Ensure you have the Databricks CLI installed and configured. If not, please refer to the Databricks CLI documentation.
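For example, a minimal sketch of checking and configuring the CLI (the configure flow prompts for your workspace URL and a personal access token):

```bash
# Confirm the CLI is installed and on your PATH
databricks --version

# Configure credentials interactively (prompts for workspace host and token)
databricks configure
```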
Ensure the correct Databricks workspace is set in the workspace host field of the databricks.yml file, or leave the host unset to use the default host configured on the profile used by the Databricks CLI (the CLI must be configured with credentials for this workspace). To set a host, the databricks.yml file should look like this:
```yaml
bundle:
  name: dlt_framework

include:
  - resources/*.yml

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<your-databricks-workspace-url>
```
Run the following command from the root directory to validate the Lakeflow Framework bundle:

```bash
databricks bundle validate
```

This command runs a series of checks to ensure the bundle is correctly set up and ready for deployment.
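If you have defined multiple targets, you can also validate a specific one; the -t flag takes a target name from databricks.yml (dev here matches the sample configuration above):

```bash
databricks bundle validate -t dev
```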
Run the following command to deploy the Lakeflow Framework to your Databricks workspace:

```bash
databricks bundle deploy
```

Once the deployment succeeds, the Lakeflow Framework bundle will appear in your Databricks workspace. To verify, go to your Databricks workspace and check that the bundle is present in the .bundle directory.
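Alternatively, assuming a recent CLI version that includes the bundle summary command, you can inspect the deployment from the terminal:

```bash
# Print the deployed resources and workspace paths for the current target
databricks bundle summary
```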
Note
The Databricks CLI deploys the bundle to the default target (usually dev) specified in the databricks.yml file. If you want to deploy the bundle to a different target, specify it with the -t option in the deploy command.
The Databricks CLI deploys using the default credentials. If you want to deploy using a different set of credentials, specify the profile with the -p option in the deploy command.
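For example, the sketch below combines both options; the prod target and my-prod-profile profile are hypothetical names standing in for entries in your databricks.yml and ~/.databrickscfg:

```bash
# Deploy to the "prod" target using the "my-prod-profile" CLI profile
databricks bundle deploy -t prod -p my-prod-profile
```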
Deploying via CI/CD
Please refer to the CI/CD documentation for more information on how to deploy the Lakeflow Framework samples using CI/CD.
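As a rough illustration only (not the official pipeline; see the CI/CD documentation for the supported setup), a minimal GitHub Actions workflow might look like the sketch below. The workflow path and the secret names DATABRICKS_HOST and DATABRICKS_TOKEN are assumptions about your repository configuration:

```yaml
# .github/workflows/deploy.yml (hypothetical path and secret names)
name: deploy-lakeflow-framework

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Install the Databricks CLI
      - uses: databricks/setup-cli@main

      # Validate, then deploy the bundle to the dev target
      - run: |
          databricks bundle validate
          databricks bundle deploy -t dev
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```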