Supported Target Types

Applies To: Pipeline Bundle

Configuration Scope: Data Flow Spec

Databricks Docs: https://docs.databricks.com/en/delta-live-tables/python-ref.html

The Lakeflow Framework supports multiple target types for pipeline output, described below.

Target Types

Delta Streaming Table

Description: Creates a streaming Delta table that continuously processes and updates data as it arrives.

Key Features:

  • Streaming write optimizations
  • Automatic schema evolution
  • Quality enforcement
  • Incremental processing
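
The Lakeflow Framework generates these targets from a Data Flow Spec, but under the hood a Delta streaming table corresponds to the DLT Python pattern below: a decorated function that returns a streaming DataFrame. This is a minimal sketch assuming an Auto Loader JSON source; the table name, source path, and expectation are illustrative assumptions, not part of the framework's spec.

    import dlt

    # NOTE: `spark` is provided automatically by the pipeline runtime.
    # Table name, source path, and expectation below are illustrative assumptions.
    @dlt.table(
        name="orders_bronze",
        comment="Incrementally ingested orders (illustrative example).",
    )
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # quality enforcement
    def orders_bronze():
        return (
            spark.readStream.format("cloudFiles")          # Auto Loader: incremental file ingestion
            .option("cloudFiles.format", "json")
            .option("cloudFiles.schemaEvolutionMode", "addNewColumns")  # schema evolution
            .load("/Volumes/main/raw/orders/")             # illustrative source path
        )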

Materialized Views

Description: A materialized view contains precomputed records based on its defining query. Materialized views are commonly used for transformations and Gold Layer tables.

Key Features:

  • Automatic updates based on pipeline schedule/triggers
  • Guaranteed consistency with source data
  • Incremental refresh optimization
  • Ideal for transformations and aggregations
  • Pre-computation of slow queries
  • Optimization for frequently used computations
  • Detailed configuration: see Materialized Views
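
For orientation, a materialized view in the underlying DLT Python API is the same decorator returning a batch DataFrame, which the runtime precomputes and keeps consistent with its sources. This is a minimal sketch; the dataset names and aggregation are assumptions, and framework-level options are covered on the Materialized Views page.

    import dlt
    from pyspark.sql import functions as F

    # A batch read (no readStream) makes this a materialized view: the result is
    # precomputed and refreshed on the pipeline's schedule or trigger.
    @dlt.table(
        name="daily_sales_gold",              # illustrative Gold Layer table name
        comment="Daily sales totals, precomputed for fast reads (illustrative example).",
    )
    def daily_sales_gold():
        return (
            dlt.read("orders_silver")          # illustrative upstream dataset in the same pipeline
            .groupBy("order_date")
            .agg(F.sum("amount").alias("total_amount"))
        )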

Delta Sink

Description: Streams records to a Delta table.
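
As a point of reference, the DLT sink API declares a sink and then appends records to it with a flow. The sketch below is an assumption-laden example rather than the framework's exact wiring; the sink name, path, and source dataset are placeholders, and sink availability depends on your pipeline runtime.

    import dlt

    # Declare a Delta sink; flows that target it append records to this location.
    dlt.create_sink(
        name="orders_delta_sink",                           # placeholder sink name
        format="delta",
        options={"path": "/Volumes/main/export/orders"},    # placeholder target path
    )

    # Stream records from an upstream pipeline dataset into the sink.
    @dlt.append_flow(name="orders_to_delta_sink", target="orders_delta_sink")
    def orders_to_delta_sink():
        return dlt.read_stream("orders_bronze")             # placeholder source dataset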

Kafka Sink

Description: Streams records to a Kafka topic.
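
The Kafka variant of the same sink pattern only changes the format and options; the broker address, topic, and source dataset below are placeholders.

    import dlt

    # Declare a Kafka sink; the flow below publishes records to the topic.
    dlt.create_sink(
        name="orders_kafka_sink",                        # placeholder sink name
        format="kafka",
        options={
            "kafka.bootstrap.servers": "broker-1:9092",  # placeholder broker address
            "topic": "orders_events",                    # placeholder topic
        },
    )

    @dlt.append_flow(name="orders_to_kafka_sink", target="orders_kafka_sink")
    def orders_to_kafka_sink():
        # Kafka writers expect a `value` column (and optionally `key`), so serialize rows to JSON.
        return dlt.read_stream("orders_bronze").selectExpr("to_json(struct(*)) AS value")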

Foreach Batch Sink

Description: Processes each micro-batch with custom logic, similar to the Spark Structured Streaming foreachBatch functionality. With the Foreach Batch sink, you can transform, merge, or write streaming data to one or more targets that do not natively support streaming writes.
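
Since the description maps onto Spark Structured Streaming's foreachBatch, the sketch below shows that underlying pattern: a per-batch function that performs a Delta MERGE, which a plain streaming write cannot express. Table names, the join key, and the checkpoint path are assumptions, and how the framework attaches this function is defined by the Data Flow Spec, not by this snippet.

    from pyspark.sql import DataFrame, SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    def upsert_orders(batch_df: DataFrame, batch_id: int) -> None:
        # Custom per-batch logic: MERGE the micro-batch into a Delta table.
        target = DeltaTable.forName(spark, "main.silver.orders")    # illustrative target table
        (
            target.alias("t")
            .merge(batch_df.alias("s"), "t.order_id = s.order_id")  # illustrative join key
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute()
        )

    # In plain Structured Streaming, the function is attached per micro-batch like this.
    (
        spark.readStream.table("main.bronze.orders")                 # illustrative streaming source
        .writeStream
        .foreachBatch(upsert_orders)
        .option("checkpointLocation", "/tmp/checkpoints/orders_upsert")  # placeholder path
        .start()
    )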

General Data Flow Spec Configuration

The target type is set as an attribute when creating your Data Flow Spec; refer to the Data Flow Spec Reference documentation for more information.
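
Purely as a hypothetical illustration of where the target type sits in a spec (the actual attribute names and accepted values come from the Data Flow Spec Reference, not from this sketch), a spec fragment might look something like:

    # Hypothetical Data Flow Spec fragment -- the keys below are placeholders,
    # not the framework's actual schema; consult the Data Flow Spec Reference.
    data_flow_spec = {
        "name": "orders_ingest",
        "target": {
            "type": "delta_streaming_table",   # one of the target types described above
            "name": "orders_bronze",
        },
    }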

Detailed Target Type Configuration