Databricks
Delta Lake tables with Unity Catalog support.
What it does
The Databricks destination writes data to Delta Lake tables with Unity Catalog integration. It supports automatic table creation, schema evolution, and multiple write modes. Data lands in your lakehouse ready for SQL analytics and ML workloads.
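To make the write modes concrete, here is a minimal Databricks SQL sketch of what the destination effectively produces in your lakehouse. The catalog, schema, table, and column names (main.analytics.orders, staging_orders) are assumptions for illustration, not names StreamFlows creates for you.

```sql
-- Assumed Unity Catalog three-level name: catalog.schema.table
CREATE TABLE IF NOT EXISTS main.analytics.orders (
  order_id   BIGINT,
  amount     DECIMAL(10, 2),
  updated_at TIMESTAMP
) USING DELTA;

-- Append-style write: new rows are added to the Delta table.
INSERT INTO main.analytics.orders
VALUES (1001, 49.99, current_timestamp());

-- Upsert-style write: rows from a staging table are merged by key.
MERGE INTO main.analytics.orders AS t
USING staging_orders AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```

Schema evolution works similarly: when a source adds a column, the destination can widen the Delta table's schema (the equivalent of Delta's mergeSchema behavior) instead of failing the sync.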
Why connect Databricks to StreamFlows
Databricks unifies data engineering, analytics, and machine learning on a single platform. StreamFlows loads your source data into Delta Lake tables so your data team can use SQL, Python, or Spark without building custom ingestion pipelines.
How Databricks fits in your pipeline
Sources → StreamFlows (Extract → Schedule → Checkpoint → Load) → Destinations
Get started with Databricks
Create a free account, connect your Databricks warehouse, and start syncing in minutes.