About This Architecture
Azure Data Factory orchestrates a three-stage Databricks ETL pipeline with dedicated job clusters for audit, processing, and validation workflows. It triggers the Audit Start, ETL Processing, and Audit End notebooks sequentially, each on an ephemeral job cluster that spins up, executes, and terminates to minimize cost. Data flows from ADLS Gen2 through Databricks transformations into Snowflake, with Azure Monitor tracking pipeline health and Azure Key Vault securing credentials. Fork this diagram to customize cluster configurations or notebook logic, or to add data sources and sinks for your enterprise data platform.
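The sequential notebook orchestration described above can be sketched as an ADF pipeline definition. This is a minimal, hedged example: the pipeline name, activity names, notebook paths, and the `AzureDatabricksJobCluster` linked-service name are all placeholders, not part of the original diagram.

```json
{
  "name": "DatabricksETLPipeline",
  "properties": {
    "activities": [
      {
        "name": "AuditStart",
        "type": "DatabricksNotebook",
        "typeProperties": { "notebookPath": "/ETL/audit_start" },
        "linkedServiceName": {
          "referenceName": "AzureDatabricksJobCluster",
          "type": "LinkedServiceReference"
        }
      },
      {
        "name": "ETLProcessing",
        "type": "DatabricksNotebook",
        "dependsOn": [
          { "activity": "AuditStart", "dependencyConditions": ["Succeeded"] }
        ],
        "typeProperties": { "notebookPath": "/ETL/process" },
        "linkedServiceName": {
          "referenceName": "AzureDatabricksJobCluster",
          "type": "LinkedServiceReference"
        }
      },
      {
        "name": "AuditEnd",
        "type": "DatabricksNotebook",
        "dependsOn": [
          { "activity": "ETLProcessing", "dependencyConditions": ["Succeeded"] }
        ],
        "typeProperties": { "notebookPath": "/ETL/audit_end" },
        "linkedServiceName": {
          "referenceName": "AzureDatabricksJobCluster",
          "type": "LinkedServiceReference"
        }
      }
    ]
  }
}
```

The `dependsOn` entries with a `Succeeded` condition enforce the sequential ordering. The ephemeral job-cluster behavior would come from the Databricks linked service being configured with new-cluster settings (node type, worker count, runtime version) rather than an existing interactive cluster, so each activity run provisions its own cluster and tears it down on completion.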