Azure Databricks — SKU Constellation Map

What a CDO or Head of Data needs to know about Azure Databricks as a SKU — the DBU + Azure VM dual-cost model, Unity Catalog as the governance plane, Photon adoption decisions, and the Fabric/Snowflake positioning.

Compass
  • Business: persona, use case, outcome
  • Capability: what the org needs to do
  • Technology: the technology choices
Guided journey · Step 1 of 4

Azure Databricks — SKU Anchor

Model DBU cost against forecast workloads: Photon, tier, and workload class drive 5x swings in DBU rates. Bring the commercial team in early to negotiate DBCU (Databricks Commit Unit) commitments.

~ 3 weeks


Narrative intro

Azure Databricks is the canonical lakehouse — Delta Lake storage, Spark compute, Unity Catalog governance, Photon for SQL, MLflow and Mosaic AI for the ML lifecycle. The dual-cost model (DBU on top of Azure VM) is the procurement story; Unity Catalog is the operating story. This map walks both — the commercial shape and the platform shape — for buyers evaluating Databricks against Microsoft Fabric and Snowflake.

Key takeaways

  • DBU + Azure VM dual-cost model — model carefully, then negotiate DBCU commitments
  • Unity Catalog is the governance plane from day one, not a later upgrade
  • Photon is a per-workload decision — SQL usually wins, ML/engineering less so
  • Mosaic AI is the GenAI surface where the data is already governed by Unity Catalog
  • Fabric and Snowflake are alternative anchors; Databricks competes on lakehouse maturity

Programme shape

Estimated duration
10–24 weeks
Estimated FTE
1 FTE data engineering lead + part-time governance and ML SMEs
Spend tier
major
Risk level
elevated

DBU pricing complexity is the largest single procurement risk. Photon, tier, and workload combinations swing DBU rates 5x — modelling is mandatory before commitment.
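The dual-cost arithmetic can be sketched in a few lines: total cluster cost per hour is the Azure VM cost plus DBUs consumed times the DBU rate, where the rate varies by workload class and Photon. All figures below are placeholder assumptions for illustration, not published Databricks or Azure list prices; substitute current pricing before any commercial modelling.

```python
# Illustrative Azure Databricks dual-cost model:
#   cluster $/hr = nodes x (Azure VM $/hr + DBUs emitted per node-hour x DBU rate)
# Every number here is an assumed placeholder, chosen only to show the shape
# of the ~5x rate spread across workload class and Photon combinations.

VM_COST_PER_HOUR = 0.80      # assumed $/hr for a mid-size worker VM
DBUS_PER_NODE_HOUR = 2.0     # assumed DBU emission rate for that VM size

# Assumed DBU $ rates keyed by (workload class, photon enabled)
DBU_RATES = {
    ("jobs", False): 0.15,
    ("jobs", True): 0.30,
    ("all_purpose", False): 0.55,
    ("sql_serverless", True): 0.70,
}

def hourly_cost(nodes: int, workload: str, photon: bool) -> float:
    """Total cluster $/hr under the dual-cost model sketched above."""
    rate = DBU_RATES[(workload, photon)]
    return nodes * (VM_COST_PER_HOUR + DBUS_PER_NODE_HOUR * rate)

# The same 8-node cluster swings widely with workload class and Photon:
for (workload, photon), _ in sorted(DBU_RATES.items()):
    print(f"{workload:>14} photon={photon}: ${hourly_cost(8, workload, photon):.2f}/hr")
```

Running the sketch against a forecast workload mix, rather than a single cluster, is what surfaces the modelling risk called out above: the VM line item is stable, but the DBU line item moves with every tier and workload decision.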
