Modern Data Platform for CDOs

A CDO's view of the strategic data platform decision — Fabric vs Databricks vs Snowflake, the foundations underneath, the governance overlay, and the operating model that makes the investment compound. The map that anchors enterprise data strategy conversations.

Compass
  • Business: persona, use case, outcome
  • Capability: what the org needs to do
  • Technology: the technology choices
  • Source: where the evidence sits

Platform Selection Strategy

Start with platform selection. Microsoft Fabric, Azure Databricks, and Snowflake are all credible. The decision shapes every later pillar: cost shape, governance model, talent fit, AI readiness. Run the structured comparison; don't pick from a feature checklist.

~ 12 weeks


Narrative intro

The defining data platform decision in 2026 is not whether to invest — every enterprise has multiple data platforms already, often three or four through accretion. The question is whether to consolidate, on what, and what role each major vendor plays in the strategic three-year picture. Microsoft Fabric is the integrated SaaS answer; Databricks is the lakehouse and ML answer; Snowflake is the multi-cloud SQL warehouse answer with the strongest data-sharing story. None is obviously the right choice for every organisation.

The hard part of the data platform decision isn't technology selection — features are roughly comparable and converging. It's the operating model that comes with each: Fabric's capacity-based pricing favours predictable workloads but punishes spiky ones; Databricks DBU pricing is the most opaque and most flexible; Snowflake's separation of compute and storage is the most transparent and the most exposed to warehouse-sizing discipline. The platform decision is, in practice, a finance and operating-model decision wearing a technology brief.

This briefing covers the four pillars a CDO needs credible answers on before committing to a multi-year data platform programme: platform selection strategy, data foundations, enterprise data governance, and platform operations. The featured SKUs are the candidate platforms and their substrate; the operating-model investment is what determines whether the platform delivers business outcomes or just licence spend.

Key takeaways

  • Platform selection is reversible, but at higher cost than most CDOs assume. Run the structured comparison up front; it pays back many times.
  • Cost shape differs more than features. Fabric capacity favours predictability; Databricks DBU favours flexibility with opacity; Snowflake credits favour transparency with sizing discipline.
  • Open table formats (Delta, Iceberg) decouple storage from compute — they're the structural protection against platform-level lock-in.
  • Governance federation is usually the right model at enterprise scale — Purview cross-platform plus Unity Catalog plus Horizon, not one or the other.
  • Operating-model failures kill more data platform programmes than technology failures. Treat the platform team as a product function, not a ticket queue.
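The cost-shape point is easier to see with numbers. The sketch below is an illustrative model only — every rate, tier, and workload figure is hypothetical, invented to show how the three pricing shapes respond to steady versus spiky demand, not to reflect actual vendor price lists.

```python
# Illustrative cost-shape model. All rates and workload numbers are
# hypothetical; only the *shape* of each pricing model is the point.

def capacity_cost(daily_units, unit_capacity=100, day_rate=500):
    # Fabric-style capacity: you reserve capacity sized to the peak day
    # and pay for it every day, used or not.
    peak = max(daily_units)
    capacities_needed = -(-peak // unit_capacity)  # ceiling division
    return capacities_needed * day_rate * len(daily_units)

def consumption_cost(daily_units, rate_per_unit=6.0):
    # Databricks-DBU-style consumption: pure pay-per-use. Flexible, but
    # spend tracks every spike and is harder to forecast.
    return sum(daily_units) * rate_per_unit

def warehouse_cost(daily_units,
                   sizes=(50, 100, 200, 400),
                   day_rates=(300, 550, 1000, 1800)):
    # Snowflake-style credits: each day you run the smallest warehouse
    # tier that covers the load; cost steps up with the tier needed.
    total = 0
    for units in daily_units:
        for size, rate in zip(sizes, day_rates):
            if units <= size:
                total += rate
                break
        else:
            total += day_rates[-1]  # above the largest tier
    return total

steady = [90] * 30                    # flat 90 units every day
spiky = [40] * 27 + [350, 380, 360]   # quiet month, three peak days

for name, wl in [("steady", steady), ("spiky", spiky)]:
    print(name,
          "capacity:", capacity_cost(wl),
          "consumption:", consumption_cost(wl),
          "warehouse:", warehouse_cost(wl))
```

Under these made-up numbers the steady workload is cheapest on capacity pricing, while the spiky workload makes capacity pricing roughly four times more expensive than consumption pricing, because the reserved capacity must be sized to three peak days. That is the structural argument for modelling your own demand curve before choosing a platform, rather than comparing list prices.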

Programme shape

Estimated duration
26–78 weeks
Estimated FTE
CDO + data platform lead, data engineering team (4–10 FTE depending on scale), governance lead, FinOps partner, ML platform lead. Enterprise data platforms are persistent functions; the FTE shape is the standing team, not a project budget.
Spend tier
significant
Risk level
elevated

Data platform programmes are multi-year arcs; the 26–78 week range covers the first major phase (selection through governance in production). Most fail not from technology but from operating-model gaps: data engineering treated as ticket-takers, no FinOps cadence, ML platform built in parallel rather than on top. Vendor selection is reversible at higher cost than most assume — a structured comparison up front pays back many times.
