Your metrics are only as reliable as your nouns. If “customer,” “order,” or “revenue” shift between teams or tools, analytics becomes negotiation instead of decision. The way out is to put meaning first—an ontological layer that anchors everything—and then let the semantic layer deliver that meaning at speed and scale.
Data leaders often sprint to the semantic layer because it pays quick dividends: friendly models, consistent measures, fast queries. But without an ontological foundation, a semantic layer becomes a catalog of convenient approximations. This piece re‑centers the ontological layer as the source of truth for concepts and relationships, shows how it complements the semantic layer, and traces the market’s evolution from SAP’s early ideas to Microsoft’s Semantic Models and today’s platform‑native semantics from Databricks and Snowflake. Tomorrow, we’ll look at what Microsoft’s new Fabric IQ Ontology (preview) adds to the picture.
Ontology first: the layer that defines your world
An ontological layer is a formal specification of the concepts in your domain and how they relate. It names things (Customer, Contract, Product), defines relationships (owns, contains, fulfills), encodes constraints (every Order has a Buyer), and enables reasoning (if A is part of B and B is part of C, then A is part of C). Whether you express it with OWL/RDF, a property graph, or a well‑governed conceptual model, the purpose is the same: stable identity and shared meaning that survive new systems, new sources, and new questions.
Three properties make an ontology durable: identity and reference across systems, rich relationships beyond joins, and inference with constraints machines can enforce. If the ontological layer is your dictionary and grammar, everything else—pipelines, marts, dashboards, agents—should write sentences that conform to it. That’s the foundation modern AI agents need to act on data safely.
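To make the inference and constraint ideas concrete, here is a minimal sketch in plain Python. It is illustrative only: the triples, predicate names (part_of, has_buyer), and helper functions are hypothetical examples, not any OWL/RDF or vendor API.

```python
from itertools import product

# Relationships as (subject, predicate, object) triples, RDF-style.
triples = {
    ("LineItem", "part_of", "Order"),
    ("Order", "part_of", "Contract"),
    ("Order", "has_buyer", "Customer"),
}

def infer_transitive(facts, predicate):
    """Transitive closure of one predicate:
    if A part_of B and B part_of C, then A part_of C."""
    closure = set(facts)
    changed = True
    while changed:
        changed = False
        edges = [(s, o) for s, p, o in closure if p == predicate]
        for (s1, o1), (s2, o2) in product(edges, edges):
            if o1 == s2 and (s1, predicate, o2) not in closure:
                closure.add((s1, predicate, o2))
                changed = True
    return closure

def satisfies(facts, subject, predicate):
    """Constraint check: `subject` participates in at least one
    `predicate` triple (e.g., every Order has a Buyer)."""
    return any(s == subject and p == predicate for s, p, _ in facts)

closure = infer_transitive(triples, "part_of")
# "LineItem part_of Contract" was never stated; the machine derived it.
print(("LineItem", "part_of", "Contract") in closure)
print(satisfies(triples, "Order", "has_buyer"))
```

A real deployment would use an ontology language or a property graph, but the point survives the toy scale: facts you never wrote down become queryable, and constraints become checks a machine can enforce.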
The semantic layer: operationalizing meaning
The semantic layer is where analytics and applications consume that meaning: business‑friendly entities, reusable metrics, slicers, security, and performant access paths. It maps the ontology’s nouns and relationships to concrete tables and logic, making “Gross Margin” or “Active Customer” executable and consistent in BI, SQL, notebooks, and AI assistants. In Microsoft Fabric, this shows up as Semantic Models (the renamed “datasets”), which can run in Direct Lake for import‑like performance and surface the same definitions to code via Semantic Link. Put simply: the ontology defines what is true; the semantic layer ensures the truth is used the same way everywhere.
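The "defined once, consumed everywhere" idea can be sketched as a toy metric registry. Everything here is an illustrative assumption: the metric names, the row schema, and the `evaluate` entry point stand in for what a real semantic layer (Semantic Models, Metric Views, Semantic Views) does with far more machinery.

```python
# Toy semantic layer: each business metric has exactly one executable
# definition, and every consumer (dashboard, notebook, AI assistant)
# routes through the same entry point.

orders = [
    {"revenue": 120.0, "cost": 70.0, "status": "active"},
    {"revenue": 80.0,  "cost": 50.0, "status": "churned"},
]

METRICS = {
    # "Gross Margin" is written down once, not re-derived per tool.
    "gross_margin": lambda rows: (
        sum(r["revenue"] - r["cost"] for r in rows)
        / sum(r["revenue"] for r in rows)
    ),
    "active_customers": lambda rows: sum(
        1 for r in rows if r["status"] == "active"
    ),
}

def evaluate(metric_name, rows):
    """Single entry point shared by every consumer of the semantics."""
    return METRICS[metric_name](rows)

print(evaluate("gross_margin", orders))      # 0.4
print(evaluate("active_customers", orders))  # 1
```

The design choice worth noticing is that consumers hold only a metric name, never the formula; when Finance refines "Gross Margin," every dashboard and notebook picks up the change together.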
A short history—seen through an ontological lens
SAP’s early beginnings. Business Objects (later acquired by SAP) popularized the idea that users should query with business terms, not SQL. Universes—and later the BI Semantic Layer—gave organizations a governed vocabulary and mapping to physical data. That mental model persists: a curated layer of meaning between users and source systems.

Microsoft’s reframing. Analysis Services made semantics a first‑class citizen; the Fabric era sharpened the picture by recasting Power BI “datasets” as Semantic Models and threading them through the platform. Direct Lake brings import‑like performance from Delta tables in OneLake, and Semantic Link lets notebooks respect the same business semantics—no ad‑hoc rewrites.
Databricks and Snowflake make semantics native. Databricks treats metrics and dimensions as governed assets with Unity Catalog Metric Views—defined once, queried everywhere (SQL, dashboards, Genie). Snowflake’s Semantic Views store entities, relationships, dimensions, and metrics as schema‑level objects, powering Cortex Analyst and plain SQL alike. The throughline: semantics live beside data under governance.
Interchange emerges. With the Open Semantic Interchange (OSI) initiative—and dbt Labs open‑sourcing MetricFlow—vendors are converging on a portable way to move semantic metadata across stacks. That makes a well‑formed ontology even more valuable: it becomes the stable contract multiple semantic layers can honor.
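The "stable contract" claim can be illustrated with one metric definition rendered for two different targets. To be clear, the dictionary format below is invented for this sketch; it is not the actual OSI or MetricFlow schema, and the renderers are hypothetical.

```python
# One ontology-anchored metric definition, serialized as platform-neutral
# metadata, then rendered for two different hypothetical backends.

metric = {
    "name": "gross_margin",
    "entity": "Order",  # the ontology concept this metric hangs off
    "expr": "SUM(revenue - cost) / SUM(revenue)",
    "description": "Margin as a share of revenue.",
}

def to_view_sql(m, table):
    """Render the portable definition as a plain SQL view (one target)."""
    return (
        f"CREATE VIEW {m['name']}_v AS "
        f"SELECT {m['expr']} AS {m['name']} FROM {table};"
    )

def to_config_text(m):
    """Render the same definition as config-style metadata (another target)."""
    return "\n".join(f"{k}: {v}" for k, v in m.items())

print(to_view_sql(metric, "orders"))
print(to_config_text(metric))
```

The definition is authored once against the ontology concept (`Order`), and each platform adapter is just a projection of that single source of meaning.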
How to put the ontological layer to work
Start small, but start explicit. Name your core concepts and relationships independent of any schema; write competency questions your ontology should answer; pick a representation you can govern; then map ontology → semantic constructs in the platforms you run:
- In Microsoft Fabric, bind ontology concepts to Semantic Models, prefer Direct Lake where feasible, and expose the same semantics to notebooks via Semantic Link so BI and code agree.
- In Databricks, register metrics and dimensions as Metric Views in Unity Catalog to serve SQL, dashboards, and agentic experiences.
- In Snowflake, model entities and metrics as Semantic Views and feed them to SQL consumers and Cortex Analyst.
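The mapping step above implies a governance check you can automate: every semantic artifact, on any platform, should bind back to a named ontology concept. A minimal sketch, with invented artifact names and a deliberately drifted binding:

```python
# Governance check: flag semantic artifacts whose binding does not
# resolve to a concept in the ontology. All names are illustrative.

ontology_concepts = {"Customer", "Order", "Product", "Contract"}

semantic_artifacts = [
    {"platform": "fabric",     "artifact": "SalesModel",     "binds_to": "Order"},
    {"platform": "databricks", "artifact": "margin_metrics", "binds_to": "Order"},
    {"platform": "snowflake",  "artifact": "customer_360",   "binds_to": "Customer"},
    {"platform": "fabric",     "artifact": "legacy_report",  "binds_to": "Cust"},  # drifted name
]

def unbound(artifacts, concepts):
    """Return artifacts whose binding is not an ontology concept."""
    return [a["artifact"] for a in artifacts if a["binds_to"] not in concepts]

print(unbound(semantic_artifacts, ontology_concepts))  # ['legacy_report']
```

Run as part of CI or a catalog scan, a check like this turns "bind every semantic artifact to the ontology" from a slogan into an enforceable rule.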
The payoff is practical: fewer definition debates, faster onboarding, clearer lineage, and AI assistants that cite the same meaning humans use.
Summary
We started with the problem—metrics without stable nouns—and argued for an ontology‑first posture. The ontological layer defines what’s true; the semantic layer ensures that truth is delivered consistently. From Business Objects’ Universes to Microsoft’s Semantic Models and the native semantics in Databricks and Snowflake, the industry is meeting in the middle. With OSI’s momentum, business meaning is becoming portable. Your next move: write down your enterprise concepts, make the relationships explicit, and bind every semantic artifact back to that source of meaning.
Tomorrow—Fabric IQ Ontologies, in practice
Next up, we’ll dive into Fabric IQ’s Ontology (preview): what an Ontology item is in Fabric IQ, how it digitally represents enterprise vocabulary across OneLake sources, how to generate an ontology from an existing Semantic Model, and how Fabric’s emerging agents use that ontology to reason and act. We’ll walk through entity types, relationships, binding to live data, and what the preview experience offers today. Stay tuned for a hands‑on look at how IQ turns ontology into a working, governed knowledge graph for BI and AI.