When teams build warehouses the old way—source → bronze → silver → gold → semantic—visualization and semantic specialists are invited in at the end. Their job looks reactive: wire up a few visuals, name some measures, make it load fast enough. They inherit whatever the pipeline produced, then try to make meaning out of it. The failure mode is predictable: pixel‑perfect charts sitting on semantic quicksand, with definitions that shift underfoot and performance that depends on structures no one designed for the questions at hand.
Flip the sequence to Gold → Silver → Bronze → Ingestion, and the center of gravity moves. The product—expressed as a semantic contract—is defined first. In Fabric, that contract is not a veneer; it’s the spine. Direct Lake brings OneLake Delta tables straight into the model; Materialized Lake Views make silver transformations declarative in the lake; Eventhouse (as part of Real‑Time Intelligence) lands and analyzes streams while also publishing them to OneLake for the same model to consume. In that world, the people who shape the semantic layer stop being “report writers” or “data visualization engineers.” They become data product engineers who lead the build toward a specific, testable outcome.
What changes when the contract comes first
The semantic model becomes the API, not a by‑product.
A data product engineer starts by pinning down the business contract: grain, conformed dimensions, measure logic, row/object‑level security, and freshness SLOs. They treat that model as a public interface that other surfaces reuse—Power BI reports, notebooks via Semantic Link, model‑driven exports, even alerting logic. The language of the job shifts from “Which chart?” to “What is the authoritative definition of AUM, and how do we guarantee it across batch and stream?”
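To make that concrete, here is a hypothetical sketch of writing the contract down as data so it can be versioned, diffed, and tested. Every name here (`SemanticContract`, `freshness_slo_minutes`, and so on) is illustrative, not a Fabric API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Measure:
    name: str
    description: str
    expression: str  # e.g. DAX, kept as text for review and diffing

@dataclass(frozen=True)
class SemanticContract:
    grain: tuple                 # conformed grain, e.g. ("household", "date")
    dimensions: tuple
    measures: tuple
    rls_roles: tuple             # row-level security boundaries
    freshness_slo_minutes: int   # max staleness the product promises

    def measure_names(self):
        return {m.name for m in self.measures}

# A made-up contract for illustration
contract = SemanticContract(
    grain=("household", "date"),
    dimensions=("Advisor", "Branch", "Instrument"),
    measures=(Measure("AUM", "Assets under management at close",
                      "SUM(Positions[MarketValue])"),),
    rls_roles=("Advisor", "BranchManager"),
    freshness_slo_minutes=15,
)
```

A contract expressed this way can live in source control next to the deployment pipeline, which is what makes it testable rather than tribal.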
Performance is engineered, not hoped for.
Because Direct Lake eliminates an import hop for large facts, modelers must be comfortable pushing performance concerns down into Delta layout: partitioning strategies, file sizes, and how Materialized Lake Views precompute heavy joins or aggregations. They still write DAX, but they also think like storage and query engineers—owning “time‑to‑first‑answer” as part of the contract.
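The arithmetic behind that ownership is simple but worth doing explicitly. A minimal sketch, assuming an illustrative 256 MiB target file size (a heuristic, not Fabric guidance):

```python
# Back-of-envelope helper for Delta file layout: how many files should a
# compaction job aim for, given the table size? The 256 MiB default is an
# illustrative assumption, not a documented Fabric recommendation.
def target_file_count(table_bytes: int,
                      target_file_bytes: int = 256 * 1024 * 1024) -> int:
    """Ceiling division: enough files to hold the table at the target size."""
    return max(1, -(-table_bytes // target_file_bytes))

# A 10 GiB fact table at a 256 MiB target comes out to 40 files.
files = target_file_count(10 * 1024**3)
```

Owning numbers like this—rather than leaving them to whatever the writer produced—is what “thinking like a storage engineer” means in practice.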
Silver is a precision tool, not a pilgrimage.
With the product defined, silver exists only to enforce it. Sometimes that’s a Warehouse view or procedure (full T‑SQL semantics over Delta); other times it’s a Lakehouse transform and a materialized lake view to make the shape repeatable and cheap. The data product engineer decides where logic belongs based on consumption and governance, not habit.
Bronze is intentionally small—and includes events.
Bronze is where shortcuts and mirrors land, nothing more: just the slices the product demands. For real‑time signals, Eventhouse is streaming bronze and analytics engine in one. Eventstream brings events into Eventhouse; OneLake Availability projects those same events into Delta in OneLake so the gold model can read them like any other table. One contract, two speeds.
Quality and change management live at the semantic edge.
Owning the contract means owning its tests: reconciling measures against known financial math, proving RLS boundaries, setting performance budgets, and instrumenting usage. When a definition must change, the data product engineer runs a semantic change protocol—versioning, deprecation horizons, and communication—so downstream consumers aren’t surprised.
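One of those tests, reconciliation against known financial math, can be as plain as this sketch (all figures are made up; the pattern is the point):

```python
# Reconciliation check: the model's reported AUM must match independently
# computed math within a tolerance, or the build fails.
def reconcile(measure_value: float, reference_value: float,
              tolerance: float = 0.01) -> bool:
    """True when the semantic-layer number matches the reference math."""
    return abs(measure_value - reference_value) <= tolerance

positions = [1_250_000.00, 980_500.25, 2_014_300.75]  # hypothetical market values
reference_aum = sum(positions)                        # 4,244,801.00
model_aum = 4_244_801.00                              # what the model reported
ok = reconcile(model_aum, reference_aum)
```

Run on every deployment, a check like this turns “the numbers look right” into a gate rather than an opinion.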
What the job actually looks like now
Upstream, you’re opinionated.
Rather than accepting whatever bronze happens to have, you tell data engineering what to land—which operational tables to mirror, which external stores to shortcut, and which event types must arrive in Eventhouse. Ingest only what the contract needs. If a new KPI appears, you ask, “Where does it live in the model?” before anyone writes a pipeline.
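That upstream opinion can even be mechanized: diff the contract’s required sources against what bronze already lands, and the gap is the ingestion request. Source names below are hypothetical:

```python
# Sketch of "tell data engineering what to land": the contract's required
# sources minus what bronze already has is the backlog.
def ingestion_gaps(required: set, landed: set) -> set:
    """Sources the contract needs that no one has mirrored or shortcut yet."""
    return required - landed

required = {"custody.positions", "custody.transactions", "crm.households",
            "market.fx_rates", "events.trades"}
landed = {"custody.positions", "crm.households"}
missing = ingestion_gaps(required, landed)
```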
At silver, you wield the right tool.
If a KPI’s math is expensive and reused, you author it as a Materialized Lake View; if it needs transactional T‑SQL semantics, you place it in Warehouse. You prefer reusable surfaces (views, MLVs) over one‑off extracts. You care that everything behind the model is explainable, redeployable, and observable.
Across batch and stream, you demand one truth.
The same dimensions and measures govern daily reports and intraday tiles. Eventhouse materialized views can summarize live windows; OneLake Availability drops the raw and/or summarized Delta alongside mirrored tables; the gold model points to both without duplicating logic. “Why does the real‑time drift disagree with the daily drift?” stops being a weekly meeting.
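The mechanism is a single definition applied to both inputs. A hedged sketch with hypothetical allocation figures:

```python
# One definition, two speeds: the same drift calculation runs against the
# daily snapshot and the intraday window, so the numbers can only disagree
# if the inputs do.
def drift(current_weight: float, target_weight: float) -> float:
    """Portfolio drift as the signed deviation from the target allocation."""
    return round(current_weight - target_weight, 6)

target_equity = 0.60
daily_equity = 0.63       # from the batch fact, as of last close
intraday_equity = 0.65    # from the Eventhouse-fed Delta table

daily_drift = drift(daily_equity, target_equity)        # 0.03
intraday_drift = drift(intraday_equity, target_equity)  # 0.05
```

When both tiles call the same measure, a discrepancy is a data-freshness question, not a definitional one.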
Downstream, “reporting” is only one surface.
The model serves BI, but it also feeds notebooks via Semantic Link, automation flows, and export jobs. You defend measure names and descriptions like an API steward would, because they are an API. You track adoption and decision‑lift as product metrics, not just page views.
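API stewardship can be enforced, not just encouraged. A minimal lint sketch—the naming convention and abbreviation list are illustrative choices, not a standard:

```python
# Treat measure names as a public API surface: fail the build when a measure
# lacks a description or uses a banned abbreviation.
BANNED_ABBREVIATIONS = {"amt", "qty", "pct"}

def lint_measure(name: str, description: str) -> list:
    problems = []
    if not description.strip():
        problems.append(f"{name}: missing description")
    if any(tok.lower() in BANNED_ABBREVIATIONS for tok in name.split()):
        problems.append(f"{name}: avoid abbreviations in public measure names")
    return problems

issues = lint_measure("AUM", "Assets under management at close") \
       + lint_measure("Txn Amt", "")
```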
A brief vignette from wealth management
Consider the Household Performance & Risk product. As a data product engineer, you start by locking the contract: household as the conformed grain; daily positions and transactions as facts; measures for AUM, compliant TWR/IRR, beta, and volatility; row‑level security by advisor/branch; and SLOs of daily data ready by 9:00 AM local on T+0, with intraday changes reflected within 15 minutes.
From there, you choose surfaces to honor the promise:
- Big facts ride Direct Lake; small, slowly changing dimensions can be Import within the same composite model.
- Daily position shaping, FX enrichment, and calendar joins become Materialized Lake Views so the star‑ready facts are always there at 6:00 AM.
- For intraday drift and cash alerts, Eventstream → Eventhouse captures trades and balances; Eventhouse computes rolling windows; OneLake Availability publishes those events to Delta so the same gold model exposes live tiles and daily pages with matching definitions.
- Bronze remains thin: a few mirrored custodial tables, shortcuts to market data and CRM, and event feeds—no museum of raw copies.
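The TWR measure in that contract is textbook geometric linking of sub-period returns, which is easy to pin down in a reference implementation for reconciliation. A minimal sketch with hypothetical figures (this is standard TWR math, not a claim about any specific compliance standard):

```python
# Time-weighted return: sub-period returns split at cash-flow dates are
# geometrically linked so external flows don't distort performance.
def twr(sub_period_returns: list) -> float:
    """Link sub-period returns: (1+r1)(1+r2)...(1+rn) - 1."""
    total = 1.0
    for r in sub_period_returns:
        total *= (1.0 + r)
    return round(total - 1.0, 6)

# Three sub-periods split at cash-flow dates (made-up figures):
period_returns = [0.02, -0.01, 0.015]
household_twr = twr(period_returns)
```

A reference function like this is what the semantic-edge tests reconcile the model’s DAX against.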
The “report” is just one manifestation. What you really shipped is a governed answer, testable and reusable.
How leadership shows up
This role leads by defining what “done” means and defending it throughout the stack. “Done” is not a dashboard; it’s a contract met: the right numbers at the promised grain and latency, repeatable under load, secure by design, and reused across surfaces. That clarity eliminates a ton of thrash. Data engineers stop guessing what to ingest. Domain partners stop debating the meaning of KPIs after go‑live. SREs know what to monitor. Executives get the same answer everywhere because there is only one answer.
Why organizations should embrace the title
Calling these practitioners “report writers” or even “data visualization engineers” is like calling a chip designer a PowerPoint artist because they draw block diagrams. In Fabric’s upside‑down design, the semantic layer is the factory floor where answers are manufactured. The people who run it are data product engineers—they carry product vision, engineer performance, shape ingestion, bridge batch and real‑time, and own semantic change over time. Promote them to the front of the conversation, and your warehouse stops moving data around and starts delivering outcomes.