SAP Business Data Cloud Connect for Microsoft Fabric: The New Backbone of Your Data‑Product Strategy

SAP and Microsoft have just taken away one of the biggest excuses for slow analytics and AI on SAP: “We can’t move that data safely or reliably enough.”

At Microsoft Ignite 2025, they announced SAP Business Data Cloud (BDC) Connect for Microsoft Fabric—a new capability that lets you share SAP Business Data Cloud data products and Microsoft Fabric data sets bi‑directionally, with zero‑copy semantics, and have those products show up natively in OneLake and back in BDC.

Planned for general availability in Q3 2026, this isn’t “yet another connector.” It’s the missing link between SAP’s data‑product‑centric Business Data Cloud and Microsoft’s Fabric platform. It’s also where SAP Databricks, Azure Databricks, and Fabric line up as peers rather than competitors.

What SAP BDC Connect for Microsoft Fabric actually is

SAP’s own description is straightforward: BDC Connect for Microsoft Fabric gives organizations secure, rapid access to semantically rich SAP data products inside Fabric without the delays and risk of replication, and lets Fabric‑origin data sets flow back into BDC to enrich SAP intelligent applications.

A few key characteristics matter at executive level:

  • It is bi‑directional: SAP BDC data products appear in OneLake, and selected OneLake data sets can be pushed back into BDC.
  • It is zero‑copy by design: the integration uses a shared data fabric approach (building on the same zero‑copy patterns SAP is already using with Databricks, Snowflake, and BigQuery) rather than constant ETL replication.
  • It is data‑product‑aware: what flows into Fabric are SAP BDC data products—curated, semantically rich bundles aligned to SAP’s business data fabric, not arbitrary raw tables.
  • It is explicitly targeted at advanced analytics and AI: the announcement ties BDC Connect for Fabric to Fabric’s engineering, warehousing, Power BI, Fabric data agents, and AI Foundry, and highlights multi‑agent scenarios across M365 Copilot and SAP Joule.

Microsoft’s Azure data blog makes the positioning just as clear from the other side: OneLake is the central hub, and BDC Connect for Fabric is a first‑class, zero‑copy, bi‑directional integration for SAP—announced alongside similar zero‑copy moves with Salesforce, Azure Databricks, and Snowflake.

So if you are trying to explain this to your boss, you can say:

“We get SAP’s curated data products into Fabric’s OneLake and Fabric’s AI stack without copying, and we can send Fabric‑derived insights back into SAP Business Data Cloud to drive intelligent applications.”

That is a different game from the classic “ETL SAP into a warehouse and hope it stays in sync.”

How BDC Connect for Fabric reshapes the SAP + Fabric architecture

Before BDC Connect for Fabric, the SAP–Fabric story looked like a set of point‑to‑point patterns:

  • SAP Datasphere or other SAP sources brought into Fabric via mirroring and copy jobs.
  • Separate integrations to Azure Databricks or Snowflake, each then connected into Fabric through mirroring or shortcuts.
  • Localized AI scenarios in SAP applications and separate AI projects in Fabric, with SAP semantics re‑implemented downstream.

BDC Connect for Fabric gives you a single, semantic bridge:

  • Upstream: SAP Business Data Cloud curates SAP content into managed data products (scoped by domain, with semantics intact) and exposes them via BDC Connect into OneLake as AI‑ready, governed assets.
  • Across: Fabric workloads—Lakehouse, Warehouse, Real‑Time Intelligence, Fabric IQ, Power BI, and Fabric data agents—consume those products without rebuilding SAP logic.
  • Downstream: Selected OneLake data sets (for example, ML features, aggregated KPIs, or knowledge bases) are shared back to SAP BDC so that SAP intelligent applications and agents can act on them without owning the heavy data engineering.
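
To make those three legs concrete, here is a minimal sketch of the "across" and "downstream" steps as they might look in a Fabric notebook once BDC Connect for Fabric is available. Every table name is a hypothetical placeholder; how BDC Connect actually surfaces products and accepts write-backs will follow the product documentation at GA.

```python
# Minimal PySpark sketch for a Fabric notebook. Table names are
# hypothetical; `spark` is the session Fabric notebooks provide.
from pyspark.sql import functions as F

# "Across": read an SAP BDC data product that BDC Connect surfaces in
# OneLake as a governed table -- a reference, not a replicated copy.
orders = spark.read.table("sap_maintenance_orders")

# Compose with a non-SAP source already in OneLake (telemetry landed
# via Real-Time Intelligence or a shortcut, for example).
telemetry = spark.read.table("equipment_telemetry_daily")

# Derive a small KPI table per piece of equipment.
kpi = (
    orders.join(telemetry, "equipment_id")
    .groupBy("equipment_id")
    .agg(
        F.count("order_id").alias("open_orders"),
        F.avg("vibration_rms").alias("avg_vibration"),
    )
)

# "Downstream": persist the derived data set; in the BDC Connect pattern,
# this is the kind of table you would designate for sharing back into BDC.
kpi.write.mode("overwrite").saveAsTable("equipment_health_kpi")
```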

Crucially, this sits inside a broader open data ecosystem strategy:

  • SAP has already announced BDC Connect variants for Databricks and Snowflake, all with bi‑directional, zero‑copy semantics.
  • Microsoft, in parallel, is making OneLake interoperable with Azure Databricks, Snowflake, and Salesforce using the same “single‑copy” approach.

In other words, BDC Connect for Fabric is not a special case. It is SAP’s standard pattern for working with serious data platforms. For SAP customers heavily invested in Microsoft, it just happens to be the most strategically relevant one.

Where SAP Databricks and Azure Databricks fit—parity, not rivalry

You can’t talk about SAP BDC without talking about SAP Databricks.

SAP Databricks is Databricks as an SAP‑managed service inside SAP Business Data Cloud: a Databricks Data Intelligence Platform deployment natively embedded in BDC, governed by Unity Catalog, and connected to SAP data products via zero‑copy Delta Sharing.
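
That zero-copy claim is not hand-waving: Delta Sharing is an open protocol with ordinary client libraries. As a minimal illustration, here is how the open-source delta-sharing Python client reads a shared table, assuming the provider has issued a profile file; the share, schema, and table names are placeholders, not actual BDC product names.

```python
# pip install delta-sharing
import delta_sharing

# The .share profile file is issued by the data provider; it holds the
# sharing server endpoint and a bearer token. File name is a placeholder.
profile = "bdc_profile.share"

# Coordinates follow the protocol's profile#share.schema.table format;
# these names are illustrative only.
table_url = f"{profile}#sap_products.maintenance.asset_health"

# Read directly over the sharing protocol -- no replication pipeline.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())

# In a Spark environment with the connector installed, the same table
# loads as a Spark DataFrame instead:
# df = delta_sharing.load_as_spark(table_url)
```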

At the same time:

  • SAP Business Data Cloud Connect for Databricks is already generally available, offering bi‑directional, zero‑copy sharing of SAP data products to any Databricks workspace across AWS, Azure, and GCP.
  • Azure Databricks is deeply integrated with Fabric via Mirrored Azure Databricks Unity Catalog and upcoming native OneLake read/write through Unity Catalog—again with a single‑copy mindset.

When you add BDC Connect for Microsoft Fabric into that picture, you get parity:

  • If your Databricks estate is mostly in BDC (SAP Databricks), SAP BDC Connect for Databricks and SAP BDC Connect for Fabric together give you zero‑copy paths from SAP data products into SAP Databricks, into Azure Databricks if needed, and into Fabric/OneLake—plus a way back into BDC.
  • If your Databricks estate is mostly on Azure, you connect BDC to Azure Databricks via BDC Connect for Databricks, mirror Unity Catalog into Fabric, and still get the same fundamental pattern: SAP data products → Databricks → Fabric, zero‑copy at each step.
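
In practice, the "mirror Unity Catalog into Fabric" step in that second path means Databricks-governed tables become queryable from Fabric workloads like any other OneLake table. A minimal sketch for a Fabric notebook, with the mirrored catalog, schema, and table names all placeholders:

```python
# Fabric-notebook sketch: query a table exposed by a mirrored Azure
# Databricks Unity Catalog item. All names below are placeholders; the
# actual path depends on how the mirrored catalog is named in your
# workspace.
df = spark.sql("""
    SELECT material_id, plant, AVG(scrap_rate) AS avg_scrap_rate
    FROM mirrored_adb_catalog.manufacturing.scrap_features
    GROUP BY material_id, plant
""")
df.show()
```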

The important takeaway for senior IT decision makers is this:

  • SAP Databricks is not a fork; it is an SAP‑managed edition of Databricks with the same Unity Catalog and Delta protocols you expect.
  • Azure Databricks is not displaced; instead, it is one of several first‑class destinations for BDC Connect, and a first‑class peer in Fabric’s “single‑copy” story.

You can standardize on Databricks—SAP, Azure, or both—and still have BDC Connect for Microsoft Fabric as your canonical SAP → OneLake bridge.

A textbook data‑product moment

From a data‑strategy perspective, BDC Connect for Fabric is a textbook example of data products done properly.

On the SAP side:

  • SAP Business Data Cloud is explicitly data‑product‑centric. It knits together Datasphere, SAP Analytics Cloud, SAP BW, SAP Databricks, and intelligent applications into a single SaaS experience, and exposes curated data products across key business processes, with semantics and governance intact.
  • SAP’s open data ecosystem story is that BDC Connect variants (Databricks, Snowflake, BigQuery, and now Fabric) all ship those data products into partner platforms via zero‑copy sharing, not via bespoke ETL.

On the Microsoft side:

  • Microsoft Fabric and Purview Unified Catalog now have a data product concept of their own: you can package tables, files, BI models, and AI artifacts as a product with an owner, purpose, policies, endorsements, and health metrics.
  • Fabric’s OneLake shortcuts and mirroring provide a consistent way to virtualize external data (including SAP) into OneLake while minimizing copies.
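
As one concrete example of the shortcut mechanics, Fabric exposes a REST API for creating OneLake shortcuts programmatically. A hedged sketch, assuming a valid Microsoft Entra access token and placeholder GUIDs; verify the endpoint and payload shape against the current Fabric REST reference before relying on it:

```python
import requests

# Placeholders -- substitute real GUIDs and a valid Entra ID access token.
WORKSPACE_ID = "<workspace-guid>"
LAKEHOUSE_ID = "<lakehouse-item-guid>"
TOKEN = "<entra-access-token>"

url = (
    "https://api.fabric.microsoft.com/v1/"
    f"workspaces/{WORKSPACE_ID}/items/{LAKEHOUSE_ID}/shortcuts"
)

# Example: virtualize an S3 bucket into OneLake without copying it.
# Payload shape follows the Fabric REST docs at the time of writing.
payload = {
    "path": "Files",
    "name": "carrier_telematics",
    "target": {
        "amazonS3": {
            "location": "https://my-bucket.s3.us-east-1.amazonaws.com",
            "subpath": "/telematics",
            "connectionId": "<fabric-connection-guid>",
        }
    },
}

resp = requests.post(
    url, json=payload, headers={"Authorization": f"Bearer {TOKEN}"}
)
resp.raise_for_status()
print(resp.json())
```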

BDC Connect for Fabric bridges these two worlds:

  • Foundational data products: SAP curates them in BDC; they flow into OneLake via BDC Connect, with business semantics included.
  • Derived data products: Fabric composes them with non‑SAP sources (streaming via Real‑Time Intelligence, external lakes via shortcuts, internal apps via mirroring) and publishes them as Purview data products for analytics, AI agents, and applications.

You stop treating SAP as “just another source system.” SAP becomes the system of meaning, Fabric becomes the system of composition, and data products are the contract between them.

How this plays out in Oil & Gas, Manufacturing, and Supply Chain

You can explain this architecture at 10,000 meters, but it becomes real when attached to line‑of‑business outcomes.

Oil & Gas – asset health, safety, and emissions

In Oil & Gas, SAP PM/EAM and MM already encode the asset hierarchies, work orders, notifications, and materials that matter for reliability and safety. In BDC, those show up as Maintenance & Asset Health data products.

With BDC Connect for Fabric:

  • Those products arrive in OneLake without replication.
  • Fabric’s Real‑Time Intelligence ingests historian or edge telemetry, scoring equipment risk in KQL databases.
  • Fabric IQ can model equipment and plant entities, letting AI agents think in terms of “pumps” and “compressors,” not just tables.
  • Selected predictions and KPIs can be shared back to BDC, where SAP Databricks and SAP intelligent applications use them to drive work prioritization, safety workflows, or emissions reporting.

The result is an end‑to‑end loop from SAP semantics to Fabric‑hosted analytics and back to SAP apps, all on top of a single set of data products.
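
As a toy illustration of the scoring step in that loop, the sketch below blends work-order counts from a hypothetical maintenance data product with telemetry into a simple rule-based risk score. Real scores would come from models built in SAP Databricks or Azure Databricks; every threshold and weight here is invented for the example.

```python
import pandas as pd

# Toy stand-ins for two OneLake-resident inputs: the SAP maintenance
# data product (open work orders per asset) and historian telemetry.
orders = pd.DataFrame({
    "equipment_id": ["P-101", "P-102", "C-201"],
    "open_work_orders": [4, 0, 2],
})
telemetry = pd.DataFrame({
    "equipment_id": ["P-101", "P-102", "C-201"],
    "vibration_rms": [7.8, 2.1, 5.5],  # mm/s, illustrative values
})

# Invented weighting: normalize each signal to [0, 1] and blend.
df = orders.merge(telemetry, on="equipment_id")
df["risk_score"] = (
    0.6 * (df["vibration_rms"] / 10.0).clip(upper=1.0)
    + 0.4 * (df["open_work_orders"] / 5.0).clip(upper=1.0)
)

# The kind of output you would share back to BDC to drive work
# prioritization in SAP intelligent applications.
df["action"] = df["risk_score"].gt(0.6).map({True: "inspect", False: "monitor"})
print(df)
```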

Manufacturing – OEE and FPY without re‑modeling SAP

Manufacturers already rely on SAP for production orders, routings, BOMs, and quality results. In BDC, that becomes a Manufacturing Operations set of data products.

BDC Connect for Fabric lets you:

  • Land those products in OneLake and combine them with machine telemetry, MES events, and vision QC outputs via shortcuts and Real‑Time Intelligence.
  • Use Azure Databricks or SAP Databricks to build feature stores and ML models for scrap prediction or changeover optimization, governed by Unity Catalog.
  • Expose “Line Performance & Yield” as a Derived Data Product in Purview, used by plant managers, CI teams, and finance with one set of definitions.

You gain a single, consistent story for OEE and FPY, rather than reconciling multiple SAP extracts and plant‑level warehouses.
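
For reference, the underlying arithmetic is the standard OEE definition (Availability × Performance × Quality), where the quality ratio doubles as FPY when the good count is first-pass good. A minimal sketch with invented shift numbers; in this architecture, the inputs would come from the SAP manufacturing data products joined with machine telemetry in OneLake:

```python
# Standard OEE definition: Availability x Performance x Quality.
def oee(planned_time_min, downtime_min, ideal_cycle_time_s,
        total_count, good_count):
    run_time_min = planned_time_min - downtime_min
    availability = run_time_min / planned_time_min
    performance = (ideal_cycle_time_s * total_count) / (run_time_min * 60)
    quality = good_count / total_count  # equals FPY for first-pass goods
    return availability * performance * quality

# Invented example: one 480-minute shift, 45 minutes of downtime,
# 30 s ideal cycle time, 800 units produced, 760 first-pass good.
print(f"OEE = {oee(480, 45, 30, 800, 760):.1%}")  # ~79.2%
```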

Supply Chain – OTIF and live ETAs

For Supply Chain, SAP S/4, EWM, TM, and Ariba already describe orders, inventory, shipments, and procurement events. In BDC they become Order & Fulfilment and Supply Network products.

BDC Connect for Fabric allows:

  • Those products to live in OneLake alongside carrier telematics, port congestion data, and risk scores via shortcuts to S3/GCS and other lakes.
  • Real‑Time Intelligence and Fabric agents to score shipments for delay risk and trigger actions through Activator.
  • Aggregated views (e.g., “Global OTIF Commit”) and scenario models to be published back into BDC to drive SAP’s own planning and orchestration apps.

For senior leaders, this moves OTIF from a static KPI to a live control surface grounded in SAP truth but enriched in Fabric.
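
The KPI itself stays simple; what changes is where the inputs come from and how often it refreshes. A minimal sketch of the OTIF calculation over illustrative order lines (in the architecture above, these rows would come from the Order & Fulfilment data product joined with live carrier ETAs in OneLake):

```python
import pandas as pd

# OTIF: share of orders delivered On Time AND In Full. All rows are
# invented; in practice they come from the SAP data products in OneLake.
orders = pd.DataFrame({
    "order_id":       ["A1", "A2", "A3", "A4"],
    "qty_ordered":    [100, 50, 200, 75],
    "qty_delivered":  [100, 50, 180, 75],
    "due_date":       pd.to_datetime(["2026-03-01"] * 4),
    "delivered_date": pd.to_datetime(
        ["2026-02-28", "2026-03-03", "2026-03-01", "2026-03-01"]),
})

on_time = orders["delivered_date"] <= orders["due_date"]
in_full = orders["qty_delivered"] >= orders["qty_ordered"]
print(f"OTIF = {(on_time & in_full).mean():.0%}")  # A1 and A4 pass -> 50%
```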

Governance, risk, and cost—what you need to be comfortable signing

All of this is compelling, but it has to pass a governance and cost sniff test.

On the governance side:

  • OneLake security and Purview Unified Catalog give you engine‑agnostic access control and policy management across Fabric workloads and mirrored/shortcut data.
  • Fabric Domains and Purview data products let you align ownership and policies to domains—Asset Reliability, Manufacturing Performance, Supply Chain Commit—rather than to individual schemas.
  • BDC Connect itself is positioned as an extension of SAP’s open data ecosystem, which already includes zero‑copy connectors to Databricks, Snowflake, and BigQuery; the same design principles apply.

On the security and network side:

  • Azure is expanding Private Link and workspace outbound access protection for Fabric and OneLake, which you can align with SAP’s own networking and identity controls.
  • BDC Connect patterns themselves are built on secure protocols (Delta Sharing and similar primitives) and emphasize zero‑copy access within a governed data fabric, not ad‑hoc replication.

On the cost side:

  • Zero‑copy and mirroring reduce storage duplication and ETL overhead, but you still pay for compute, and for egress wherever you design cross‑cloud shortcuts; OneLake’s shortcut caching is explicitly meant to mitigate that for S3/GCS.
  • Direct Lake avoids repeated import/refresh cycles for BI, but you must design models and capacities to avoid falling back into expensive DirectQuery patterns.

The key point: with BDC Connect for Fabric, copies become an explicit choice, not an architectural inevitability. That’s a far better position to manage risk and spend.

Author: Jason Miles

A solution-focused developer, engineer, and data specialist working across diverse industries. He has led data-product and citizen-data initiatives for almost twenty years and is an expert in enabling organizations to turn data into insight, and insight into action. He holds an MS in Analytics from Texas A&M, along with DAMA CDMP Master and INFORMS CAP-Expert credentials.