There’s a moment in every Fabric semantic model lifecycle where the “click it in the UI” approach stops scaling.
It usually happens when you need to rename dozens (or hundreds) of fields to match a business glossary, or when Dev is stable and you’re ready to point the same model at a new Lakehouse for Test/Prod. That’s when the model stops being a diagram and starts being an artifact—something you want to treat like code.
This guide walks through the whole workflow end-to-end: using the Fabric service's Edit in Desktop experience to open the model, exporting it to a PBIP project stored as a TMDL folder, editing that folder externally (no scripting inside Power BI Desktop), and then getting those changes back into the service, including two key capabilities:
- retargeting the entire model to a different Lakehouse/Warehouse, and
- retargeting a single table to a different physical table (even in a new Lakehouse).
We’ll do it with the mindset of Power BI + Microsoft Fabric development: repeatable changes, visible diffs, and fewer “hand edits” you regret later.
What you’re going to do (and what you’re not)
You’re going to:
- Open an existing Fabric semantic model via Edit in Desktop (live edit).
- Export the model metadata to PBIP + TMDL so the definition becomes editable files.
- Edit the TMDL folder externally (VS Code is the common choice).
- Retarget Lakehouse/Warehouse bindings and table mappings using folder edits.
- Redeploy the result back to the service using Fabric deployment mechanisms (because Direct Lake PBIP has a specific constraint here).
You’re not going to use any scripting inside Power BI Desktop (no TMDL view “apply scripts,” no in-Desktop regex sessions). Everything that changes is changed in the folder.
First: confirm you’re in the right semantic model scenario
The service “Edit in Desktop” pathway is intentionally narrow:
- Edit in Desktop is available only for Direct Lake models, and it launches live editing in Power BI Desktop on Windows.
That matters because it affects what “publishing” means later.
In Direct Lake live edit:
- Save is disabled and changes are automatically applied to the semantic model in the workspace.
- Version history captures a version at the start of a live editing session, giving you a rollback option if you make a bad change.
Before you touch anything structural, open version history once and make sure you’re comfortable restoring. Version history is designed for “oh no” moments, not full source control, and it stores up to five versions per model.
Open the semantic model from Fabric using Edit in Desktop
In the Fabric/Power BI service:
- Locate the semantic model in your workspace content list.
- From the semantic model's context menu, choose Open data model to enter the web modeling experience.
- Switch to Editing mode if needed, then select Edit in Desktop.
Keep the mental model clear: this is live editing against the remote model, not opening a local .pbix.
Export the model to PBIP and store the definition as TMDL
Now you create the “folder you can edit.”
Enable TMDL storage (one-time setup)
In Power BI Desktop:
- File → Options and settings → Options → Preview features
- Enable: Store semantic model using TMDL format
This puts your semantic model definition into a `\definition` folder instead of a single `model.bim` file, and it's the foundation for folder-first editing.
Export to PBIP
In Desktop (while live editing):
- File → Export → Power BI Project (PBIP)
This produces a PBIP project structure. When TMDL is enabled, the semantic model appears as a folder that includes the `\definition` directory containing the TMDL files.
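With TMDL enabled, the exported project typically has a layout like the following. The model name `MyModel` and the table files here are illustrative; exact file names can vary by Desktop version:

```text
MyModel.pbip
MyModel.Report/
MyModel.SemanticModel/
    definition.pbism
    diagramLayout.json
    definition/
        database.tmdl
        model.tmdl
        expressions.tmdl
        relationships.tmdl
        tables/
            Sales.tmdl
            Date.tmdl
```

Everything under `definition/` is plain text, which is what makes the bulk edits and diffs in the rest of this guide practical.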
Know your workspace: what the TMDL folder represents
Inside the semantic model folder, the important concept is simple:
- The `\definition` folder is your model definition in TMDL form.
- TMDL is designed to be human-friendly and supports a folder structure that makes diffing and collaboration practical (#TMDL, #DataModeling).
When you edit those files externally, remember one operational rule that saves hours:
Power BI Desktop isn’t aware of changes made by other tools. If Desktop is open and you edit files in VS Code, you must restart Desktop to reload the updated project definition.
Edit the TMDL folder (external-only) for bulk changes
Open the PBIP folder in your editor (VS Code is commonly used, and the VS Code TMDL extension is the preferred external editing experience).
Mass edit column names (the sane way)
This is where folder editing shines.
A practical approach that holds up:
- Use multi-file search to locate column definitions in table TMDL files.
- Apply consistent renaming rules (prefix removal, capitalization, glossary alignment).
- Then search for downstream references:
- relationships
- measures
- calculated columns
The subtle but critical part for Direct Lake (and schema sync generally): Power BI tracks user customizations to preserve them across schema synchronization. Microsoft's lineage tags guidance explains that for synced objects (including Direct Lake tables/columns), you must mark name changes as user customizations when needed using `changedProperty = Name`, and maintain stable identity with lineage tags/source lineage tags.
A pattern you’ll see (and want to preserve) is:
- `sourceColumn` stays the physical/source column name
- the column name in the model becomes your friendly reporting name
- `changedProperty = Name` indicates the rename is intentional
That’s the difference between “it looked right yesterday” and “schema sync reverted my work.”
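Here's a minimal sketch of that pattern in a table's TMDL file. The table name, column names, and tag values are invented for illustration:

```tmdl
table Sales
	sourceLineageTag: [dbo].[sales]
	lineageTag: 00000000-0000-0000-0000-000000000010

	// Friendly reporting name; the physical column remains order_date
	column 'Order Date'
		dataType: dateTime
		sourceColumn: order_date
		sourceLineageTag: order_date
		lineageTag: 00000000-0000-0000-0000-000000000011

		changedProperty = Name
```

The stable `lineageTag`/`sourceLineageTag` identity plus `changedProperty = Name` is what keeps 'Order Date' from being reverted to `order_date` on the next schema sync.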
Retargeting: point the model (or one table) at a different Lakehouse
This is the part that usually forces rebuilds. It doesn’t have to.
There are two retargeting problems:
- Model-level retargeting: “Move this semantic model from Dev Lakehouse to Prod Lakehouse.”
- Table-level retargeting: “Keep my semantic model table, but point it at a different physical table (possibly in a different Lakehouse).”
They’re related—but not the same.
Retarget the entire Lakehouse/Warehouse binding
Direct Lake models depend on how tables are bound to a Fabric item:
- In Direct Lake, the Edit tables experience assumes tables are coming from a single Fabric item (a Lakehouse or Warehouse).
So the cleanest retarget is: update the model’s connection/binding so the same table set resolves in the new Lakehouse.
What to change in the TMDL folder
You are looking for the “connection definition” and/or the shared expression used by Direct Lake partitions.
Practically, that means:
- Search your `\definition` folder for the old identifiers (they're usually in `expressions.tmdl`):
  - the old SQL endpoint server string (Direct Lake on SQL scenarios)
  - the old SQL endpoint "database"/ID
  - or old OneLake path segments like `onelake.dfs.fabric.microsoft.com`
- Update them to the new Lakehouse/Warehouse equivalents.
Microsoft’s Direct Lake Desktop documentation shows the model can be defined using:
- `Sql.Database(...)` for SQL endpoint-based connections, and
- `AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/<workspaceId>/<lakehouseOrWarehouseId>")` for OneLake-based connections. (Microsoft Learn)
Even if your exact file names differ, the workflow is consistent: locate the connector expression in the TMDL folder and update it to the new target.
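As a sketch, the shared expression in `expressions.tmdl` usually takes one of two shapes. The expression name `DatabaseQuery`, the endpoint strings, the IDs, and the lineage tags below are placeholders you'd swap for your new target's values (a model has one of these shapes, not both):

```tmdl
// Direct Lake on SQL: retarget by swapping the server string and database ID
expression DatabaseQuery =
		let
			database = Sql.Database("<new-sql-endpoint>.datawarehouse.fabric.microsoft.com", "<new-lakehouse-or-warehouse-id>")
		in
			database
	lineageTag: 00000000-0000-0000-0000-000000000020

// Direct Lake on OneLake: retarget by swapping the workspace and item IDs
expression DatabaseQuery =
		AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/<workspaceId>/<lakehouseOrWarehouseId>")
	lineageTag: 00000000-0000-0000-0000-000000000021
```

Because every Direct Lake partition points at this one expression via `expressionSource`, a single edit here retargets the whole model.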
A reality check before you do this
Retargeting is easiest when:
- the table names match, and
- the schemas/columns match.
If they don’t, you’re no longer “retargeting.” You’re doing a controlled migration—and you’ll need to handle table-level mapping too (next section).
Production-friendly alternative: deployment pipeline rules
If you’re already using deployment pipelines, there’s a service-first option: use deployment rules to swap the Lakehouse the model points to as you move stages.
This doesn’t replace TMDL folder editing, but it is often the safest way to keep Dev/Test/Prod aligned without hand-editing connection values per environment.
Retarget a single table to a different physical table (even in a new Lakehouse)
This is the more interesting case—and the one people underestimate.
There are two “flavors” of table retargeting:
- Same table name, different Lakehouse.
  You mostly solve this at the model binding level (previous section).
- Different table name (or replacement table), possibly in a new Lakehouse.
  You must explicitly manage identity and mapping so schema refresh doesn't drop the table.
Why SourceLineageTag is the lever you need
For Direct Lake models, Microsoft is explicit:
- For Direct Lake models, the `sourceLineageTag` must be the name of the table/column in the Lakehouse/data warehouse.
That means table-level retargeting generally looks like:
- Keep your semantic model table name the same (so reports don’t break)
- Update the table's `sourceLineageTag` so it matches the new physical table name.
- Do the same for columns if the physical column names changed.
This also aligns with schema refresh behavior:
- Schema refresh compares table definitions in the model to the same-named table in the data source, and source renames can cause tables/columns to be removed on the next schema refresh unless you update `sourceLineageTag`.
A practical table retarget workflow in the TMDL folder
In your TMDL folder:
- Identify the table definition file for the semantic model table you want to keep.
- Search inside it for:
  - `sourceLineageTag`
  - any reference to the old physical table name
- Replace the `sourceLineageTag` value with the new physical table name.
- If the new physical table uses different column names:
  - preserve your friendly names
  - update `sourceColumn` and `sourceLineageTag` for those columns accordingly
  - ensure the rename is marked as intentional where needed (`changedProperty = Name`)
- Restart Desktop before validation (because external edits won’t be detected live).
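Putting those steps together, here's a sketch of retargeting a model table `Sales` from a physical `sales_v1` to `sales_v2` (all names invented for illustration). Only the source-side properties move; the `// was:` comments show the previous values:

```tmdl
table Sales
	// was: sourceLineageTag: [dbo].[sales_v1]
	sourceLineageTag: [dbo].[sales_v2]

	column 'Order Date'
		dataType: dateTime
		// was: sourceColumn: order_date (renamed to order_dt in the new table)
		sourceColumn: order_dt
		sourceLineageTag: order_dt

		changedProperty = Name

	partition Sales = entity
		mode: directLake
		source
			// was: entityName: sales_v1
			entityName: sales_v2
			expressionSource: DatabaseQuery
```

The report-facing names (`Sales`, `'Order Date'`) never change, so visuals keep working; only the mapping underneath moves.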
When your “new table” comes from a rename upstream
Fabric’s Direct Lake Edit tables documentation is blunt about the operational impact:
- If an upstream source renames a table/column after it’s been added, the semantic model can still reference the old name and the table can be removed during schema sync; the new table shows as unchecked and must be explicitly added again, and relationships/column property updates must be reapplied.
Folder editing doesn’t eliminate that reality—it gives you a way to handle it intentionally, rather than rediscovering it during a refresh window.
Bring the edited folder back into Desktop and validate
Once your edits are complete:
- Close Power BI Desktop (if it’s open).
- Re-open the PBIP by opening the `.pbip` file.
- Validate:
- relationships still bind
- measures resolve
- table/column presence matches expectation after a refresh
For Direct Lake models specifically, remember what Refresh does in live edit:
- Refresh performs a schema refresh and reframe for Direct Lake tables.
- If the source renamed objects and your lineage mapping isn’t corrected, schema refresh can remove them.
This is also where you lean on version history when needed. Version history begins capturing versions once a model is opened in Editing mode on the web or opened for Direct Lake live editing in Desktop.
Deploy back to the service
This is where Direct Lake + PBIP differs from classic Power BI habits.
If you started with “Edit in Desktop” on an existing model
You were live editing the remote semantic model:
- Changes you make in Desktop are automatically saved back to the workspace model.
So for the semantic model itself, you’re not “publishing” in the classic sense—you’re editing in place.
If you want to deploy the PBIP artifacts from your edited folder
For Direct Lake PBIP specifically:
- You deploy using Fabric deployment mechanisms such as opening the `.pbip` file and selecting a workspace, Fabric Git integration, or the Item APIs.
That’s why the folder workflow is so valuable: it fits naturally into Git-based promotion and environment deployment.
The quiet “gotchas” that break real projects
A few issues show up repeatedly:
- Desktop didn't reflect my folder edits.
  That's expected—restart Desktop after external edits.
- Schema refresh removed my table after I retargeted.
  In Direct Lake, make sure `sourceLineageTag` matches the physical object name in the target Lakehouse/Warehouse.
- I renamed fields and downstream reports broke.
  This is a known limitation of service-side model editing: renaming fields won't automatically update existing visuals in downstream artifacts. Plan renames as a coordinated change.
Conclusion: Treat the model like an artifact, not a diagram
You came here for a practical workflow—and now you have one:
Open the semantic model from Fabric using Edit in Desktop, export to PBIP/TMDL, do high-volume edits safely in the folder, retarget Lakehouse bindings and table mappings intentionally, validate by reopening the PBIP, and redeploy using the mechanisms that match Direct Lake reality.
The payoff is bigger than convenience. It’s repeatability. It’s the ability to promote across environments without rebuilding. And it’s a development posture that aligns with modern Fabric delivery: #MicrosoftFabric models as managed assets, not hand-tuned one-offs.
If you’re going to operationalize this, the next “level up” is straightforward: keep the PBIP/TMDL folder in source control and promote through controlled deployments—so retargeting Dev→Test→Prod becomes a rule-driven step, not a late-night scramble.