Introduction
Large Semantic Models in Power BI allow you to work with semantic models that exceed 1 GB in size. Once enabled, your semantic model can grow up to the limits defined by your capacity. However, enabling this feature introduces a challenge: if you attempt to move your workspace to a different capacity in another region, you will be unable to view your reports without completing additional steps.
The Problem
After enabling Large Semantic Model format, migrating the workspace to a different region may appear successful in the UI, especially if the workspace contains only Power BI items. But when you try to open a report, you’ll encounter this error:

The Solution
To resolve this, we can use a backup and restore process, made possible by configuring the workspace's Azure (Dataflow) storage to use ADLS Gen2. With that connection in place, we can use Tabular Model Scripting Language (TMSL) commands to create our backups.
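For reference, the TMSL backup command that XMLA tools send to the workspace looks like the sketch below. This is a minimal illustration: the database name, file name, and password are placeholders, and running the script via `sempy.fabric.execute_tmsl` inside a Fabric notebook is one option for executing it.

```python
import json
from typing import Optional

def tmsl_backup_command(database: str, backup_file: str,
                        password: Optional[str] = None) -> str:
    """Build a TMSL 'backup' command; the resulting .abf file is written
    to the ADLS Gen2 account connected to the workspace."""
    backup = {
        "database": database,    # semantic model name (placeholder)
        "file": backup_file,     # e.g. "Sales.abf" (placeholder)
        "allowOverwrite": True,
        "applyCompression": True,
    }
    if password:                 # optionally encrypt the backup file
        backup["password"] = password
    return json.dumps({"backup": backup}, indent=2)

# In a Fabric notebook, the script could be sent over the XMLA endpoint,
# e.g. with sempy.fabric.execute_tmsl(script=..., workspace=...).
```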
Migration Workflow
A high-level overview of the process is as follows:
- Connect the workspace to ADLS Gen2
- Backup the semantic model
- Clear the data from the semantic model (a clearValues refresh)
- Turn off Large Semantic Model Format
- Migrate the workspace to the new capacity in the new region
- Turn on Large Semantic Model Format
- Restore the semantic model
- Optional: Disconnect workspace from ADLS Gen2
Create and connect our Storage Account
First, we need to create our storage account, which can be done via the Azure Portal. The full list of prerequisites is available here, but the two main requirements are that the storage account isn't behind a firewall and that hierarchical namespace is enabled.
Next, we'll need to connect the storage account to the Fabric service; this can be done either in Tenant Settings or at the workspace level.
For simplicity, we’ll focus on connecting a single workspace. Navigate to your workspace settings, open the Azure Connections section, and link your storage account.

Backup Semantic Model
Unfortunately, Power BI does not support backing up semantic models via the UI. We could use an XMLA tool such as SQL Server Management Studio (SSMS), but for the purposes of this post we are going to use Semantic Link Labs, which gives us a Fabric-native solution rather than relying on an external tool.
We'll need several Semantic Link Labs functions to complete the backup and restore process:
- backup_semantic_model
  - This will back up our semantic model to our storage account
  - For additional security, you can encrypt your backup with a password
- refresh_semantic_model
  - We will pass the "clearValues" refresh type to clear the data from our semantic model (so that its size drops below the large semantic model limit)
- set_semantic_model_storage_format
  - We will call this twice: to set the storage format to Small before we move the workspace, and back to Large once the workspace has moved to the new region
- assign_workspace_to_capacity
  - This will assign the workspace to the capacity in the new region
- restore_semantic_model
  - This will restore the data to our semantic model from the storage account
Template Notebook
Rather than bloat the post with descriptions of each stage, I have provided a Fabric Notebook using Semantic Link Labs that you can download from my GitHub repo. It automates the backup and migration steps for a single workspace, which you can modify to migrate multiple workspaces/capacities if required.
You can watch a short video here that walks through using the notebook: