Tips for Optimizing Your Microsoft Fabric Environment
These recommendations are not specific to the Fabric Connector but apply to using Microsoft Fabric in general.
Use Azure Key Vault
It is not advisable to leave the connector key visible in notebooks. While this may be practical during initial setup, we recommend storing the key in Azure Key Vault afterward. This improves security and prevents sensitive information from being accidentally shared or published.
To use Azure Key Vault, you need an active Azure environment, which is typically available when using a paid Fabric subscription. The cost of Key Vault is minimal (a few cents per 10,000 transactions).
Steps to use Azure Key Vault:
- Create a Key Vault in the Azure portal (see Azure documentation for details).
- Create a Secret in the Key Vault (see Azure documentation). Enter the connector key you copied from PowerBIConnector.nl (see Chapter 1.2 Account) as the Secret value.
- In the Key Vault, go to the Overview tab and copy the Vault URI (e.g., https://<keyvaultname>.vault.azure.net/).
- Copy the name of the Secret you just created.
- Modify the notebook cell as shown in the example below.
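For example, a cell along the following lines can be used. This is a minimal sketch: notebookutils is the helper library that is available by default in Fabric notebooks, and the placeholders and the variable name connector_key are examples only; use whatever variable name the rest of your notebook expects.

    # Retrieve the connector key from Azure Key Vault instead of hard-coding it.
    # Replace the placeholders with your own Vault URI and Secret name.
    key_vault_uri = "https://<keyvaultname>.vault.azure.net/"
    secret_name = "<secret-name>"

    # The user running the notebook needs access to the secret (see the note below).
    connector_key = notebookutils.credentials.getSecret(key_vault_uri, secret_name)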

From now on, the connector key is retrieved automatically from Azure Key Vault and is no longer visible in the notebook, which makes managing connector keys easier and more secure.
Note: The user running the Fabric notebook must have at least the Key Vault Secrets User role in Azure Key Vault to access the secret.
Start / Stop Fabric Capacity
Microsoft Fabric costs are calculated per minute. For every minute your Fabric capacity is active, you consume a certain number of Capacity Units (CUs), depending on the chosen SKU.
For example: an F2 SKU provides 2 CU per second, which equals 120 CU per minute / 7,200 CU per hour.
If you consistently use fewer CUs than your SKU provides, it may be financially beneficial to pause Fabric capacity. During the time capacity is paused, no costs are incurred.
You can automatically start or stop Fabric capacity at fixed times using Azure Logic Apps or Power Automate. A detailed explanation is available in this video (Logic App from 8:24 and Power Automate from 15:36).
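As an alternative to Logic Apps or Power Automate, the same start and stop actions can be scripted against the Azure Resource Manager REST API. The sketch below is an illustration only: it assumes the azure-identity and requests Python packages, the subscription, resource group, and capacity names are placeholders, and the api-version should be checked against the current Microsoft.Fabric capacities documentation.

    # Pause or resume a Fabric capacity through the Azure Resource Manager REST API.
    # Requires: pip install azure-identity requests
    import requests
    from azure.identity import DefaultAzureCredential

    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    CAPACITY_NAME = "<capacity-name>"
    API_VERSION = "2023-11-01"  # assumption: verify the current api-version

    def set_capacity_state(action: str) -> None:
        """Send 'suspend' (pause) or 'resume' (start) to the capacity."""
        token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
        url = (
            f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
            f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Fabric"
            f"/capacities/{CAPACITY_NAME}/{action}?api-version={API_VERSION}"
        )
        response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
        response.raise_for_status()

    # Example: pause the capacity outside business hours.
    set_capacity_state("suspend")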
Important considerations:
- If you use Direct Lake in Power BI, Fabric capacity must always be enabled, otherwise the data will not be available.
- If you use import via SQL Endpoint in Power BI, Fabric capacity only needs to be active during refresh. The data is stored in the Power BI semantic model during refresh and remains available even if capacity is paused afterward.
- Fabric uses bursting and smoothing, which let you temporarily consume more CUs than your SKU provides and spread that excess consumption over later time periods. When you pause capacity, any CUs consumed beyond your available credit are charged immediately.
Example: if your F2 SKU runs for 30 minutes, you have a credit of 3,600 CU (30 minutes × 120 CU per minute). If you consumed 5,000 CU during that time and then pause the capacity, the extra 1,400 CU is charged immediately.
- Use the Fabric Capacity Metrics App to monitor CU usage. Analyzing your usage can help you optimize when to start and stop capacity.
Tip: Microsoft Fabric costs vary by Azure region. For example, rates in North Europe are about 15% lower than in West Europe. If you need Fabric capacity active 24×7, you can save up to 41% by using Azure Reservations.
Lakehouse Maintenance
In a Microsoft Fabric Lakehouse, Delta tables are automatically managed, but additional maintenance tasks can help maintain performance and optimize storage. The two most important maintenance operations are OPTIMIZE and VACUUM. These are available directly in the Lakehouse environment and can be executed manually or via code/API.
Optimize
The OPTIMIZE operation combines large numbers of small Parquet files into fewer, larger files. This significantly improves query performance because engines need to open and scan fewer files.
Key benefits:
- Combines small files into larger files for faster read operations
- Reduces metadata overhead and speeds up analysis
- Recommended after large load processes or workloads that generate many small files
More information is available in Microsoft documentation.
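As an illustration, OPTIMIZE can be run from a Fabric notebook. This is a minimal sketch in which the table name sales is only a placeholder for a Delta table in the attached Lakehouse.

    # Compact the many small Parquet files of a Delta table into fewer, larger files.
    spark.sql("OPTIMIZE sales")

    # Equivalent call via the Delta Lake Python API.
    from delta.tables import DeltaTable
    DeltaTable.forName(spark, "sales").optimize().executeCompaction()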
Vacuum
The VACUUM operation permanently removes old, unused files from a Delta table. When rows are deleted or overwritten, the underlying files remain for a while to support time travel. VACUUM ensures these old files are deleted once they exceed the retention period (default: 7 days).
Key benefits:
- Removes old, unreferenced files based on the Delta log
- Prevents wasted storage and keeps the Lakehouse clean
- Important for long-term performance and manageable storage costs
More information is available in Microsoft documentation.
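A minimal sketch of running VACUUM from a notebook, again with sales as a placeholder table name and the default retention of 7 days (168 hours) written out explicitly:

    # Permanently remove files that are no longer referenced by the Delta log
    # and are older than the retention period.
    spark.sql("VACUUM sales RETAIN 168 HOURS")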