The basic notebook consists of four cells, each performing a specific task. You can run a cell individually with the Run cell button next to the cell, or run all cells in order with Run all at the top of the notebook.
Before running the cells, they must be adjusted to your own configuration.
1. Install Fabric Connector
In the first cell, the Fabric Connector is installed in the active Spark session using the %pip install command. The connector is then imported so it can be used in the following cells.
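As an illustration, the first cell might look like the sketch below. The package and module names are placeholders for illustration only; use the names provided with the basic notebook.

```python
# Cell 1 - sketch only: the package and module names below are placeholders,
# not the connector's confirmed names.
%pip install powerbiconnector-fabric   # hypothetical package name

# Import the connector so the ExactOnline class is available in later cells.
from powerbiconnector import ExactOnline   # hypothetical module name
```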
The %pip install command is only supported in Fabric when the notebook is executed manually. If you want the notebook to run on a schedule, you must create an Environment that includes the required packages. See Chapter 2.5 Configure refresh for details.
2. Initialize a new run
In the second cell, we initialize a new run. First, enter the Connector Key you copied from PowerBIConnector.nl (see Chapter 1.2 Account).
A run can include one or more tables from Exact Online. Each table can only be imported once per run. If you want to reload the same table, you must re-run this second cell to start a new run.
Tip: Do not store the Connector Key directly in the code. Use Azure Key Vault for secure key storage. See Chapter 2.6 Fabric Tips for more details.
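For illustration, the second cell could look like the sketch below. ExactOnline.StartRun is a hypothetical name for the run-initialization call, not the connector's confirmed API; the commented-out notebookutils line shows one way to read the Connector Key from Azure Key Vault instead of hard-coding it.

```python
# Cell 2 - sketch only: ExactOnline.StartRun is a hypothetical name for the
# run-initialization call described above.

# Option A: paste the Connector Key directly (quick, but not recommended).
connectorkey = "<your Connector Key from PowerBIConnector.nl>"

# Option B (recommended): read the key from Azure Key Vault so it is not stored
# in the notebook itself (see Chapter 2.6 Fabric Tips). notebookutils is built
# into Fabric notebooks.
# connectorkey = notebookutils.credentials.getSecret(
#     "https://<your-key-vault>.vault.azure.net/", "connector-key")

ExactOnline.StartRun(connectorkey)   # hypothetical: starts a new run
```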
3. Import data
In the third cell, data is imported from Exact Online and stored in the Fabric Lakehouse.
Parameters
There are two required parameters:
- companycode: Enter the administration codes (see Chapter Companies) of the companies you want to import, for example "10000" for one company or "10000,10001,10002" for multiple companies. Instead of specific codes, you can also use "all" for all companies, or "tag:group1" or "tag:group1,group2" for all companies with those tags.
- numberofyears: Enter the number of years to import. For example, in 2026, entering 3 imports data for 2024 through 2026. The default value (if left blank) is 1 year.
The numberofyears parameter behaves differently depending on the table's refresh type.
For incremental tables (see overview):
- First refresh: Enter the number of years for which you want to load historical data. The data is loaded based on the modification date of a record. If data is added in Exact Online with a date far in the future, it may be wise to include an extra year so these records are captured.
- Subsequent refreshes: Only changes since the previous refresh are retrieved; the numberofyears parameter no longer has any effect.
For full-refresh tables (see overview):
- First refresh: Enter the number of years for which you want to load historical data. The data is always loaded based on the transaction date of a record (not the modification date).
- Subsequent refreshes: Usually, 1 year is sufficient so that only new and updated records from this year are retrieved. Older years remain available in the lakehouse. If data from the previous year can still change, temporarily choose 2 years until the previous year is closed.
Important: enter the full historical range only during the first refresh. For subsequent refreshes, always set the number of years to 1 or 2, so the entire history is not reloaded every time.
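As an illustration, the parameter values at the top of the import cell might look like this (the values shown are examples, not recommendations for your situation):

```python
# Illustrative values only; these variables are passed to every table import
# further down in the cell.
companycode = "10000,10001"   # administration codes, or "all", or "tag:group1"
numberofyears = 1             # keep at 1 (or 2) after the first historical load
```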
Tables
Next, specify the tables you want to import with this notebook. Parameter values are defined at the top of the cell, so you only need to enter them once. The basic notebook includes examples such as:
ExactOnline.Companies(companycode=companycode, numberofyears=numberofyears)
ExactOnline.Calendar(companycode=companycode, numberofyears=numberofyears)
ExactOnline.Ledger(companycode=companycode, numberofyears=numberofyears)
ExactOnline.LedgerTransactions(companycode=companycode, numberofyears=numberofyears)
You’ll find the full list of tables in Chapter 2.4 Datamodel Exact Online. Pay attention to the difference between full-refresh and incremental tables for optimal efficiency.
Efficient strategy for keeping data up-to-date at the lowest Fabric cost:
- Create one notebook for full-refresh tables and refresh it once per day.
- Create a second notebook for incremental tables and refresh it multiple times per day.
- Create a third notebook for rarely changing data and refresh it weekly or monthly.
All notebooks can be linked to the same Lakehouse. This approach keeps your data current with minimal refresh time and cost.
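For illustration, the split could look like this; which tables are full-refresh and which are incremental is listed in Chapter 2.4, so the assignment below is only an example.

```python
# Notebook A - full-refresh tables, scheduled once per day (illustrative split).
ExactOnline.Ledger(companycode=companycode, numberofyears=numberofyears)
ExactOnline.Calendar(companycode=companycode, numberofyears=numberofyears)

# Notebook B - incremental tables, scheduled several times per day (illustrative split).
ExactOnline.LedgerTransactions(companycode=companycode, numberofyears=numberofyears)
```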
Schemas
After importing, tables appear in the Lakehouse. You may need to refresh the Lakehouse view to see newly added tables.
Two schemas are used in the Lakehouse: the default schema ‘dbo’ and a second schema ‘staging’. In Power BI, use only the tables in the ‘dbo’ schema; the ‘staging’ schema is used internally by the Fabric Connector.
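For example, once a table has landed in the ‘dbo’ schema you can inspect it from a notebook with Spark SQL (the table name below is illustrative):

```python
# Quick check of an imported table in the dbo schema (table name illustrative);
# assumes the Lakehouse is attached as the notebook's default lakehouse.
df = spark.sql("SELECT * FROM dbo.Ledger LIMIT 10")
display(df)
```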
Optional parameters
There are four optional parameters, which are only needed in specific situations:
- table="custom_table_name". Tables are created automatically by the connector; for example, if you import the Ledger table, the connector creates a table in the Lakehouse named "Ledger". With this parameter, you can specify a custom table name to use in the Lakehouse instead.
- importmode="batch" or importmode="percompany":
  - importmode="batch" (default). The standard approach when importing multiple companies is to first retrieve all data and then write it to the Lakehouse in a single batch. This is the most cost-efficient method in Fabric.
  - importmode="percompany". For very large tables (more than 1 million rows), batch mode may require more memory than the selected Fabric SKU provides. This usually happens only during the first run, when all historical data is loaded; subsequent runs load changes incrementally. If you hit a memory limit during the initial import, use importmode="percompany", which writes the data to the Lakehouse per company. Remove this parameter after the first import to minimize Fabric costs.
- forcefullrefresh=True. Incremental tables load only new, modified, and deleted records on each run. Sometimes there is a reason to force a full refresh. You can use forcefullrefresh=True for that purpose. Be sure to remove this parameter afterward to avoid excessive data consumption.
- forcehistory="2023-2024". Incremental tables load only data that has been added, modified, or deleted since the last refresh. Sometimes you may want to force a full refresh for a specific year, or you may have initially loaded only 2025 and 2026 but now want to add 2023 and 2024. In such cases, you can use forcehistory="2023-2024". Be sure to remove this parameter afterward to avoid excessive data consumption.
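For illustration, the optional parameters are added to an individual table import like this (the values shown are examples; remember to remove forcefullrefresh and forcehistory again after the run):

```python
# Illustrative use of the optional parameters on a single table import.
ExactOnline.LedgerTransactions(
    companycode=companycode,
    numberofyears=numberofyears,
    table="LedgerTransactions_custom",   # custom Lakehouse table name
    importmode="percompany",             # only if batch mode hits a memory limit
    forcefullrefresh=True,               # one-off full refresh; remove afterward
    # forcehistory="2023-2024",          # or: reload specific historical years
)
```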
4. Stop session
In the fourth cell, the SQL Endpoint of the Fabric Lakehouse is updated so the data is immediately available in Power BI. Without this step, it can take a few minutes before new data becomes visible there.
Finally, the Spark session is stopped to prevent ongoing Fabric costs.
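For illustration, the fourth cell could look like the sketch below. The SQL Endpoint refresh call is a hypothetical name for the step described above; the actual call is included in the basic notebook.

```python
# Cell 4 - sketch only: RefreshSqlEndpoint is a hypothetical name for the
# SQL Endpoint update described above, not the connector's confirmed API.
ExactOnline.RefreshSqlEndpoint()

# Stop the Spark session so the notebook does not keep consuming Fabric capacity.
# notebookutils is built into Fabric notebooks.
notebookutils.session.stop()
```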