Incremental Refresh


Power BI



Welcome to a detailed exploration of incremental refresh in Power BI, a transformative tool for business data analysis. In an era of exponential data growth, handling large volumes of information efficiently has become a critical necessity for organizations, and this is where incremental refresh in Power BI plays a crucial role.

By Vicente Antonio Juan Magallanes -

15th January 2024 - fp20 analytics

What is Incremental Refresh?

Incremental refresh in Power BI is an advanced technique for optimizing efficiency and performance when refreshing data models. Instead of reloading and processing the entire dataset each time a refresh is needed, it updates only a specific slice of the data. This is particularly useful and efficient for large and complex data models.

Advantages and Benefits of Use

  • Performance Improvement: Incremental update significantly reduces the time needed to update data. By loading only new or modified data, processing time is minimized, resulting in quick availability of updated reports.

  • Resource Usage Optimization: By reducing the amount of data that needs to be reloaded and processed, the load on Power BI servers and data sources is decreased. This leads to more efficient use of computing and network resources.

  • Efficient Handling of Large Data Volumes: For large data sets, a full reload can be very slow and costly in terms of resources. Incremental update allows for more effective management of these large volumes of data.

Let's get to work!

First, open Power BI Desktop, right-click on our table, and notice the "Incremental refresh" option.

A dialog window appears.

Before this option can be enabled, certain parameters must be configured in Power Query.

To do this, open the Power Query editor by clicking "Transform Data."

While in Power Query, click on "Manage Parameters" as indicated.

Select "New Parameter" to create a new parameter.

In the dialog box that appears, enter the name RangeStart. This name is reserved and case-sensitive, so type it exactly.

Choose "Date/Time" as the data type. Incremental refresh requires both parameters to have the Date/Time data type; a plain "Date" type will not be recognized.

Now add a second parameter named RangeEnd, configured the same way but with a later date. (These values are only design-time placeholders; the Power BI service overrides them at refresh time.)
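If you prefer the Advanced Editor over the dialog, the M that Power Query generates for such a parameter looks roughly like this (the placeholder date is an assumption; only the reserved name and the Date/Time type matter):

```m
// Parameter query named "RangeStart" — the name is reserved and case-sensitive.
// The value is only a design-time placeholder; the service overrides it at refresh time.
#datetime(2014, 1, 1, 0, 0, 0)
    meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true]
```

RangeEnd is defined the same way, with a later placeholder date.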

Having created our parameters, we move on to link these with a column from our table, which will be the reference for these parameters. In our example, this is the column named 'Fecha Actualización' (Update Date).

Click on the dropdown menu.

Select "Custom Filter".

We will see the following dialog box.

In this dialog box, set the first condition to "is after or equal to".

Click the calendar icon and select "Parameter".

Select "RangeStart".

For the second condition, choose "is before" and select "RangeEnd", so the filter keeps rows from RangeStart (inclusive) up to, but not including, RangeEnd. Making only one boundary inclusive prevents rows that fall exactly on a boundary from being loaded twice.

The configuration should look as follows.
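Behind this dialog, Power Query simply adds a filter step to the query. In the Advanced Editor it looks roughly like the following (the step names are illustrative; 'Fecha Actualización' is the update-date column from our table):

```m
// Keep only rows whose update date falls in [RangeStart, RangeEnd).
// ">=" on one boundary and "<" on the other ensures adjacent refresh
// partitions never load the same row twice.
#"Filtered Rows" = Table.SelectRows(
    #"Changed Type",
    each [Fecha Actualización] >= RangeStart and [Fecha Actualización] < RangeEnd
)
```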

We close and apply the changes. Once they are applied, we right-click on our table again and see the "Incremental refresh" option.

A dialog box appears, and now we have the option to activate incremental mode.

The first option in the dialog box asks how much historical data to store (the archive period). For our example, we will use 10 years. This full historical load runs once, when the file is first refreshed in the service, and it also defines the limit of the dataset.

The second option is the incremental refresh window: how far back in time each refresh re-imports data. In our case, it is the last 5 days.

There are three further options that fine-tune the process:

1. Get the latest data in real time with DirectQuery, which does not apply in this import-mode example.

2. Only refresh complete days, which does apply in our case, since we only want to refresh whole days.

3. Detect data changes. For this option, it is recommended to have an audit column (such as a last-modified date); the service checks it and refreshes only the periods where modifications have occurred.

Now we should have a dialog box configured as follows:

Now, we publish our report to Power BI Service.

We go to our workspace where we host our Power BI.

We click on 'Refresh' in the semantic model, and we will see it update efficiently.

Now, to witness the magic of this process, we edit our source data file, changing the last record from 250,000 to 25. Remember that this record will be refreshed because its update-date column falls within the refresh window.

To demonstrate the opposite case, we also edit a record that is more than 10 days old, for example, changing the number 5 from 12 to 1. Because that date falls outside the 5-day refresh window, this change will not appear in the service.

Leaving our source table as follows:

Now, we click on 'Refresh' in the semantic model.

We will see it refresh efficiently. As we can see, the number 10 has effectively been updated to 25 in the 'Product' column, but the number 5 remains unchanged, because its 'Update Date' falls outside the refresh window. This is the magic of incremental refresh.


Incremental refresh in Power BI is not just an additional technical feature; it represents a revolution in the handling and analysis of large-scale data. In today's business world, where data is a critical asset and the pace of business is rapid, the ability to quickly process and analyze large volumes of information has become a fundamental competitive advantage.

The 'magic' of incremental refresh in Power BI lies in its ability to transform how large datasets are handled and refreshed, making the process more efficient and manageable.