Power BI October 2025 Feature Summary — AI-Powered Insights, Smarter Reporting, and a Future-Ready Platform

October didn’t disappoint for Power BI fans. This month’s update is a powerhouse of innovation, from AI-driven Copilot for DAX and sleek Button Slicer upgrades, to long-awaited features like Export Query Results, ARM support, and a new Power BI Controller for PowerPoint that finally makes managing embedded visuals painless.

Whether you’re an analyst fine-tuning dashboards or an enterprise architect optimizing your Fabric workflows, there’s something here that will make your data life easier (and maybe even fun). Let’s dive in.

General: Goodbye Bing Maps, Hello Azure Maps

Microsoft is moving full steam ahead with the transition from Bing Maps to Azure Maps in Power BI. This change future-proofs map visuals with richer capabilities and better integration across Microsoft’s ecosystem.

  • Power BI Desktop & Service: The Bing Maps icon will stick around for now, but it’s still on the deprecation path. Start upgrading to Azure Maps if your reports aren’t used in China, Korea, or government clouds.
  • Paginated Reports: Phase 1 of the migration to Azure Maps is complete in Power BI Report Builder (PBIRB). Phase 2, the service-side migration, wraps up by mid-November 2025.

Paginated report authors who still rely on Bing Maps can temporarily switch back while the Azure Maps migration completes. To do so, set the RevertToBingMaps registry key to 1 under:

Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft Power BI Report Builder

If the Microsoft Power BI Report Builder folder doesn’t exist, simply create it before adding the key. This option allows continued authoring with Bing Maps until both migration phases are fully rolled out.
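If you'd rather script the change, a small .reg file can set the key. This is a sketch that assumes a DWORD value (the post doesn't state the value type); back up the registry before importing, and remove the key once both migration phases are rolled out:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft Power BI Report Builder]
"RevertToBingMaps"=dword:00000001
```

Importing the file creates the folder automatically if it doesn't already exist.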


Copilot and AI: Let Copilot Write Your DAX

DAX just got way less intimidating. The new Copilot-powered DAX query view (now Generally Available) lets you describe the insights you need in plain English and Copilot writes the DAX for you. You can edit the query conversationally, test it live, and learn as you go.

Copilot is context-aware: it leverages your semantic model and understands relationships, measures, and hierarchies. Whether you’re exploring data in the DAX query view, chatting with your data, or working with semantic model–backed data agents, Copilot is ready to help.

To learn more, refer to the Write DAX queries with Copilot documentation, and try it out today!
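To give a feel for the output, a prompt like “show me the top 5 products by total sales” might produce a query along these lines (illustrative only; the table and column names here are assumptions, not from the post):

```dax
EVALUATE
TOPN (
    5,
    SUMMARIZECOLUMNS (
        'Product'[Product Name],
        "Total Sales", SUM ( Sales[Amount] )
    ),
    [Total Sales], DESC
)
```

You can run the generated query directly in the DAX query view and keep iterating on it conversationally.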


Reporting: Visual Power and Presentation Control

Button Slicer (GA) — More Interaction, Less Effort

The Button Slicer graduates from preview with new cross-highlighting and Auto Grid features.

  • Cross-highlighting: Like the Chiclet slicer, it dims unrelated visuals and highlights related ones, helping you instantly spot correlations.
  • Auto Grid: Automatically arranges rows and columns to fit your layout. No more pixel-perfect wrestling.

Accessibility improvements make it more inclusive, and it’s now your go-to slicer for modern report interactivity.


Visual Calculations in Embedded Reports (Preview)

Embedded analytics gets a boost: visual calculations are now supported in “Embed for your customers” scenarios.
You can define calculations directly in visuals without touching the data model or writing DAX. This means ISVs and developers can deliver faster, dynamic insights in embedded Power BI solutions.
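For a flavor of what this enables, a visual calculation is a single DAX expression evaluated against the data already in the visual, so even a one-liner adds value. A hypothetical example (the field name is an assumption):

```dax
Running total = RUNNINGSUM ( [Sales Amount] )
```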


Grow to Fit — Tables that Auto-Resize Gracefully

A small change with a big impact: Power BI tables now support “Grow to Fit”, automatically distributing unused space across columns. It’s the polish your reports deserve — clean, balanced, and professional, no manual resizing required.


Power BI Controller for PowerPoint (Preview)

Say goodbye to repetitive manual updates in slide decks. The new Power BI Controller add-in for PowerPoint acts as a central control panel to manage all embedded Power BI visuals across multiple slides.

Bulk refresh, update, or replace visuals in one shot. Simply install the add-in from the PowerPoint ribbon, open the command pane, and let the Controller handle the heavy lifting.


Performance Analyzer in Web

The Performance Analyzer pane, long loved in Desktop, is now available in the web experience. You can finally measure the load times of visuals directly where your users view them.

Rollout hits all tenants within weeks.


Data Connectivity: Export Query Results (Preview)

This one’s huge for data engineers and Fabric users. Export Query Results lets you take transformed Power Query data and export it directly to Dataflows Gen2, Lakehouses, or OneLake — all from Power BI Desktop.

Why It Matters

  • No more copy-paste or third-party hacks.
  • Seamless handoff from analyst prep to enterprise storage.
  • Streamlined refresh, monitoring, and interoperability across Fabric workloads.

How to Try It

  1. Go to File → Options → Preview Features → Export Queries from Power Query.
  2. In Power Query Editor, open Export Query Results.
  3. Choose a Fabric destination and credentials, then export.

Result: A Fabric Dataflow Gen2 is created automatically in your workspace — ready for refresh and governance.

Open the Export query results feature

After enabling the preview:

  • Open Power Query Editor from the ribbon (Transform) or the query menu (Edit Query).
  • In the ribbon, select Export Query Results.

Check Fabric access

To use this feature:

  • You need access to Microsoft Fabric.
  • Your My Workspace must be assigned to a Fabric or trial capacity.
  • If these conditions aren’t met, you’ll see an error message.

Options:

  • Sign up for a Fabric trial and assign your workspace.
  • Assign My Workspace to a Fabric capacity in your organization.
  • If you can’t enable Fabric, contact your Fabric admin.

Select a destination

Once your workspace is assigned to Fabric, choose an online destination:

  • Supported destinations match those in Fabric Dataflows.
  • You can also pick existing OneLake destinations.
Enter credentials

After selecting a destination:

  • Enter your credentials.
  • Select Connect, then Choose.

Confirm and export

You see a summary screen:

  • A Fabric Dataflow Gen2 is created in My Workspace.
  • You can rename the dataflow but can’t change the workspace yet.
  • Expand Manage Connections to check or fix query connections.
  • When ready, select Export.
  • The next screen shows export progress.
  • After completion, review your dataflow in Fabric.
  • If errors occur, open the dataflow in Fabric and run it to debug.

Platform Support: Power BI Desktop on ARM

Power BI Desktop now runs natively on ARM-based Windows devices (with KB5065789 installed).
That means better performance, lower battery usage, and optimized experience for lightweight, energy-efficient devices — ideal for users working on the go.

This move aligns with Microsoft’s broader ecosystem shift toward ARM and ensures Power BI stays modern, fast, and portable.


Visualizations: Fresh, Functional, and Fun

The marketplace is buzzing this month — here are the standouts:

Sankey Chart by Powerviz

Flow your data like never before. The Sankey Chart visual adds vertical/horizontal orientations, multi-level support, color-blind palettes, conditional formatting, and even image labels. Perfect for customer journeys, expense flows, and supply chains.


Your Timeline Slicer

Microsoft’s compact, modern timeline visual saves up to 60% dashboard space while introducing Current Period and Latest Available filters. It’s sleek, adaptive, and made for executive dashboards.


Drill Down Scatter PRO by ZoomCharts

Intuitive data exploration meets smartphone-like UX: zoom, pinch, click to drill down. Great for interactive scatter plots with regression lines, category clusters, and visual storytelling.


Multiple Sparklines

Build a complete Income Statement in one visual with trend, comparison, and benchmark columns. Clean, compact, and storytelling-ready.


Performance Bar by JTA

Minimalist progress bars with optional total bars, dynamic labels, and animations. Ideal for tracking KPIs in a sleek way.


Financial Reporting Matrix v8.2

Adds RowHeaderParent(), Invert Sign Factor, conditional formatting enhancements, pinned columns, and comment support (even exporting comments to Excel!). It’s the accountant’s new best friend.


BI Pixie by DataChant

A robust governance and adoption tracker now measuring six dimensions from Adoption to Security. BI Pixie audits Power BI usage and detects RLS misconfigurations, helping orgs boost ROI on analytics.


Closing Thoughts

October 2025 proves that Power BI isn’t slowing down, it’s accelerating into an era of AI-assisted modeling, Fabric-integrated data mobility, and cross-platform performance.
Copilot is rewriting DAX (literally), the Button Slicer is smarter than ever, and Power BI is now comfortably running on ARM because analytics should be as agile as the people using it.


Final Takeaway

If you’re building reports or managing analytics platforms:

  • Start transitioning your maps to Azure Maps.
  • Enable Copilot for DAX and Export Query Results.
  • Experiment with the Power BI Controller for PowerPoint for presentation efficiency.
  • Explore the new visuals, they’re not just eye candy; they simplify storytelling.

Thank you for stopping by. ✌️

Source: Power BI October 2025 Feature Summary

Power BI – Restore Datasets to new on-premise Gateway when old Gateway has failed or Recovery Key is lost

The Power BI on-premises data gateway is a service running on a Windows server that acts as a bridge between the Power BI cloud service and your on-premises data sources.

Setting up an on-premises data gateway is a fairly straightforward process. But there are situations where your gateway fails because of a hardware fault or a bad update, or where you want to move the gateway instance to a new server. Then you realize you need the recovery key, which is nowhere to be found.

Without a functioning gateway, reports and dashboards in the Power BI cloud service whose datasets connect to on-premises data sources will fail to refresh, and the data will go stale. In this post, I’ll elaborate on this issue and show how to restore datasets from an old or failed on-premises gateway to a new gateway.

I faced a similar scenario recently and it was a great learning experience. There are a few methods you can use to resolve this issue, and I’ll try to cover them all in as much detail as possible.

Manual Method

Well.. this method works if you don’t have too many on-premises data sources, or if you are just after a quick fix (maybe because someone important in your organization needs this fixed and notified you an hour before their big meeting).

Here are the high-level steps,

Once you install and configure the data gateway, you can see and manage both the old and new instances from the Power BI portal.

To add a user as an admin to the gateway in the portal, follow the steps below.


Search for the user using their username or email address, select Admin, and click Share.


Now to add a new data source, from the page header in the Power BI service, select Settings gear icon > Manage gateways.

I have highlighted my failed gateway and the new gateway server in my case.

Your next step is to determine the data source of the affected dataset. To get this information, you’ll need access to the workspace. As you can see, I have a report named ‘AdventureWorksProducts‘ and the underlying dataset with the same name.

Under the Gateway connection section, you’ll find the necessary information to set up the data source in the new gateway.

Back on the Manage gateways page, in the Data sources tab, click New.

Choose the new gateway to create the connection on, then select the Data source type. In my scenario, I picked SQL Server.

Once you provide all the information, select Create. You’ll see “Created New data source” if the process succeeds, along with a new data source entry like in the screenshot below.

If you’ve made it this far, you are almost at the end of this method. Now head back to the dataset’s settings like we did earlier, and on to the Gateway connection section. As a reminder, you’ll need access (Admin, Member, or Contributor) to the workspace and to the dataset, and keep in mind that you also need admin permissions on the gateway.

You should see the new data source we created listed. Select it from the drop-down and click Apply.

That should take care of the connection. To confirm, refresh your dataset and make sure the connection works.

Like I said earlier, this method is fine in a small environment, or if you are in a hurry to get things fixed and want to worry about the bulk of the work later. I’ll cover the semi-automated way in the coming sections. I use the word automated loosely here; it’s more like fewer clicks and less moving around in the Power BI portal.

Using a Service Account

In this method, I’m using a service account — in other words, a regular user account without any roles assigned to it. This can be an AD-synced account or an Azure AD cloud-only account. The account will need a Power BI Pro license assigned to it.

Here are the high-level steps,

I’ve already covered adding the data source to the gateway in the earlier section, and the process is the same for this method too. You can do it with PowerShell or the REST APIs, but I don’t believe there is a way to copy data sources from one gateway to another.

Permissions

In this method, I’m using a service account which was granted Admin permissions for the gateways and set as Owner on the data source. You should be able to get away with just having the account set as user on the data source. This service account is also set as Admin on the workspace but Member or Contributor should do the trick.

You can grant the gateway admin permission in the portal which I’ve covered in the earlier method or use the below script to add the user as admin.

# Sign in and grab a token for the data gateway service
Connect-AzureAD
Connect-DataGatewayServiceAccount
Get-DataGatewayAccessToken

# List gateway clusters, then capture the target gateway and the user to add
Get-DataGatewayCluster
$gw = Read-Host "Enter Gateway ID"
$user = Read-Host "Enter username to be added as gateway admin"
$userToAdd = (Get-AzureADUser -Filter "userPrincipalName eq '$user'").ObjectId

# Find the region where IsDefaultPowerBIRegion is true, then grant Admin
Get-DataGatewayRegion
$Region = Read-Host "Enter region value where IsDefaultPowerBIRegion is set to true"
Add-DataGatewayClusterUser -GatewayClusterId $gw -PrincipalObjectId $userToAdd -AllowedDataSourceTypes $null -Role Admin -RegionKey $Region

With all these permissions, the service account still needs to take ownership of the dataset to finish rebinding the data source to the new gateway. You won’t have to take ownership manually; the script below will do it for you on the dataset you specify.

Rebind dataset

Before proceeding further, make sure you have the Microsoft Power BI Cmdlets for PowerShell installed, and log in to the Power BI service using PowerShell:

Connect-PowerBIServiceAccount
Get-PowerBIAccessToken

I don’t do Power BI administration on a daily basis and there was a learning curve for me to understand the inner workings. Here is the thought process that went into building this script.

  1. Get all the gateways the service account has access to
    • Using the output, determine and copy the new gateway ID and store it in a variable
  2. Using the variable from earlier step, return a list of data sources from the new gateway
    • Using the output, determine and copy the data source ID where the affected dataset should be mapped to and store it in a variable
  3. Return a list of workspaces the user has access to
    • Using the output, determine and copy the workspace ID which has the affected dataset
  4. Using the variable from earlier step, return list of datasets from the specified workspace
    • Using the output, determine and copy the affected dataset’s ID
  5. Using the variable from step 3 and step 4, transfer ownership over the specified dataset to the service account
  6. Using variables from steps 1, 2, 3 and 4, bind the specified dataset from the specified workspace to the new gateway

$gateways = Invoke-PowerBIRestMethod -Url "gateways" -Method Get | ConvertFrom-Json
$gateways.value
Write-Host "Please copy the new Gateway ID from above output" -ForegroundColor Red
$newGWID = Read-Host "Please paste the new Gateway ID"

$GWdatasources = Invoke-PowerBIRestMethod -Url "gateways/$($newGWID)/datasources" -Method Get | ConvertFrom-Json
$GWdatasources.value
Write-Host "Please note down the Data Source ID used by the dataset that needs to be migrated from above output" -ForegroundColor Red
$datasourceObjectIds = Read-Host "Please paste the Data source ID"

$ws = Invoke-PowerBIRestMethod -Url 'groups' -Method Get | ConvertFrom-Json
$ws.value
Write-Host "Please note down the Workspace ID which has the dataset that needs to be migrated from above output" -ForegroundColor Red
$wsID = Read-Host "Please paste the Workspace ID"

$dataset = Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets" -Method Get | ConvertFrom-Json
$dataset.value
Write-Host "Please note down the dataset ID that needs to be migrated from above output" -ForegroundColor Red
$dsID = Read-Host "Please paste the dataset ID"

# The line below is not needed if the service account already has ownership of the dataset, and is safe to comment out
Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.TakeOver" -Method POST

try { $body = "{
  'gatewayObjectId': '$newGWID',
  'datasourceObjectIds': [
    '$datasourceObjectIds'
  ]
}"

Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.BindToGateway" -Body $body -Method POST
Write-Host "Dataset updated" }

catch {
  Write-Host "An error occurred"
}
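For readers more comfortable outside PowerShell, the two REST calls at the heart of this script (Default.TakeOver and Default.BindToGateway) are plain POSTs. Here is a minimal sketch in Python that builds the same URLs and payload; the workspace, dataset, gateway, and data source IDs are placeholders you supply, and you still need a bearer token for the Power BI API:

```python
import json

API = "https://api.powerbi.com/v1.0/myorg"

def takeover_url(workspace_id: str, dataset_id: str) -> str:
    """URL for transferring dataset ownership to the caller."""
    return f"{API}/groups/{workspace_id}/datasets/{dataset_id}/Default.TakeOver"

def bind_url(workspace_id: str, dataset_id: str) -> str:
    """URL for rebinding the dataset to a new gateway."""
    return f"{API}/groups/{workspace_id}/datasets/{dataset_id}/Default.BindToGateway"

def bind_body(gateway_id: str, datasource_ids: list) -> str:
    """JSON payload expected by Default.BindToGateway."""
    return json.dumps({
        "gatewayObjectId": gateway_id,
        "datasourceObjectIds": datasource_ids,
    })

# Example with placeholder IDs; POST these with any HTTP client and an
# 'Authorization: Bearer <token>' header for the Power BI API.
body = bind_body("<new-gateway-id>", ["<datasource-id>"])
```

Using json.dumps here also sidesteps the single-quoted JSON the PowerShell here-string produces, which the API happens to tolerate but which isn’t strictly valid JSON.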

You can adjust this script according to your needs as in some instances, your gateway ID, new data source ID and workspace ID will be the same, only the affected dataset ID will vary.

Using a Service Principal

In this method, I’m using a service principal to accomplish the same as above. One added advantage of using this method is, the Power BI Dataset can be setup to refresh without an actual user account. This would be great from an automation point of view and to avoid being tied to a specific user.

Here are the high-level steps,

Create SPN

The az ad app command is part of the Azure CLI, not a PowerShell cmdlet. You’ll need to have the Azure CLI installed and run az login before using it.

Connect-AzureAD 
Connect-AzAccount
az login

You can create an Azure AD application, which will be the service principal, from the portal and grant it the ‘Dataset.ReadWrite.All’ API permission, or use the lines below to create it. I’ve detailed how to determine the API ID and Permission ID in this blog post here.

A new Azure AD group is also needed, and the Azure AD application has to be made a member of this group. The lines below will accomplish that; if you have an existing group in mind, you can use that too. I’ll go over the reason for creating this group later in this section.

# Create the Azure AD application and grant the Power BI service API
# the Dataset.ReadWrite.All delegated permission
$appname = Read-Host "Enter a name Azure AD Application's Display Name"
$ObjID = New-AzureADApplication -DisplayName $appname | Select ObjectId
Add-AzADAppPermission -ObjectId $ObjID.ObjectId -ApiId 00000009-0000-0000-c000-000000000000 -PermissionId 322b68b2-0804-416e-86a5-d772c567b6e6 -Type Scope
Start-Sleep -Seconds 60
az ad app permission admin-consent --id $ObjID.ObjectId
Get-AzureADApplication -Filter "DisplayName eq '$appname'" | fl

# Create the group and add the application's service principal to it
$grpName = Read-Host "Enter a name for new Azure AD group"
$grpID = (New-AzureADGroup -DisplayName $grpName -MailEnabled $false -SecurityEnabled $true -MailNickName "NotSet").ObjectId
Get-AzureADGroup -ObjectId $grpID
$spnToAdd = (Get-AzADServicePrincipal -DisplayName $appname).Id
Add-AzureADGroupMember -ObjectId $grpID -RefObjectId $spnToAdd
Get-AzureADGroupMember -ObjectId $grpID
Get-AzureADGroupMember -ObjectId $grpID

The Get-AzureADApplication cmdlet will list the API permissions we applied. This can be verified in the ‘App registrations‘ blade of the Azure AD portal too.

Create a new secret in this Azure AD application. You can also achieve this by using PowerShell. The secret value is needed for authentication when running the script later in this section.

Remember to copy the secret value, as it’ll be masked forever afterwards.

We can also verify the group we created and its membership. I named the group ‘PBI-API‘ in Azure AD.

For an Azure AD app to be able to access the Power BI content and APIs, the following settings need to be enabled in Azure AD portal. This is where the Azure AD group comes into play.

Go to Tenant settings in the Power BI Admin portal, and scroll down to Developer settings

  • Enable the Allow service principals to use Power BI APIs
  • Enable the Allow service principals to create and use profiles

Create SPN profile

I noticed that in one instance the SPN approach worked without a service principal profile having been created. Profiles can be created using the Profiles REST API, and I’ve included the lines below, which will create a profile for the SPN.

$prof = Read-Host "Enter a name for SPN's profile"

$body = "{
    'displayName' : '$prof'
}"

Invoke-PowerBIRestMethod -Url 'https://api.powerbi.com/v1.0/myorg/profiles' -Body $body -Method POST

A service principal can also call GET Profiles REST API to get a list of its profiles.

Invoke-PowerBIRestMethod -Url 'profiles' -Method Get

Permissions

Next, the service principal needs permissions on the dataset. We can achieve this by granting permissions to the service principal on the workspace.

Note: Adding the Azure AD group that has the SPN as a member doesn’t work.

This next step is kind of where things get tricky.

What are we trying to achieve here?

  • Grant the service principal, admin permissions on the new gateway
  • Grant the service principal, user permissions on the gateway data source

The reason it is tricky is that I first tried granting the above permissions to the Azure AD group. The portal allowed me to add it, but the script that comes later in this section didn’t work as expected. Based on further research, I realized that the SPN needs to be granted this access directly rather than through the Azure AD group. Also, at the time of writing this post, granting an SPN these permissions through the portal is not supported. Hence, we’ll have to use the PowerShell cmdlets.

Before proceeding further, please connect to Az and the Power BI service using the cmdlets below:

Connect-AzAccount
Connect-PowerBIServiceAccount
Get-PowerBIAccessToken

The script below will add the permissions I mentioned above and display them at the end of executing the cmdlets. One good thing about granting permissions to the gateway, data sources, and workspaces is that it is a one-time deal.

# List gateway clusters, then capture the target gateway and the SPN to add
Get-DataGatewayCluster
$gw = Read-Host "Enter Gateway ID"
$spn = Read-Host "Enter App name to be added as gateway admin"
$spnToAdd = (Get-AzADServicePrincipal -DisplayName $spn).Id

# Grant the SPN Admin on the gateway cluster in the default Power BI region
Get-DataGatewayRegion
$Region = Read-Host "Enter region value where IsDefaultPowerBIRegion is set to true"
Add-DataGatewayClusterUser -GatewayClusterId $gw -PrincipalObjectId $spnToAdd -AllowedDataSourceTypes $null -Role Admin -RegionKey $Region
Get-DataGatewayCluster -GatewayClusterId $gw | Select -ExpandProperty Permissions | ft

# Grant the SPN Read access on the specific gateway data source
Get-DataGatewayClusterDatasource -GatewayClusterId $gw
$gwDSID = Read-Host "Enter Gateway Cluster DatasourceId"
Add-DataGatewayClusterDatasourceUser -GatewayClusterId $gw -GatewayClusterDatasourceId $gwDSID -DatasourceUserAccessRight Read -Identifier $spnToAdd
Get-DataGatewayClusterDatasourceUser -GatewayClusterId $gw -GatewayClusterDatasourceId $gwDSID

With all the permissions for the SPN now in place, we are ready to take ownership of the affected dataset in the workspace and bind it to the new data source on the new gateway.

Rebind dataset

In this SPN method, instead of logging in with a username and password, you’ll log in with the Application ID and secret:

$Tenant = Read-Host "Enter Azure AD Tenant ID"
Connect-PowerBIServiceAccount -Tenant $Tenant -ServicePrincipal -Credential (Get-Credential) #user = Application (client) ID | Password is the secret value we created earlier in this section
Get-PowerBIAccessToken

The script is pretty much the same as in the earlier section, but runs in the SPN context.

$gateways = Invoke-PowerBIRestMethod -Url "gateways" -Method Get | ConvertFrom-Json
$gateways.value
Write-Host "Please copy the new Gateway ID from above output" -ForegroundColor Red
$newGWID = Read-Host "Please paste the new Gateway ID"

$GWdatasources = Invoke-PowerBIRestMethod -Url "gateways/$($newGWID)/datasources" -Method Get | ConvertFrom-Json
$GWdatasources.value
Write-Host "Please note down the Data Source ID used by the dataset that needs to be migrated from above output" -ForegroundColor Red
$datasourceObjectIds = Read-Host "Please paste the Data source ID"

$ws = Invoke-PowerBIRestMethod -Url 'groups' -Method Get | ConvertFrom-Json
$ws.value
Write-Host "Please note down the Workspace ID which has the dataset that needs to be migrated from above output" -ForegroundColor Red
$wsID = Read-Host "Please paste the Workspace ID"

$dataset = Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets" -Method Get | ConvertFrom-Json
$dataset.value
Write-Host "Please note down the dataset ID that needs to be migrated from above output" -ForegroundColor Red
$dsID = Read-Host "Please paste the dataset ID"

Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.TakeOver" -Method POST

try { $body = "{
  'gatewayObjectId': '$newGWID',
  'datasourceObjectIds': [
    '$datasourceObjectIds'
  ]
}"

Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.BindToGateway" -Body $body -Method POST
Write-Host "Dataset updated" }

catch {
  Write-Host "An error occurred"
}

Similar to the earlier section, you can adjust this script according to your needs as in some instances, your gateway ID, new data source ID and workspace ID will be the same, only the affected dataset ID will vary.

Needless to say, you can test if this was successful by doing a ‘Refresh now‘ on the dataset.

Issues you may encounter and How to fix it

Issue: You may encounter below status codes while running the Invoke-PowerBIRestMethod

Response status code : 404 (Not Found)
Response status code : 400 (Bad Request)

Fix or workaround: Well.. if you’ve already browsed through community.powerbi.com, then you might have realized that you are not alone in dealing with these error codes. Usually this means you are requesting data from a Power BI REST API endpoint that doesn’t exist, or that you (or the SPN) requesting the resource don’t have the necessary permissions to it. The best way to troubleshoot is to run these requests one at a time to determine where it is failing, or which resource you don’t have permissions to.

Issue: Applied permissions don’t reflect in the portal

Fix or workaround: I noticed that some of the changes take time to propagate. Give it a few minutes before you go changing more things and lose track of everything you’ve changed in the process. If the permissions still don’t show up after a while, use the PowerShell cmdlets to verify whether the permissions you set were applied.

I’ll keep experimenting with other scenarios and will update this post with any issues I come across later on.

This was one of those really lengthy posts but hey..as long as there is a solution at the end..Hopefully..am I right?..😁🤷‍♂️

Thank you for stopping by.✌