Understanding the billing account for Microsoft Customer Agreement (MCA)

Let’s be honest—cloud billing isn’t exactly the most exciting topic. But do you know what’s worse? Opening your Azure bill and feeling like you need a detective’s magnifying glass to figure out what’s going on. 🤯

If you’ve got a Microsoft Customer Agreement (MCA), understanding your billing account is key to keeping your cloud costs in check and avoiding any surprise charges. So, grab a coffee, and let’s break it down in a way that actually makes sense. ☕

What is an MCA Billing Account (And Why Should You Care)?

Think of your MCA billing account as the command center for all your Azure charges. It’s where you manage invoices, payments, and who gets to see (or mess with) your billing details. If Azure billing were a Netflix account, your billing account would be the primary profile—the one that controls everything.

Key Things Your MCA Billing Account Lets You Do:

  • View and manage invoices and payment methods 🧾
  • Set up multiple billing profiles for different teams or departments 🏢
  • Assign roles and permissions (so not everyone can max out the budget!)
  • Track spending across subscriptions 💰

If you’re managing an MCA billing account, congrats! You’ve got the keys to the financial kingdom—use them wisely.

Azure Billing Account: The Big Picture 🎯

Your Azure Billing Account is the home base for all things billing-related in your MCA. It’s where invoices, payments, and spending details live. If you think of Azure like a streaming service, your billing account is your main subscription—everything starts from here.

What You Can Do with an Azure Billing Account:

  • View and manage invoices 🧾
  • Set up and control billing profiles 💳
  • Assign billing roles to different users 👥
  • Track spending across all subscriptions 💰

This is your financial cockpit—control it wisely!

Billing Profiles: Keeping Budgets Organized 🏢

A Billing Profile is like a separate tab on your credit card statement for different teams, projects, or departments. Instead of one giant invoice that makes your head spin, you can split up costs for better organization.

Why Billing Profiles Matter:

  • They generate separate invoices for different teams.
  • You can set up different payment methods for each profile.
  • They help track spending more effectively.

So, if your company has an AI research team and a DevOps team, they can each have their own billing profile—no messy financial mix-ups!

Invoice Sections: Breaking Down Costs Clearly 📄

Under each Billing Profile, you have Invoice Sections. Think of these as subfolders inside your billing profiles—perfect for breaking down costs by project, department, or even specific environments (like Dev vs. Production).

How Invoice Sections Help:

  • You can group charges logically (e.g., marketing vs. engineering).
  • It makes cost tracking super clear.
  • Helps with financial reporting—no more guessing where money went!

If Billing Profiles are the different tabs on your statement, Invoice Sections are like itemized charges—they give you a clearer breakdown.

Subscriptions: Where the Magic Happens

Your Azure Subscriptions are where your actual cloud services live—virtual machines, databases, AI services, you name it. But each subscription needs to be linked to a Billing Profile to be paid for.

Key Things to Know About Subscriptions:

  • They inherit billing settings from their assigned billing profile.
  • You can have multiple subscriptions under one billing account.
  • Each subscription can be assigned to an Invoice Section for better tracking.

Think of it like multiple mobile lines on a family plan. Each line (subscription) has its own usage, but they all roll up into the main bill (billing profile).
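
If you want to see this hierarchy for your own tenant, the Az.Billing PowerShell module can walk it for you. Here's a rough sketch, assuming the Az modules are installed and your account has at least read access on the MCA billing account; cmdlet and property names follow the Az.Billing module, so adjust for your environment:

# Sign in and walk the MCA billing hierarchy: billing account -> billing profile -> invoice section
Connect-AzAccount

$account = Get-AzBillingAccount | Select-Object -First 1
Get-AzBillingProfile -BillingAccountName $account.Name | ForEach-Object {
    $bp = $_
    Get-AzBillingInvoiceSection -BillingAccountName $account.Name -BillingProfileName $bp.Name |
        Select-Object @{n='BillingProfile';e={$bp.DisplayName}}, DisplayName, Name
}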

Optimizing and Tracking Azure Costs

To effectively manage and optimize your Azure expenditures, consider the following practices:

  • Strategic Structuring: Align your billing profiles and invoice sections with your organization’s hierarchy or project structure. This alignment ensures that invoices reflect your internal financial organization, simplifying reconciliation and reporting.
  • Role-Based Access Control: Assign appropriate roles to team members based on their responsibilities. Azure offers various billing roles, such as Billing Account Owner, Billing Profile Owner, and Invoice Section Owner, each with specific permissions. Implementing role-based access ensures that individuals have the necessary access to perform their tasks without compromising security.
    • Billing Account Owner – The supreme leader of the billing universe. Full access.
    • Billing Profile Owner – Controls billing for one profile (but not the entire account).
    • Billing Profile Contributor – Can manage invoices and payments but not assign roles.
    • Billing Reader – Can see invoices but can’t touch them (great for finance teams!).
  • Regular Monitoring: Utilize Azure’s cost management tools to monitor spending across different billing profiles, invoice sections, and subscriptions. Regular analysis helps in identifying trends, detecting anomalies, and making data-driven decisions to optimize costs.
  • Budgeting and Alerts: Set up budgets and configure alerts for your billing profiles and invoice sections. Proactive notifications enable you to address potential overspending promptly, ensuring adherence to financial plans (see the sketch below).
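
As a minimal example, here's a budget-with-alert sketch using the Az.Consumption module. Note that this creates a budget at subscription scope; budgets for MCA billing profiles and invoice sections can be created from Cost Management in the portal. The name, amount, and e-mail address are placeholders:

# Assumes Connect-AzAccount has been run and the target subscription is selected
New-AzConsumptionBudget -Name "monthly-cloud-budget" -Amount 5000 -Category Cost -TimeGrain Monthly `
    -StartDate (Get-Date -Day 1).Date -EndDate (Get-Date -Day 1).Date.AddYears(1) `
    -ContactEmail "finops@contoso.com" `
    -NotificationKey "Alert80Percent" -NotificationEnabled -NotificationThreshold 80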

Pro Tips to Avoid Billing Headaches

  1. Assign Roles Wisely – Not everyone needs full access! Keep spending power in the right hands.
  2. Use Billing Profiles for Better Organization – Split billing by department or project to track spending easily.
  3. Enable Cost Management Tools – Azure has built-in cost tracking to help you avoid end-of-month surprises.
  4. Regularly Review Invoices – Set up a habit of checking your invoices to catch any unexpected charges.

Final Thoughts: Take Control of Your Azure Billing 💡

Understanding your MCA Billing Account isn’t just about paying bills—it’s about controlling costs, organizing expenses, and making sure your finance team doesn’t hunt you down. 😅

So next time you log into Azure, don’t panic at your invoice. Instead, think:

  • Is my billing organized?
  • Am I using Billing Profiles and Invoice Sections properly?
  • Do I need to adjust roles to keep spending in check?

Thanks for stopping by. ✌

Guide to Azure Private Endpoint vs Service Endpoint

In the realm of Azure networking, two pivotal features enhance the security and accessibility of your resources: Azure Private Endpoints and Azure Service Endpoints. Understanding their functionalities and differences is crucial for architecting secure and efficient cloud solutions.

Azure Service Endpoints

Azure Service Endpoints extend your virtual network’s identity to Azure services over the Azure backbone network. When a service endpoint is enabled on a subnet, traffic from that subnet to the Azure service remains within Microsoft’s network, reducing exposure to the public internet.

Key Features:

  • Simplified Security: Service endpoints allow you to secure Azure resources to specific virtual networks, enhancing control over which subnets can access particular services.
  • Optimized Routing: Traffic is routed directly through the Azure backbone, potentially reducing latency compared to routes over the public internet.
  • Integration with Network Security Groups (NSGs): You can leverage NSGs to control access, ensuring that only designated subnets or virtual networks can communicate with specific services.
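
As a rough sketch of what this looks like with Az PowerShell (the resource group, VNet, subnet, and storage account names here are made up), you would enable a Microsoft.Storage service endpoint on a subnet and then lock the storage account down to that subnet:

# Enable the Microsoft.Storage service endpoint on an existing subnet
$vnet = Get-AzVirtualNetwork -Name "vnet-prod" -ResourceGroupName "rg-network"
Set-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet -Name "snet-app" `
    -AddressPrefix "10.0.1.0/24" -ServiceEndpoint "Microsoft.Storage" | Out-Null
$vnet = $vnet | Set-AzVirtualNetwork

# Allow only that subnet to reach the storage account
$subnetId = (Get-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet -Name "snet-app").Id
Add-AzStorageAccountNetworkRule -ResourceGroupName "rg-storage" -Name "stcontoso" -VirtualNetworkResourceId $subnetId
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "rg-storage" -Name "stcontoso" -DefaultAction Deny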

Considerations:

  • Public Endpoint Usage: Despite routing over the Azure backbone, service endpoints connect to the service’s public endpoint, which may not meet stringent security requirements.
  • Azure-Only Access: Service endpoints are designed for traffic originating within Azure. On-premises resources cannot utilize service endpoints and must access services over the public internet.

Azure Private Endpoints

Azure Private Endpoints assign a private IP address from your virtual network to an Azure service, effectively bringing the service into your private address space. This setup ensures that traffic between your virtual network and the service remains entirely within the Azure network, eliminating exposure to the public internet.

Key Features:

  • Private IP Connectivity: Services are accessible via a private IP address within your virtual network, ensuring that all traffic stays within the private network.
  • Enhanced Security: By eliminating public internet exposure, private endpoints are ideal for sensitive data and applications requiring stringent security measures.
  • DNS Integration: Private endpoints require proper DNS configuration to resolve the service’s private IP address. Azure provides automatic DNS resolution, but custom configurations are also supported.

Considerations:

  • Complexity and Cost: Implementing private endpoints can be more complex and may incur additional costs due to the need for DNS configuration and management of private IP addresses.
  • Broader Access: Private endpoints allow access from on-premises networks and other virtual networks, provided they are connected, facilitating hybrid cloud architectures.
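
By way of comparison, here's a minimal private endpoint sketch with Az PowerShell, again with hypothetical resource names, for a storage account's blob service. Keep in mind you would also typically link a privatelink.blob.core.windows.net private DNS zone to the VNet so the storage account name resolves to the private IP:

# Grab the subnet that will host the private endpoint
$vnet = Get-AzVirtualNetwork -Name "vnet-prod" -ResourceGroupName "rg-network"
$subnet = Get-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet -Name "snet-private-endpoints"

# Connection to the storage account's blob sub-resource
$storageId = (Get-AzStorageAccount -ResourceGroupName "rg-storage" -Name "stcontoso").Id
$conn = New-AzPrivateLinkServiceConnection -Name "plsc-stcontoso-blob" -PrivateLinkServiceId $storageId -GroupId "blob"

# Create the private endpoint; DNS (privatelink.blob.core.windows.net) is configured separately
New-AzPrivateEndpoint -Name "pe-stcontoso-blob" -ResourceGroupName "rg-network" -Location "eastus" `
    -Subnet $subnet -PrivateLinkServiceConnection $conn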

Comparison of Azure Service Endpoints and Private Endpoints

  • Connection Type – Service Endpoints: extend your VNet identity to the service’s public endpoint over the Azure backbone. Private Endpoints: assign a private IP from your VNet to the service.
  • Security Level – Service Endpoints: enhanced security by restricting access to specific VNets, but the public endpoint is still used. Private Endpoints: highest security, with no exposure to the public internet.
  • DNS Configuration – Service Endpoints: no changes required; public DNS is used. Private Endpoints: require DNS updates to resolve private IPs.
  • Access Scope – Service Endpoints: only from within Azure VNets. Private Endpoints: accessible from on-premises and other VNets via private IP.
  • Supported Services – Service Endpoints: limited to specific Azure services. Private Endpoints: supported by a broader range of Azure and third-party services.
  • Use Case – Service Endpoints: suitable where enhanced security is needed without complex setup. Private Endpoints: ideal for sensitive data and applications requiring complete isolation.

Choosing Between Service Endpoints and Private Endpoints

  • Opt for Service Endpoints if:
    • You need a straightforward way to enhance security for Azure services accessed from within Azure.
    • Your applications do not require complete isolation from the public internet.
    • You prefer minimal configuration without the need for DNS management.
  • Opt for Private Endpoints if:
    • Your applications handle sensitive data necessitating complete isolation from the public internet.
    • You require secure access from on-premises networks or other virtual networks.
    • You are prepared to manage the additional complexity and costs associated with private IP configurations and DNS management.

In summary, both Azure Service Endpoints and Private Endpoints serve to secure access to Azure services, but they cater to different security requirements and use cases. Assess your application’s specific needs to determine the most appropriate solution.

Azure Role-Based Access: Who’s Got the Keys to the Cloud Castle?

Alright, let’s talk about Azure Role-Based Access Control (RBAC)—the bouncer at the club, the gatekeeper of your cloud kingdom, the difference between “Oops, I deleted the production database” and “Phew, good thing I didn’t have permission for that.”

If you’re working with Microsoft Azure, RBAC is a must-know. It’s how you control who can do what in your cloud environment. Let’s break it down in a fun, easy-to-digest way.


What is Azure RBAC, and Why Should You Care?

Think of Azure RBAC like a high-tech office building with keycards. Not everyone should have access to every room, right? Your interns shouldn’t be able to access the CEO’s private office, and the janitor doesn’t need the nuclear launch codes.

RBAC works the same way in Azure:

  • You assign roles to users, groups, or applications instead of just giving them full access.
  • It’s based on the principle of least privilege, meaning people only get access to what they need—nothing more, nothing less.
  • It prevents chaos. Because let’s be real, one accidental click from an over-permissioned user can lead to disaster.

The Three Key Pieces of RBAC

Azure RBAC is built on three main pieces:

  1. Roles: These define what someone can do. Examples:
    • Owner – The boss. Can do anything and everything.
    • Contributor – Can create and manage resources but can’t assign roles.
    • Reader – Can look, but not touch.
    • Custom Roles – If the built-in roles aren’t enough, you can create your own.
  2. Scope: This defines where the role applies. It can be at:
    • Subscription level (the whole kingdom)
    • Resource group level (a city inside the kingdom)
    • Specific resources (a single castle or shop)
  3. Assignments: This is the “who gets what role” part. Assign a user, group, or service principal to a role at a given scope, and boom, permissions granted (see the sketch just below).
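
To make that concrete, here's a minimal sketch with Az PowerShell; the user names and resource names are placeholders:

# Reader on a resource group for a user
New-AzRoleAssignment -SignInName "intern@contoso.com" -RoleDefinitionName "Reader" -ResourceGroupName "rg-analytics"

# Contributor on a single resource, scoped as tightly as possible
$resourceId = (Get-AzResource -ResourceGroupName "rg-analytics" -Name "sqldb-reporting").ResourceId
New-AzRoleAssignment -SignInName "dbadmin@contoso.com" -RoleDefinitionName "Contributor" -Scope $resourceId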

Real-World Example: The Coffee Shop Analogy ☕

Imagine you’re running a coffee shop:

  • The Owner (you) can do everything—order supplies, hire staff, make coffee, or even shut down the store.
  • The Baristas (contributors) can make coffee and manage the store but can’t hire or fire anyone.
  • The Customers (readers) can look at the menu, enjoy their coffee, but they’re not allowed behind the counter.

That’s Azure RBAC in action. Everyone gets access to what they need, but no one is accidentally pressing the “shutdown entire store” button.


Common RBAC Mistakes (And How to Avoid Them)

  1. Giving Everyone Owner or Contributor Roles – That’s like handing out master keys to your entire office. Keep permissions minimal!
  2. Not Using Groups – Assigning roles individually? Big mistake. Use Azure AD groups to manage permissions efficiently (see the sketch after this list).
  3. Ignoring Scope – Always assign roles at the lowest necessary level to avoid over-permissioning.
  4. Forgetting to Review Roles Regularly – People leave jobs, projects change, and roles should be updated accordingly.
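
Here's a quick sketch of the group-based, lowest-scope pattern plus a simple review query; the group and resource group names are placeholders:

# Assign once to an Azure AD group instead of to individual users
$group = Get-AzADGroup -DisplayName "DevOps-Engineers"
New-AzRoleAssignment -ObjectId $group.Id -RoleDefinitionName "Contributor" -ResourceGroupName "rg-devops"

# Review existing assignments at that scope from time to time
Get-AzRoleAssignment -ResourceGroupName "rg-devops" | Select-Object DisplayName, RoleDefinitionName, Scope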

Final Thoughts: Lock It Down, But Keep It Practical

Azure RBAC is all about control, security, and making sure the right people have the right access. It’s not just an IT thing—it’s about keeping your cloud environment safe and sane.

So next time you’re setting up roles in Azure, ask yourself:

  • Does this person really need this level of access?
  • Could I use a lower scope?
  • Am I following best practices?

Get it right, and your cloud stays secure. Get it wrong, and… well, let’s just say you don’t want to be the person who accidentally gives the intern the power to delete the company’s entire infrastructure.

Thank you for stopping by.✌

Power BI – Restore Datasets to new on-premise Gateway when old Gateway has failed or Recovery Key is lost

The Power BI on-premises data gateway is a service running on a Windows server that acts as a bridge between the Power BI cloud service and your on-premises data sources.

Setting up an on-premises data gateway is a fairly straightforward process. But there can be instances where your gateway fails because of a hardware issue or a bad update, or you simply want to move the gateway to a new server, and that's when you realize you need the recovery key, which is nowhere to be found.

Without a functioning gateway, reports and dashboards in the Power BI service whose datasets connect to on-premises data sources will fail to refresh, and the data will go stale. In this post, I'll walk through how to restore datasets from an old or failed on-premises gateway to a new gateway.

I faced a similar scenario recently and it was a great learning experience. There are a few methods you can use to resolve this issue. I'll try to cover them all in as much detail as possible.

Manual Method

Well.. this one is for you if you don't have too many on-premises data sources, or if you just need a quick fix, maybe because someone important in your organization needs this working and they notified you an hour before their big meeting.

Here are the high-level steps,

Once you install and configure the data gateway, you can see and manage both the old and new instances from the Power BI portal.

To add a user as an admin to the gateway in the portal, follow the steps below.


Search for the user using their username or email address, select Admin and click Share.


Now to add a new data source, from the page header in the Power BI service, select Settings gear icon > Manage gateways.

I have highlighted my failed gateway and the new gateway server in my case.

Your next step is to determine the data source of the affected dataset. To get this information, you'll need access to the workspace. As you can see, I have a report named ‘AdventureWorksProducts’ and the underlying dataset with the same name.

Under the Gateway connection section, you'll find the necessary information to set up the data source in the new gateway.

Back on the Manage gateways page, in the Data sources tab, click New.

Choose the new gateway to create the connection on, then select the Data source type. In my scenario, I picked SQL Server.

Once you provide all the information, select Create. If the process succeeds, you'll see a 'Created new data source' confirmation and a new data source entry like in the screenshot below.

If you've made it this far, you are almost at the end of this method. Now head back to the dataset's settings like we did earlier and on to the Gateway connection section. As a reminder, you'll need access (Admin, Member or Contributor) to the workspace and the dataset, and keep in mind that you also need admin permissions on the gateway.

You should see the new data source we created listed. Select it from the drop-down and click Apply.

That should take care of the connection. To confirm, refresh your dataset and make sure everything works.

Like I said earlier, this method is fine for a small environment or if you are in a hurry to get things fixed and worry about the rest later. I'll cover the semi-automated way in the coming sections. I use the word 'automated' loosely here; it's more like fewer clicks and less moving around in the Power BI portal.

Using a Service Account

In this method, I'm using a service account, in other words a regular user account without any roles assigned to it. This can be an AD-synced account or an Azure AD cloud-only account. The account will need a Power BI Pro license assigned to it.

Here are the high-level steps,

I've already covered adding a data source to the gateway in the earlier section, and the process is the same for this method too. You can do it with PowerShell or the REST APIs, but I don't believe there is a way to copy data sources from one gateway to another.

Permissions

In this method, I'm using a service account which was granted Admin permissions on the gateways and set as Owner on the data source. You should be able to get away with just having the account set as a user on the data source. This service account is also set as Admin on the workspace, but Member or Contributor should do the trick.

You can grant the gateway admin permission in the portal, which I covered in the earlier method, or use the script below to add the user as an admin.

# Sign in to Azure AD and to the data gateway service
Connect-AzureAD
Connect-DataGatewayServiceAccount
Get-DataGatewayAccessToken

# List gateway clusters, then pick the new gateway and the user to be added as admin
Get-DataGatewayCluster
$gw = Read-Host "Enter Gateway ID"
$user = Read-Host "Enter username to be added as gateway admin"
$userToAdd = (Get-AzureADUser -Filter "userPrincipalName eq '$user'").ObjectId

# List regions, pick the one where IsDefaultPowerBIRegion is true, then add the user as gateway admin
Get-DataGatewayRegion
$Region = Read-Host "Enter region value where IsDefaultPowerBIRegion is set to true"
Add-DataGatewayClusterUser -GatewayClusterId $gw -PrincipalObjectId $userToAdd -AllowedDataSourceTypes $null -Role Admin -RegionKey $Region

With all these permissions in place, the service account still needs to take ownership of the dataset to finish rebinding the data source to the new gateway. You won't have to take ownership manually; the script below will do it for you on the dataset you specify.

Rebind dataset

Before proceeding further, make sure you have the Microsoft Power BI Cmdlets for PowerShell installed and that you are logged in to the Power BI service:

Connect-PowerBIServiceAccount
Get-PowerBIAccessToken

I don’t do Power BI administration on a daily basis and there was a learning curve for me to understand the inner workings. Here is the thought process that went into building this script.

  1. Get all the gateways the service account has access to
    • From the output, copy the new gateway ID and store it in a variable
  2. Using that variable, return the list of data sources on the new gateway
    • From the output, copy the ID of the data source the affected dataset should be mapped to and store it in a variable
  3. Return the list of workspaces the account has access to
    • From the output, copy the ID of the workspace that contains the affected dataset
  4. Using that variable, return the list of datasets in the specified workspace
    • From the output, copy the affected dataset's ID
  5. Using the variables from steps 3 and 4, transfer ownership of the specified dataset to the service account
  6. Using the variables from steps 1, 2, 3 and 4, bind the specified dataset in the specified workspace to the new gateway

$gateways = Invoke-PowerBIRestMethod -Url "gateways" -Method Get | ConvertFrom-Json
$gateways.value
Write-Host "Please copy the new Gateway ID from above output" -ForegroundColor Red
$newGWID = Read-Host "Please paste the new Gateway ID"

$GWdatasources = Invoke-PowerBIRestMethod -Url "gateways/$($newGWID)/datasources" -Method Get | ConvertFrom-Json
$GWdatasources.value
Write-Host "Please note down the Data Source ID used by the dataset that needs to be migrated from above output" -ForegroundColor Red
$datasourceObjectIds = Read-Host "Please paste the Data source ID"

$ws = Invoke-PowerBIRestMethod -Url 'groups' -Method Get | ConvertFrom-Json
$ws.value
Write-Host "Please note down the Workspace ID which has the dataset that needs to be migrated from above output" -ForegroundColor Red
$wsID = Read-Host "Please paste the Workspace ID"

$dataset = Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets" -Method Get | ConvertFrom-Json
$dataset.value
Write-Host "Please note down the dataset ID that needs to be migrated from above output" -ForegroundColor Red
$dsID = Read-Host "Please paste the dataset ID"

# The line below is not needed if the service account already has ownership of the dataset; it is safe to comment out
Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.TakeOver" -Method POST

$body = "{
  'gatewayObjectId': '$newGWID',
  'datasourceObjectIds': [
    '$datasourceObjectIds'
  ]
}"

try {
  # Bind the dataset in the workspace to the new gateway and data source
  Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.BindToGateway" -Body $body -Method POST
  Write-Host "Dataset updated"
}
catch {
  Write-Host "An error occurred: $_"
}

You can adjust this script to your needs; in some cases the gateway ID, new data source ID and workspace ID will stay the same and only the affected dataset ID will vary.

Using a Service Principal

In this method, I'm using a service principal to accomplish the same as above. One added advantage of this method is that the Power BI dataset can be set up to refresh without an actual user account, which is great from an automation point of view and avoids being tied to a specific user.

Here are the high-level steps,

Create SPN

The az ad app commands are part of the Azure CLI, not PowerShell cmdlets. You'll need to have the Azure CLI installed and run az login as well before running this.

# Sign in to Azure AD (AzureAD module), Az PowerShell, and the Azure CLI
Connect-AzureAD
Connect-AzAccount
az login

You can create an Azure AD application, which will act as the service principal, from the portal and grant it the 'Dataset.ReadWrite.All' API permission, or use the lines below to create it. I've detailed how to determine the API ID and Permission ID in this blog post here.

A new Azure AD group is also needed, and the Azure AD application has to be made a member of this group. The lines below will accomplish that; if you have an existing group in mind, you can use that too. I'll go over the reason for creating this group later in this section.

# Create the app registration and grant the Power BI 'Dataset.ReadWrite.All' delegated permission
$appname = Read-Host "Enter the Azure AD application's display name"
$ObjID = New-AzureADApplication -DisplayName $appname | Select ObjectId
Add-AzADAppPermission -ObjectId $ObjID.ObjectId -ApiId 00000009-0000-0000-c000-000000000000 -PermissionId 322b68b2-0804-416e-86a5-d772c567b6e6 -Type Scope
Start-Sleep -Seconds 60
az ad app permission admin-consent --id $ObjID.ObjectId
Get-AzureADApplication -Filter "DisplayName eq '$appname'" | fl

# Get (or create) the service principal for the app so it can be added to the group below
$app = Get-AzureADApplication -Filter "DisplayName eq '$appname'"
$spnToAdd = (Get-AzureADServicePrincipal -Filter "AppId eq '$($app.AppId)'").ObjectId
if (-not $spnToAdd) { $spnToAdd = (New-AzureADServicePrincipal -AppId $app.AppId).ObjectId }

# Create the Azure AD group and add the service principal as a member
$grpName = Read-Host "Enter a name for new Azure AD group"
$grpID = (New-AzureADGroup -DisplayName $grpName -MailEnabled $false -SecurityEnabled $true -MailNickName "NotSet").ObjectId
Get-AzureADGroup -ObjectId $grpID
Add-AzureADGroupMember -ObjectId $grpID -RefObjectId $spnToAdd
Get-AzureADGroupMember -ObjectId $grpID

The Get-AzureADApplication cmdlet will list the API permissions we applied. This can be verified in the ‘App registrations’ blade from the Azure AD portal too.

Create a new secret in this Azure AD application. You can also do this with PowerShell, as in the sketch below. The secret value is needed for authentication when running the script later in this section.
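
Here's a minimal sketch, assuming the $ObjID variable from the script above is still around; the key identifier and the 12-month lifetime are just examples:

# Create a client secret on the app registration; copy the Value right away
$secret = New-AzureADApplicationPasswordCredential -ObjectId $ObjID.ObjectId -CustomKeyIdentifier "PBI-API" -EndDate (Get-Date).AddMonths(12)
$secret.Value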

Remember to copy the secret value as it’ll be masked forever.

We can also double-check the group we created and its membership. I named the group 'PBI-API' in Azure AD.

For an Azure AD app to be able to access Power BI content and APIs, the following settings need to be enabled in the Power BI admin portal. This is where the Azure AD group comes into play.

Go to Tenant settings in the Power BI Admin portal, and scroll down to Developer settings

  • Enable the Allow service principals to use Power BI APIs
  • Enable the Allow service principals to create and use profiles

Create SPN profile

I noticed that in one instance the SPN approach worked without a service principal profile having been created. Profiles can be created using the Profiles REST API, and I've included the lines below to create a profile for the SPN.

$prof = Read-Host "Enter a name for SPN's profile"

$body = "{
    'displayName' : '$prof'
}"

Invoke-PowerBIRestMethod -Url 'https://api.powerbi.com/v1.0/myorg/profiles' -Body $body -Method POST

A service principal can also call GET Profiles REST API to get a list of its profiles.

Invoke-PowerBIRestMethod -Url 'profiles' -Method Get

Permissions

Next, the service principal needs permissions on the dataset. We can achieve this by granting permissions to the service principal on the workspace.

Note: Adding the Azure AD group that has the SPN as a member doesn't work; the SPN has to be granted access directly.
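
If you'd rather script that part, here's a minimal sketch using the Power BI REST API (the Groups - Add Group User call) through Invoke-PowerBIRestMethod, assuming you're signed in as someone with admin rights on the workspace; the SPN is identified by its object ID:

$wsID = Read-Host "Enter the Workspace ID"
$spnObjectId = Read-Host "Enter the service principal's object ID"

$body = "{
  'identifier': '$spnObjectId',
  'groupUserAccessRight': 'Member',
  'principalType': 'App'
}"

# Add the SPN to the workspace as a Member
Invoke-PowerBIRestMethod -Url "groups/$($wsID)/users" -Method POST -Body $body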

This next step is kind of where things get tricky.

What are we trying to achieve here?

  • Grant the service principal admin permissions on the new gateway
  • Grant the service principal user permissions on the gateway data source

The reason it is tricky is that I first tried granting the above permissions to the Azure AD group. It let me add the group, but the script later in this section didn't work as expected. Based on further research, I realized that the SPN needs to be granted this access directly instead of through the Azure AD group. Also, at the time of writing this post, granting these permissions to an SPN through the portal is not supported. Hence, we'll have to use the PowerShell cmdlets.

Before proceeding further, please connect to the Az account and the Power BI service using the cmdlets below.

Connect-AzAccount
Connect-PowerBIServiceAccount
Get-PowerBIAccessToken

The script below will add the permissions I mentioned above and display them at the end. One good thing about adding permissions to the gateway, data sources and workspaces is that it is a one-time deal.

# Pick the new gateway cluster and the SPN to be added as gateway admin
Get-DataGatewayCluster
$gw = Read-Host "Enter Gateway ID"
$spn = Read-Host "Enter App name to be added as gateway admin"
$spnToAdd = (Get-AzADServicePrincipal -DisplayName $spn).Id
Get-DataGatewayRegion
$Region = Read-Host "Enter region value where IsDefaultPowerBIRegion is set to true"
Add-DataGatewayClusterUser -GatewayClusterId $gw -PrincipalObjectId $spnToAdd -AllowedDataSourceTypes $null -Role Admin -RegionKey $Region
Get-DataGatewayCluster -GatewayClusterId $gw | Select -ExpandProperty Permissions | ft

# Grant the SPN read access on the gateway data source and confirm
Get-DataGatewayClusterDatasource -GatewayClusterId $gw
$gwDSID = Read-Host "Enter Gateway Cluster DatasourceId"
Add-DataGatewayClusterDatasourceUser -GatewayClusterId $gw -GatewayClusterDatasourceId $gwDSID -DatasourceUserAccessRight Read -Identifier $spnToAdd
Get-DataGatewayClusterDatasourceUser -GatewayClusterId $gw -GatewayClusterDatasourceId $gwDSID

With all the permissions for the SPN now in place, we are ready to take ownership of the affected dataset in the workspace and bind it to the new data source on the new gateway.

Rebind dataset

In this SPN method, instead of logging in with a username and password, you'll have to log in with the Application (client) ID and the secret.

$Tenant = Read-Host "Enter Azure AD Tenant ID"
Connect-PowerBIServiceAccount -Tenant $Tenant -ServicePrincipal -Credential (Get-Credential) #user = Application (client) ID | Password is the secret value we created earlier in this section
Get-PowerBIAccessToken

The script is pretty much the same as in the earlier section, but it runs in the SPN context.

$gateways = Invoke-PowerBIRestMethod -Url "gateways" -Method Get | ConvertFrom-Json
$gateways.value
Write-Host "Please copy the new Gateway ID from above output" -ForegroundColor Red
$newGWID = Read-Host "Please paste the new Gateway ID"

$GWdatasources = Invoke-PowerBIRestMethod -Url "gateways/$($newGWID)/datasources" -Method Get | ConvertFrom-Json
$GWdatasources.value
Write-Host "Please note down the Data Source ID used by the dataset that needs to be migrated from above output" -ForegroundColor Red
$datasourceObjectIds = Read-Host "Please paste the Data source ID"

$ws = Invoke-PowerBIRestMethod -Url 'groups' -Method Get | ConvertFrom-Json
$ws.value
Write-Host "Please note down the Workspace ID which has the dataset that needs to be migrated from above output" -ForegroundColor Red
$wsID = Read-Host "Please paste the Workspace ID"

$dataset = Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets" -Method Get | ConvertFrom-Json
$dataset.value
Write-Host "Please note down the dataset ID that needs to be migrated from above output" -ForegroundColor Red
$dsID = Read-Host "Please paste the dataset ID"

Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.TakeOver" -Method POST

$body = "{
  'gatewayObjectId': '$newGWID',
  'datasourceObjectIds': [
    '$datasourceObjectIds'
  ]
}"

try {
  # Bind the dataset in the workspace to the new gateway and data source
  Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/$($wsID)/datasets/$($dsID)/Default.BindToGateway" -Body $body -Method POST
  Write-Host "Dataset updated"
}
catch {
  Write-Host "An error occurred: $_"
}

Similar to the earlier section, you can adjust this script to your needs; in some cases the gateway ID, new data source ID and workspace ID will stay the same and only the affected dataset ID will vary.

Needless to say, you can test whether this was successful by doing a 'Refresh now' on the dataset.
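
To stay in PowerShell, here's a small sketch that triggers the refresh and checks the latest refresh entry through the REST API, reusing the $wsID and $dsID variables from the script above:

# Kick off a refresh of the dataset
Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets/$($dsID)/refreshes" -Method POST

# Check the most recent refresh entry (status should eventually show Completed)
(Invoke-PowerBIRestMethod -Url "groups/$($wsID)/datasets/$($dsID)/refreshes?`$top=1" -Method Get | ConvertFrom-Json).value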

Issues you may encounter and How to fix it

Issue: You may encounter the status codes below while running Invoke-PowerBIRestMethod

Response status code : 404 (Not Found)
Response status code : 400 (Bad Request)

Fix or workaround: Well.. if you've already browsed through community.powerbi.com, then you might have realized that you are not alone in dealing with these error codes. Usually this means you are requesting data from the Power BI REST API endpoints that doesn't exist, or that you (or the SPN making the request) don't have the necessary permissions on the resource. The best way to troubleshoot is to run these requests one at a time to determine where it is failing, or which resource you don't have permissions to.

Issue: Applied permissions don’t reflect in the portal

Fix or workaround: I noticed that some of the changes take time to reflect. Give it a few minutes before you go changing more things and lose track of everything you've changed in the process. If the permissions still don't show up after a while, use the PowerShell cmdlets to verify whether the permissions you set were actually applied.

I'll keep experimenting with other scenarios and will update this post with any issues I come across later on.

This was one of those really lengthy posts but hey..as long as there is a solution at the end..Hopefully..am I right?..😁🤷‍♂️

Thank you for stopping by.✌

Azure AD – Improve Authenticator Notifications with Additional Context and Number Matching

As I have covered several times before, disabling basic authentication is one of the best things you can do in your O365 tenant for security.

MFA helps protect users' accounts and prevents attacks. It is not perfect by any means, but it keeps improving. I'm a big fan of the Authenticator app and try not to use the SMS or voice call options. Whenever I get a chance, I advocate that the users I work with stick with the app. If your organization is yet to roll out MFA, it is time to take a hard look and make some drastic changes.

Microsoft, in their November 18 Azure AD Identity blog post, revealed two new features for the Authenticator app. IMO, all O365 tenants should strongly consider enabling the two features below.

  • Number matching in Microsoft Authenticator
  • Additional context in Microsoft Authenticator

Number Matching

When a user responds to an MFA challenge, they will see a number in the application or on the webpage that is challenging them, and they must enter this number in the Authenticator app to complete the process. This flow is already part of the passwordless authentication method.

Additional Context

The Authenticator will also display the name of the app requesting MFA and the user's sign-in location. The sign-in location is based on the user's public IP address and may not be accurate at times, since IP-based location tagging rarely points to the exact origin of the application's traffic, but it is usually close enough.

Application prompt on the webpage
Authenticator prompt

How to enable number matching with additional context in Azure AD

  • Open the Azure AD admin center (https://aad.portal.azure.com/)
  • Click on the Security tab –> Authentication methods
  • Select Microsoft Authenticator
  • Toggle ENABLE to Yes
  • Toggle TARGET to All users
    • Depending on how you decide to roll out this feature, you can target an Azure AD group instead by choosing Select users, selecting the group, and following the next steps
  • Click on the three dots and Configure
  • Set
    • Authentication mode = Any
    • Require number matching = Enabled
    • Show additional context in notifications = Enabled
  • Click Done
  • Click Save
Microsoft Authenticator settings

Configure settings for All Users

In the drop-downs for 'Require number matching' and 'Show additional context in notifications', there is a 'Microsoft Managed' option. This means the functionality will be enabled by default for all tenants once the feature is generally available. Currently it is in public preview.

Thank you for stopping by.✌