Alright, let’s talk about Azure Role-Based Access Control (RBAC)—the bouncer at the club, the gatekeeper of your cloud kingdom, the difference between “Oops, I deleted the production database” and “Phew, good thing I didn’t have permission for that.”
If you’re working with Microsoft Azure, RBAC is a must-know. It’s how you control who can do what in your cloud environment. Let’s break it down in a fun, easy-to-digest way.
What is Azure RBAC, and Why Should You Care?
Think of Azure RBAC like a high-tech office building with keycards. Not everyone should have access to every room, right? Your interns shouldn’t be able to access the CEO’s private office, and the janitor doesn’t need the nuclear launch codes.
RBAC works the same way in Azure:
You assign roles to users, groups, or applications instead of just giving them full access.
It’s based on the principle of least privilege, meaning people only get access to what they need—nothing more, nothing less.
It prevents chaos. Because let’s be real, one accidental click from an over-permissioned user can lead to disaster.
The Three Key Pieces of RBAC
Azure RBAC is built on three main pieces:
Roles: These define what someone can do. Examples:
Owner – The boss. Can do anything and everything.
Contributor – Can create and manage resources but can’t assign roles.
Reader – Can look, but not touch.
Custom Roles – If the built-in roles aren’t enough, you can create your own.
Scope: This defines where the role applies. It can be at:
Subscription level (the whole kingdom)
Resource group level (a city inside the kingdom)
Specific resources (a single castle or shop)
Assignments: This is the who gets what role part. Assign a user, group, or service principal to a role at a given scope, and boom—permissions granted.
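Putting the three pieces together, an assignment is a single line of PowerShell with the Az module. A minimal sketch; the user and resource group names here are made up:

```powershell
# Assign the built-in Reader role to a user, scoped to a single resource group.
# 'intern@contoso.com' and 'rg-coffee-shop' are placeholder values.
# (Run Connect-AzAccount first.)
New-AzRoleAssignment `
    -SignInName 'intern@contoso.com' `
    -RoleDefinitionName 'Reader' `
    -ResourceGroupName 'rg-coffee-shop'
```

To assign at a wider or narrower scope, swap -ResourceGroupName for -Scope with a full resource ID, e.g. -Scope '/subscriptions/&lt;subscription-id&gt;' for the whole subscription.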
Real-World Example: The Coffee Shop Analogy ☕
Imagine you’re running a coffee shop:
The Owner (you) can do everything—order supplies, hire staff, make coffee, or even shut down the store.
The Baristas (contributors) can make coffee and manage the store but can’t hire or fire anyone.
The Customers (readers) can look at the menu, enjoy their coffee, but they’re not allowed behind the counter.
That’s Azure RBAC in action. Everyone gets access to what they need, but no one is accidentally pressing the “shutdown entire store” button.
Common RBAC Mistakes (And How to Avoid Them)
Giving Everyone Owner or Contributor Roles – That’s like handing out master keys to your entire office. Keep permissions minimal!
Not Using Groups – Assigning roles individually? Big mistake. Use Azure AD groups to manage permissions efficiently.
Ignoring Scope – Always assign roles at the lowest necessary level to avoid over-permissioning.
Forgetting to Review Roles Regularly – People leave jobs, projects change, and roles should be updated accordingly.
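That last point is easy to script. A quick audit sketch with the Az module; the resource group name is a placeholder:

```powershell
# List who holds which role on a resource group -- review this periodically.
# 'rg-coffee-shop' is a placeholder value.
Get-AzRoleAssignment -ResourceGroupName 'rg-coffee-shop' |
    Select-Object DisplayName, RoleDefinitionName, Scope |
    Sort-Object RoleDefinitionName
```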
Final Thoughts: Lock It Down, But Keep It Practical
Azure RBAC is all about control, security, and making sure the right people have the right access. It’s not just an IT thing—it’s about keeping your cloud environment safe and sane.
So next time you’re setting up roles in Azure, ask yourself:
Does this person really need this level of access?
Could I use a lower scope?
Am I following best practices?
Get it right, and your cloud stays secure. Get it wrong, and… well, let’s just say you don’t want to be the person who accidentally gives the intern the power to delete the company’s entire infrastructure.
If you work with firewalls, proxies, or any system that restricts traffic based on IP addresses, you’re likely familiar with the challenges of maintaining access to dynamic cloud infrastructure. Microsoft understands this, which is why they publish the Azure IP Ranges and Service Tags – Public Cloud dataset — a JSON file containing up-to-date IP address ranges used by Azure services.
Why Microsoft Publishes Azure IP Ranges
Microsoft Azure’s cloud infrastructure spans global data centers, with thousands of services running behind constantly shifting sets of IP addresses. To help organizations:
Configure firewalls and security appliances
Whitelist Azure service IPs
Meet compliance or policy needs
Route traffic appropriately
…Microsoft provides this public JSON file that includes IP ranges tied to Service Tags like Storage, Sql, AppService, and many others, broken down by region.
If you are using Azure Firewall, you can use these service tags directly in your firewall rules; see Microsoft's Azure Firewall documentation for more information. If you are using another firewall, check whether it supports service tags directly, e.g., Palo Alto Networks and External Dynamic Lists (EDL).
If your firewall doesn't support service tags, you need to use the IP address prefixes directly. This is not ideal, since you need to update your firewall rules every time IP address prefixes are added or removed. This is why automating the process is important.
How Often Is the Data Updated?
Microsoft typically updates the Azure IP Ranges and Service Tags – Public Cloud file weekly, usually every Monday. Updates can reflect:
New IPs added for expanding infrastructure
Old ranges removed or reallocated
Changes to service or region mappings
Each release includes a "changeNumber" field and a "version" field to help you detect updates. Automation is key here — hence the script!
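One way to script this without scraping the download page is the Get-AzNetworkServiceTag cmdlet from the Az.Network module, which returns the same service tag data. A sketch; the region and tag name are example values:

```powershell
# Fetch the current service tag information. The -Location parameter selects
# the nearest publishing endpoint; the result still covers all regions.
$tags = Get-AzNetworkServiceTag -Location eastus

# The change number helps detect whether a new release has been published.
$tags.ChangeNumber

# Pull the address prefixes for one tag, e.g. SQL in East US.
($tags.Values | Where-Object { $_.Name -eq 'Sql.EastUS' }).Properties.AddressPrefixes
```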
Changes, Changes…
Azure public IP addresses can change weekly! What does that mean for us?
Every week, there may be new IP addresses added to the list. Others may be removed if they’re no longer in use by Azure.
Why does that matter?
Let’s say an on-prem application needs access to an Azure SQL database, and you previously added its IPs to your firewall allow-list. Then Azure updates its IP ranges, and the database starts responding from an address that’s not in your allow-list. Boom — access denied. Not because you intended to block it, but because you didn’t know about the change.
It’s not just about adding the new IPs. You also need to handle removals to prevent your allow-lists from becoming bloated and insecure.
This isn’t a new problem in networking. But in Azure, the pace of change is faster — often weekly — and automation becomes essential.
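A minimal diff between two weekly snapshots of the JSON file might look like the sketch below; the file names and the tag being tracked are my own choices:

```powershell
# Load the addressPrefixes for one service tag from a snapshot of the file.
$getPrefixes = {
    param($path)
    $json = Get-Content $path -Raw | ConvertFrom-Json
    ($json.values | Where-Object { $_.name -eq 'Sql.EastUS' }).properties.addressPrefixes
}
$old = & $getPrefixes 'ServiceTags_Public_old.json'
$new = & $getPrefixes 'ServiceTags_Public_new.json'

# '=>' means a prefix is new this week; '<=' means it was removed.
Compare-Object -ReferenceObject $old -DifferenceObject $new |
    ForEach-Object {
        $action = if ($_.SideIndicator -eq '=>') { 'ADD' } else { 'REMOVE' }
        '{0,-6} {1}' -f $action, $_.InputObject
    }
```

The ADD lines go into your firewall rules; the REMOVE lines are just as important, since stale prefixes bloat the allow-list.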
How Is the JSON File Structured?
The JSON file is well-organized and structured to support automation. Here’s what it generally looks like:
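A trimmed, illustrative example; the prefix values here are made up, and the real file contains thousands of entries:

```json
{
  "changeNumber": 123,
  "cloud": "Public",
  "values": [
    {
      "name": "Sql.EastUS",
      "id": "Sql.EastUS",
      "properties": {
        "changeNumber": 12,
        "region": "eastus",
        "platform": "Azure",
        "systemService": "AzureSQL",
        "addressPrefixes": [
          "20.42.65.64/29",
          "40.121.158.0/26"
        ]
      }
    }
  ]
}
```

Each entry under values is a service tag, and the properties.addressPrefixes array is what feeds firewall rules.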
Azure Key Vault is used to safely manage secrets, certificates, and cryptographic keys used by cloud and on-premises applications. After the secrets are stored in Azure Key Vault, we can assign access policies to user accounts or SPNs to retrieve and use them.
In this post, I will cover,
How to create Azure Key Vault
Create and update secrets in Azure Key Vault
Create a new Azure AD application and SPN
Create a client Secret
Assign a policy in Azure Key Vault to allow access to the SPN we create
Store the Azure AD application ID and client secret in the SecretStore vault
Retrieve Secret from Azure Key vault using credentials stored in SecretStore vault
I’ll go through the steps in both the portal and via PowerShell.
Before proceeding further, make sure you are connected to Azure AD and Azure via PowerShell. You may or may not use the same global admin account to connect to both; either way, you can use the cmdlets below and adjust them where necessary.
Leave the other options default and click Review + create
The other options in the creation steps,
Access policy = I’ll go through it later in this post
Networking = All networks to make it publicly accessible
Tags = As necessary
Below are my settings,
Create a key vault
Create SPN in Azure AD
In this step, we’re creating a service principal (SP) in Azure AD. We will assign permissions for this SP to retrieve secrets from the Azure Key Vault in a later step.
In Azure AD, the application registration is the template used to create the SP, and the SP is the object that gets authenticated and authorized. The application and SP share the same Application ID but have different Object IDs.
To create a new Azure AD application,
$appname = Read-Host "Enter a name for the Azure AD application"
New-AzureADApplication -DisplayName $appname
To create a service principal,
$appname = Read-Host "Enter name of Azure AD application"
$AppId = (Get-AzureADApplication -Filter "DisplayName eq '$appname'").AppId
New-AzureADServicePrincipal -AccountEnabled $true -AppId $AppId -DisplayName $appname
Create Client Secret
Next, we create a new client secret using the New-AzureADApplicationPasswordCredential cmdlet,
$appname = Read-Host "Enter Azure AD application name to determine Object ID"
$appObjID = (Get-AzureADApplication -Filter "DisplayName eq '$appname'").Objectid
$KeyId = Read-Host "Enter value for secret identifier"
New-AzureADApplicationPasswordCredential -ObjectId $appObjID -CustomKeyIdentifier $KeyId
Copy the Value in the output to a notepad. This value will not be available to copy later.
Assign Permissions
Using PowerShell
We can assign the necessary permissions to the Azure AD application we created in the step above, using the Set-AzKeyVaultAccessPolicy cmdlet,
$appname = Read-Host "Enter Azure AD application name to determine Object ID"
$Appid = (Get-AzureADApplication -Filter "DisplayName eq '$appname'").AppId
$kvName = Read-Host "Enter a name for Key Vault"
Set-AzKeyVaultAccessPolicy -VaultName $kvName -ServicePrincipalName $Appid -PermissionsToSecrets list,get
In the left navigation menu, click on Access policies
Select Permission model as Vault access policy
Click +Add Access Policy
In the Add access policy window
For Secret permissions, select Get and List
For Select principal, select the SPN we created earlier
Click Add
Click Save to save policy
Manage Secrets in Key Vault
Applications, scripts, or users can create, update, delete, and retrieve secrets if they have the necessary policy assigned to them.
Creating/Updating Secrets
To create a new secret, we can use the Set-AzKeyVaultSecret cmdlet,
$Secret = ConvertTo-SecureString -String 'Password' -AsPlainText -Force
$kvName = Read-Host "Enter a name for Key Vault"
$SecName = Read-Host "Enter a name for secret"
Set-AzKeyVaultSecret -VaultName $kvName -Name $SecName -SecretValue $Secret
The secret can be updated to a new value using the same Set-AzKeyVaultSecret cmdlet; optionally, we can also set an expiry date, a not-before date, and tags,
$Secret = ConvertTo-SecureString -String 'Password' -AsPlainText -Force
$kvName = Read-Host "Enter a name for Key Vault"
$SecName = Read-Host "Enter a name for secret"
$Expires = (Get-Date).AddYears(2).ToUniversalTime()
$NBF =(Get-Date).ToUniversalTime()
$Tags = @{ 'Department' = 'IT'; 'Environment' = 'Production'}
Set-AzKeyVaultSecret -VaultName $kvName -Name $SecName -SecretValue $Secret -Expires $Expires -NotBefore $NBF -Tags $Tags
Retrieving Secrets
To retrieve the current version of a secret, we use the Get-AzKeyVaultSecret cmdlet,
$kvName = Read-Host "Enter a name for Key Vault"
$SecName = Read-Host "Enter a name for secret"
$securetext = (Get-AzKeyVaultSecret -VaultName $kvName -Name $SecName).SecretValue
This will assign the stored secret to the $securetext variable as a SecureString. We can now pass this to any other cmdlets that require a SecureString.
As I’ve already covered the Microsoft.PowerShell.SecretManagement and Microsoft.PowerShell.SecretStore PS modules in an earlier post, I’ll build on that and store the client secret we created in the local vault. This way, I don’t have to save the client secret in the code as plaintext. To do this, we store the Application ID and client secret as a PSCredential object in the store,
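A sketch of storing it, assuming a SecretStore vault is already registered; AzAppCred is just my name for the secret:

```powershell
# Prompts for a credential: enter the Application (client) ID as the user name
# and the client secret Value as the password.
Set-Secret -Name AzAppCred -Secret (Get-Credential)
```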
In the Windows PowerShell Credential request window, for User Name input the Application (client) ID of the Azure AD application and for password input the Client Secret value we copied into a notepad earlier.
I’ve also created another secret as string in the local vault with my tenant ID value.
Putting this all together, we can use these below lines in PowerShell automation scripts,
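Assuming the secret names I used in the local vault (AzAppCred for the credential, TenantId for the tenant ID string), the sign-in portion might look like:

```powershell
# Pull the SPN credential and tenant ID from the local SecretStore vault,
# then sign in to Azure as the service principal -- no plaintext secrets in code.
$cred     = Get-Secret -Name AzAppCred
$tenantId = Get-Secret -Name TenantId -AsPlainText
Connect-AzAccount -ServicePrincipal -Credential $cred -Tenant $tenantId
```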
To retrieve the secure string stored in the Azure Key Vault, I’m using the lines below. For demo purposes, I’m including -AsPlainText with the Get-AzKeyVaultSecret cmdlet, but as I mentioned earlier, we can also store the secure string in a variable and pass it on to other cmdlets.
$kvName = Read-Host "Enter a name for Key Vault"
$SecName = Read-Host "Enter a name for secret"
Get-AzKeyVaultSecret -VaultName $kvName -Name $SecName -AsPlainText
#or
$securetext = (Get-AzKeyVaultSecret -VaultName $kvName -Name $SecName).SecretValue
I know this was a lengthy post, and it may have gotten a little confusing right at the end with too many things named vault 🤷‍♂️
Hope this helped you figure out how to include Azure Key Vault in your PowerShell automations.
If you aren’t familiar with Azure AD B2C, it is a customer identity and access management (CIAM) solution, and a separate service from Azure Active Directory (Azure AD). It is built on the same technology as Azure AD but serves a different purpose: it lets businesses build customer-facing applications that anyone can sign up to use, with no restrictions on account type. Azure AD B2C uses standards-based authentication protocols including OpenID Connect, OAuth 2.0, and SAML.
In an earlier post, I detailed steps on how to configure ServiceNow with Azure AD SSO. In this post, I will go through steps on how to integrate Azure AD B2C with ServiceNow.
Below is a diagram showing the high-level implementation steps for this integration,
OpenID Connect (OIDC) is an identity layer built on top of the OAuth 2.0 protocol, which provides a modern and intuitive single sign-on (SSO) experience. ServiceNow supports OIDC to authenticate users against Azure AD B2C.
I will not cover the Azure AD B2C tenant creation steps in this post.
Create new user flow
A user flow lets us determine how users interact with our application when they do things like sign-in, sign-up, edit a profile, or reset a password.
Sign in to the Azure portal
Make sure you’re using the directory that contains your Azure AD B2C tenant. Select the Directories + subscriptions icon in the portal toolbar
On the Portal settings | Directories + subscriptions page, find your Azure AD B2C directory in the Directory name list, and then select Switch
In the Azure portal, search for and select Azure AD B2C
Under Policies, select User flows, and then select New user flow
On the Create a user flow page, select the Sign up and sign in user flow
Under version, select Recommended, and then select Create
Enter a Name for the user flow. For example, su_si-1
For Identity providers, select Email signup
Under User attributes and token claims, choose the claims and attributes to collect and send from the user during sign-up. Select Show more, choose your attributes and claims, and click OK. The screenshot below shows the attributes I’m collecting, but the choice is up to you. These attributes can be modified in the user flow at any time
Click Create to add the user flow. A prefix of B2C_1_ is automatically prefixed to the name
Create App Registration
Stay logged into the Azure portal
Make sure you are in the B2C directory
In the left navigation menu, under Manage, Click App registrations, and then select New registration
Enter a Name for the application. For example, ServiceNow
Under Supported account types, select Accounts in any identity provider or organizational directory (for authenticating users with user flows)
Under Redirect URI, select Web then enter your ServiceNow instance with /navpage.do in the URL text box
Under Permissions, select the Grant admin consent to openid and offline_access permissions check box
Click Register
Create a client secret
The client secret is also known as an application password. The secret will be used by ServiceNow to exchange an authorization code for an access token
In the left menu, under Manage, select Certificates & secrets
Click New client secret
Enter a description for the client secret in the Description box. For example, SnowSecret
Under Expires, select a duration for which the secret is valid, and then select Add
Note down the secret’s Value for use in ServiceNow. This value is never displayed again after you leave this page
Information needed to configure ServiceNow instance
Click on the Overview, copy the Application (client) ID
Next Click Endpoints
Copy the value in Azure AD B2C OpenID Connect metadata document
In that URL, replace the policy placeholder with the user flow name we created earlier, e.g. B2C_1_su_si-1. Browse to the URL in a web browser to confirm you have the right URL
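The metadata URL generally follows the pattern below; the tenant name is a placeholder, and the exact form shown on the Endpoints blade is authoritative:

```
https://<tenant>.b2clogin.com/<tenant>.onmicrosoft.com/<policy-name>/v2.0/.well-known/openid-configuration
```

With the user flow above, &lt;policy-name&gt; would be B2C_1_su_si-1.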
You should have these 3 values,
Application (client) ID
Client Secret Value
OIDC well-known endpoint
Configure ServiceNow Instance
Hopefully, you already have SSO enabled in your ServiceNow instance. If not, please refer to this earlier post of mine
Search for multi-provider sso and click Properties
Enable multiple provider SSO
You’ll be asked to setup a recovery account
Under Multi-Provider SSO, click Identity Providers
Click New
Click OpenID Connect
In the Import OpenID Connect Well Known Configuration window, provide following information
Name = Name of the IdP you wish. Example, B2C
Client ID = Application (client) ID from Azure B2C application
Client Secret = Client Secret Value we created earlier in the application
Well Known Configuration URL = URL we constructed earlier with the policy name
Click Import
Make sure the new IdP is marked Active and Show as Login option is checked
Click on the OIDC Entity tab and click to open the OIDC Entity
Click on OAuth Entity Scopes, double-click on OAuth scope and replace openid with the below value
Use your Application (client) ID from B2C app registration
This OAuth Scope value is required to generate an access token; without it, ServiceNow errors out with a missing-parameter error. I only realized this later: I initially left the scope at openid, and searching on the error message led me to the fix.
Click Update to save changes
Click on OIDC Provider Configuration
Click on OIDC provider value
Update the User Claim to emails
Click Update
To keep things simple, I’m not enabling the Automatic user provisioning option
You can choose to enable automatic user provisioning during user login. When automatic user provisioning is enabled, a user record is automatically created in the ServiceNow instance if that user record does not exist.
Back in the Identity provider window, Click Update to save the OIDC Identity Provider values
Navigate to the login page of the instance to verify that IdP appears as a login option
Create a test user in ServiceNow and login with the credentials to test if the IdP configuration works
Optionally you can browse to the login page with the URL in following format,
To determine the sys_id, open the OIDC Identity provider we created, right-click on the grey bar and click Copy sys_id
Replace this sys_id in the URL below
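Assuming ServiceNow’s standard multi-provider SSO login URL format, it looks like this; the instance name and sys_id are placeholders:

```
https://<instance>.service-now.com/login_with_sso.do?glide_sso_id=<sys_id>
```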
This URL will take you directly to the sign-in page
Most organizations have Enterprise Agreement (EA) accounts for Azure billing. Microsoft offers the Cost Management App, which can be used to view and analyze Azure costs in Power BI, but it only works with EA accounts.
You might, however, be managing Azure billing under a Customer Agreement. Microsoft has updated the Azure Cost Management connector in Power BI to support Customer Agreement accounts. The Azure Cost Management connector for Power BI Desktop can be used to build powerful, customized visualizations and reports to help us understand Azure spending.
Azure Cost Management allows 3 kinds of connections:
Customer Agreement: Common for small business or individual accounts
Enterprise Agreement: Accounts used by big organizations where payment goes through via purchase orders and such
Billing Profile: This is sort of like a subset of a Customer Agreement. It lets us organize costs using rules such as cost center, department, etc
To proceed further, make sure you have the Power BI desktop App downloaded and installed on your machine.
Determine Billing Information in Azure
To connect to a billing account, we need to retrieve the Billing account ID from Azure portal:
In the Azure portal, search for Cost Management + Billing
Select Billing profile
In the left navigation menu, under Settings in the menu, select Properties
Make sure your account has at least the Billing account reader role assigned on this billing account
This can be determined by clicking on Billing scopes in the left navigation menu or in the Properties tab
Under Billing profile, copy ID
Billing Account ID
Connect using Azure Cost Management in Power BI
To use the Azure Cost Management connector in Power BI Desktop:
Launch Power BI Desktop
Click Get data from the splash page or from the Home ribbon
Click Azure from the list of data categories
Select Azure Cost Management
Connect Azure Cost Management
Under Choose Scope,
To connect to a Billing Account
Select Manually Input Scope and input the connection string in the format below, with the {billingAccountId} we determined in the earlier section
To connect to a Billing Profile
Select Manually Input Scope and input the connection string in the format below; the {billingAccountId} and {billingProfileId} can be found on the same Properties tab as in the earlier section
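For reference, I believe the scope strings follow the Microsoft.Billing resource ID format, along these lines:

```
/providers/Microsoft.Billing/billingAccounts/{billingAccountId}
/providers/Microsoft.Billing/billingAccounts/{billingAccountId}/billingProfiles/{billingProfileId}
```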
A Navigator window shows all the available data tables
Select a table to see a preview dialog
One or more tables can be selected by checking the boxes beside their names and then clicking Load
For the report I have in mind, I only need the Usage details table and I’m selecting it to be loaded
Available Tables
Balance summary: Summary of the balance for the current billing month. EA only.
Billing events: Event log of new invoices, credit purchases, etc. Microsoft Customer Agreement only.
Budgets: Budget details to view actual costs or usage against existing budget targets.
Charges: A month-level summary of Azure usage, Marketplace charges, and charges billed separately. Microsoft Customer Agreement only.
Credit lots: Azure credit lot purchase details for the provided billing profile. Microsoft Customer Agreement only.
Pricesheets: Applicable meter rates for the provided billing profile or EA enrollment.
RI charges: Charges associated with Reserved Instances over the last 24 months. This table is being deprecated; use RI transactions instead.
RI recommendations (shared): Reserved Instance purchase recommendations based on all subscription usage trends for the last 30 days.
RI recommendations (single): Reserved Instance purchase recommendations based on single subscription usage trends for the last 30 days.
RI transactions: List of transactions for Reserved Instances at billing account scope.
RI usage details: Consumption details for existing Reserved Instances over the last month.
RI usage summary: Daily Azure reservation usage percentage.
Usage details: A breakdown of consumed quantities and estimated charges for the given billing profile or EA enrollment.
Usage details amortized: A breakdown of consumed quantities and estimated amortized charges for the given billing profile or EA enrollment.
Data available through the connector
When we select Load, the data is loaded into Power BI Desktop
Depending on the tables you choose, you may be asked to authenticate
When the data we selected is loaded, the data tables and fields are shown in the Fields pane
Loaded fields
I built this visualization below using some of these fields,
Visualization showing Azure cost by ResourceGroup Name, Date, Meter and Meter sub-category
The data is there; you’re only limited by the amount of time you have to spend in Power BI, and your imagination. 😉
I’m a big fan of Tableau and I love creating visualizations. Now I’ve started using Power BI more and more with Azure related stuff. Plus Power BI Pro comes bundled with Office 365 E5.
Hope this post helped you in setting up your Azure cost reports with Power BI.