Unexplainable behaviors with DefaultAzureCredential()

Long story short (2 days later)

While implementing an Azure Function designed to fetch secrets from Azure KeyVault, I ran into a funny and odd issue. I am not able to explain why or what is going on, but I have tried every trick a Google search can conjure, at least until page 30 of the search results. It was by coincidence that I came across some of the parameters in the DefaultAzureCredentialOptions class that got me going, at least locally.

The idea, as far as I have understood it, is that whenever you invoke the Azure.Identity.DefaultAzureCredential class, it provides a flow for attempting authentication using one of the following credentials, in listed order:

  • EnvironmentCredential
  • ManagedIdentityCredential
  • SharedTokenCacheCredential
  • VisualStudioCredential
  • VisualStudioCodeCredential
  • AzureCliCredential
  • AzurePowerShellCredential
  • InteractiveBrowserCredential (excluded by default)

I suspect that since I have deployed my Azure Function with the Managed Identity setting set to a System Assigned identity, like this:

System Assigned Identity

AND the fact that ManagedIdentityCredential comes before VisualStudioCredential in the authentication flow, it fails, since it is unable to authenticate the managed identity locally. That is the main principle of the design: none other than the service itself can assume the identity of the service.

See more detail here: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview
Snip

  • System-assigned. Some Azure resources, such as virtual machines, allow you to enable a managed identity directly on the resource. When you enable a system-assigned managed identity:
    • A service principal of a special type is created in Azure AD for the identity. The service principal is tied to the lifecycle of that Azure resource. When the Azure resource is deleted, Azure automatically deletes the service principal for you.
    • By design, only that Azure resource can use this identity to request tokens from Azure AD.
    • You authorize the managed identity to have access to one or more services.
    • The name of the system-assigned service principal is always the same as the name of the Azure resource it is created for. For a deployment slot, the name of its system-assigned identity is <app-name>/slots/<slot-name>.

Love rears its ugly head

Having assigned the proper permissions in the Azure KeyVault, you are able to connect to said KeyVault using your Visual Studio credentials. A code example of that could look like this:

// Requires: using Azure.Identity; using Azure.Security.KeyVault.Secrets;
public static string GetSecret(string keyvault, string secretName)
{
    var kvUri = $"https://{keyvault}.vault.azure.net";

    var creds = new DefaultAzureCredential();

    var client = new SecretClient(new Uri(kvUri), creds);

    // GetSecretAsync returns Task<Response<KeyVaultSecret>>;
    // .Value.Value is the actual secret string.
    var secret = client.GetSecretAsync(secretName).Result.Value.Value;

    return secret;
}

(link to NuGet: NuGet Gallery | Azure.Security.KeyVault.Secrets 4.5.0)
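For reference, calling it could look like this (the vault and secret names are hypothetical):

var connectionString = GetSecret("my-keyvault", "MyConnectionString");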

Usually this works, and I have no other explanation than that deploying the solution to a live running App Service is what breaks this otherwise elegant piece of code. The above listed code does not work for me.

Workaround

You can instantiate the DefaultAzureCredential class using a constructor that takes a DefaultAzureCredentialOptions object as a parameter, and this object has a great number of attributes of interest. You can actively remove items from the authentication flow, and you can specify the tenant id if you have access to multiple tenants.

The code that resolved the issue locally looks something like this (I can probably just do without the ManagedIdentity; I will test):

public static string GetSecret(string keyvault, string secretName)
{
    var kvUri = $"https://{keyvault}.vault.azure.net";

    // Trim the authentication flow down to the credentials we actually want,
    // and pin the tenant in case we have access to more than one.
    var creds = new DefaultAzureCredential(
        new DefaultAzureCredentialOptions()
        {
            TenantId = "<INSERT TENANT ID HERE>",
            ExcludeAzureCliCredential = true,
            ExcludeAzurePowerShellCredential = true,
            ExcludeSharedTokenCacheCredential = true,
            ExcludeVisualStudioCodeCredential = true,
            ExcludeEnvironmentCredential = true,
            ExcludeManagedIdentityCredential = true
        });

    var client = new SecretClient(new Uri(kvUri), creds);
    var secret = client.GetSecretAsync(secretName).Result.Value.Value;

    return secret;
}

I am not sure this will work when I deploy the solution, but I will probably add a test on the environment (local debug or running prod), as sketched below.
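A minimal sketch of that idea, assuming the WEBSITE_INSTANCE_ID environment variable (which Azure App Service sets on running instances) is a reliable enough signal for telling a deployed instance apart from a local debug session:

public static string GetSecret(string keyvault, string secretName)
{
    var kvUri = $"https://{keyvault}.vault.azure.net";

    // WEBSITE_INSTANCE_ID is set by Azure App Service / Azure Functions at runtime,
    // so its presence means "deployed" and its absence means "local debug".
    var runningInAzure = !string.IsNullOrEmpty(
        Environment.GetEnvironmentVariable("WEBSITE_INSTANCE_ID"));

    var creds = runningInAzure
        ? new DefaultAzureCredential()   // let ManagedIdentityCredential do its thing in Azure
        : new DefaultAzureCredential(
            new DefaultAzureCredentialOptions()
            {
                TenantId = "<INSERT TENANT ID HERE>",
                ExcludeManagedIdentityCredential = true  // skip straight to VisualStudioCredential locally
            });

    var client = new SecretClient(new Uri(kvUri), creds);
    return client.GetSecretAsync(secretName).Result.Value.Value;
}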

HTH


New Microsoft certifications passed

This summer my family and I spent almost three weeks driving to Germany and into Italy by car. Not just any car, I might add. The old Volvo clocked in 4,000 km and handled it like a charm 🥰 even when it was super packed for the final stage of the journey from the duty-free shop just across the border.

Main cities visited: Nürnberg, Venice, Bologna, Brisighella, Comacchio… and obviously I had to make a stop at the Mutti field of tomatoes, as Mutti is a client of ours 🙂

Assorted Pictures from the vacation

Just before the vacation began, I got notice that I had passed the two beta exams I took in the middle of May. With beta exams you do not get the passing score immediately; you have to wait until the program has collected enough data on the individual questions/answers to release the final version of the test.

Microsoft Power Automate RPA Developer (PL-500)

First off, I passed the Microsoft Power Automate RPA Developer (PL-500) exam, which was quite a stretch for me, and I had even raised some concerns about the scope of the test beforehand, in the below announcement on LinkedIn:

Candidates for this exam automate time-consuming and repetitive tasks by using Microsoft Power Automate (formerly known as Flow). They review solution requirements, create process documentation, and design, develop, troubleshoot, and evaluate solutions.

Candidates work with business stakeholders to improve and automate business workflows. They collaborate with administrators to deploy solutions to production environments, and they support solutions.

Additionally, candidates should have experience with JSON, cloud flows and desktop flows, integrating solutions with REST and SOAP services, analyzing data by using Microsoft Excel, VBScript, Visual Basic for Applications (VBA), HTML, JavaScript, one or more programming languages, and the Microsoft Power Platform suite of tools (AI Builder, Power Apps, Dataverse, and Power Virtual Agents).

 Important

Passing score: 700. Learn more about exam scores. (which is exactly what I scored 😁)

Part of the requirements for: Microsoft Certified: Power Automate RPA Developer Associate

The detailed skills are outlined here: Exam PL-500: Microsoft Power Automate RPA Developer – Skills Measured

Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI (DP-500)

Candidates for this exam should have advanced Power BI skills, including managing data repositories and data processing in the cloud and on-premises, along with using Power Query and Data Analysis Expressions (DAX). They should also be proficient in consuming data from Azure Synapse Analytics and should have experience querying relational databases, analyzing data by using Transact-SQL (T-SQL), and visualizing data.

 Important

Passing score: 700. Learn more about exam scores.

 Tip

Part of the requirements for: Microsoft Certified: Azure Enterprise Data Analyst Associate

Active certifications

New/Renewed 2022

Other Active Certifications


Microsoft Connected Learning Experience program

Excited to be a part of the panel for the Microsoft Connected Learning Experience program! Join us for the program starting April 22nd as we talk about Azure Fundamental exams and give you a personalized and prescriptive learning experience for exam readiness.


Study Smart…Not Hard! and pass MS Azure Fundamental exams.

Register here: https://clx.cloudevents.ai/

#avd #cloudlearning #microsoftcertified #microsoftazure #microsoft #CLX #MSAzureCLX #ConnectedLearningExperience #CloudLabs #SpektraSystems


Power BI Community Tour

Blog post originally in Danish 🙂

In a little under a month (25/4-27/4) the Power BI bus rolls out and makes its first stop on the Power BI Community Tour 2022. More precisely, we start in Lyngby, drive on to Odense the day after, and wrap up the Tour in Aarhus. So depending on where you are in the country, there will be plenty of opportunity to participate.

At each stop we will serve up introductions and best practices within the different elements of Power BI. With sessions on Introduction to Power BI, Data Loading & Mashup, Data Modeling & DAX, Data Visualization, and Distribution & Sharing, all corners will be covered.

These sessions are aimed at beginners or only slightly practiced users of Power BI, and they can give you a more confident start to your journey with Power BI.

  • Have you used Power BI, but still need to know how it all fits together?
  • Have you imported some data into Power BI, but still need to know how to organize your tables?
  • Have you built a Power BI report, but still need to know how best to visualize the data?
  • Have you developed some reports, but still need to know how to share them with your colleagues?
  • Have you never used Power BI, but would like to know more about why it is one of the most popular reporting and self-service BI tools?

If you answer yes to one or more of these questions, the Power BI Community Tour is for you. If not, please pass this information on to relevant colleagues!

Sign up here: https://lnkd.in/eVzcBMvp

A big thank you to JDM, Kapacity, Microsoft, and Seges for providing venues and catering.


Setting up Azure Analysis Services database(s) refresh w/ Azure Automation

There are a lot of technical ways to achieve an updated database (called a model by many) in Azure Analysis Services; one of them is Azure Automation, which, among other things, allows you to orchestrate processes in Azure.

Automation capabilities - src: https://docs.microsoft.com/en-us/azure/automation/automation-intro

One of the components of Azure Automation is the concept of a Runbook. A Runbook contains some sort of script, e.g. PowerShell or a graphical representation, which can be scheduled or activated by a Webhook. A webhook is an HTTP endpoint, which means you can activate the runbook from almost any service, application and/or device. In fact, if you can do a POST to the HTTP endpoint you are good to go.

So really, this comes down to how you create the runbook, because once it is created, you can think up a gazillion scenarios to invoke the script. It could be Power Apps, Power Automate, Azure Data Factory or a completely different process where you need to kick off an automation script.

To complete this guide, you will need the following services created:

  • An Azure Automation Account
  • An Azure Analysis Services server (with a model to refresh)

For the Azure Analysis Services model we can simply use the sample data option provided when creating a new model in Azure Analysis Services. This allows you to select sample data, which creates an Adventure Works sample model.

Create new Model
Choose Sample Data Model
Adventure Works Model

Now that we have our Automation Account and model ready, we can go ahead and stitch everything together.

In order for us to run this unattended, we will need an App Registration in our Azure Active Directory (make sure it's in the same tenant). Microsoft has a guide here. You will need to record the Application ID (Client ID) and also the Secret you have created. With this information, our first step is to create our Automation Account Credentials in the Shared Resources section of the Automation Account.

Give the credentials a meaningful name (1), maybe even be bold and name them the same as you did when registering the App 😉. Use the Application ID (Client ID) as the user name (2) and finally the Secret as the password (3); repeat for confirmation. Once these credentials have been set up, we can reference them from our Runbook, which is the next step.

The next step is to generate the PowerShell script that we will schedule or call from the outside via a webhook.
This is done by creating a new Runbook in the Automation Account.

Find the menu item Runbooks

Create a new Runbook: select a meaningful name and select the Runbook Type, which in our case is PowerShell. Lastly, provide the correct version of PowerShell you will be using, and make sure the correct modules are loaded; see how to manage the modules here.

And now to the actual PowerShell code.
We will begin by defining the parameters for the Runbook: DatabaseName, AnalysisServer and RefreshType. All three combined make a good starting point for a dynamic way to expose the option to refresh a model in Azure Analysis Services. The code looks like this:

param
(
    [Parameter (Mandatory = $false)]
    [String] $DatabaseName,
    [Parameter (Mandatory = $false)]
    [String] $AnalysisServer,
    [Parameter (Mandatory = $false)]
    [String] $RefreshType
)

This way, we can tell the Runbook from the outside which database on which server to refresh.
Then we assign the tenant id to a variable (this could arguably be set from a Runbook variable or parameter), and then we assign the credentials we just created to another variable. Please replace #!#CredentialName#!# with the name you created the credentials under.
As soon as we have the credentials assigned, we can log in to the Azure Analysis Services instance and perform the ProcessASDatabase method. Note that the refresh type has to match one of the definitions below.

# Get the values stored in the Assets
$TenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$Credential = Get-AutomationPSCredential -Name "#!#CredentialName#!#"

# Log in to Azure Analysis Services using the Azure AD Service Principal
Add-AzureAnalysisServicesAccount `
    -Credential $Credential `
    -ServicePrincipal `
    -TenantId $TenantId `
    -RolloutEnvironment "northeurope.asazure.windows.net"

# Perform a Process Full on the Azure Analysis Services database
Invoke-ProcessASDatabase `
    -Server $AnalysisServer `
    -DatabaseName $DatabaseName `
    -RefreshType $RefreshType

Refresh Type definitions (see detailed description here):

ProcessFull, ProcessAdd, ProcessUpdate, ProcessIndexes, ProcessData, ProcessDefault, ProcessClear, ProcessStructure, ProcessClearStructureOnly, ProcessScriptCache, ProcessRecalc, ProcessDefrag
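For example (the names here are hypothetical), a test run could use DatabaseName = "adventureworks", AnalysisServer = "asazure://northeurope.asazure.windows.net/myaasserver" and RefreshType = "ProcessFull".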

Once we are at this stage, we can publish and/or test our Runbook by pressing Publish or opening the Test Pane. Note: You cannot run a Runbook that is not published.

When published, we have several options to invoke the Runbook, either by Schedule or by Webhook.

The Webhook creates a URL which we can use in other applications to invoke the Runbook. The parameter values are assigned when the Webhook is defined, which means you can have a unique URL for each set of parameters.
Note: you need to copy and store the URL generated when creating the Webhook; as the warning says, you cannot go back and retrieve it later.
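To illustrate, here is a minimal sketch of triggering such a webhook from C# (the URL is a placeholder for the one you stored; since the parameter values were bound when the webhook was created, an empty POST body is enough):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class TriggerRunbook
{
    static async Task Main()
    {
        // Placeholder: use the URL you copied when creating the Webhook.
        var webhookUrl = "https://<region>.webhook.azure-automation.net/webhooks?token=<token>";

        using var client = new HttpClient();

        // No body needed; the runbook parameters were assigned when the webhook was created.
        var response = await client.PostAsync(webhookUrl, null);

        // Azure Automation responds with 202 Accepted and a job id once the runbook is queued.
        Console.WriteLine($"{(int)response.StatusCode}: {await response.Content.ReadAsStringAsync()}");
    }
}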

Creating a Webhook

Last step is to define the parameter values. Assign the name of the Database and the Server as well as the Refresh Type you desire.

After the parameter assignment, you should end up with a final wizard page looking like this:

Once we click Create, the Webhook is live and usable.

I hope this post will be of use, or at least of inspiration to someone out there, on the great things possible in Azure.
