Supervising the PL-300 Labs at the Microsoft Business Application Cloud Week for Partners

BizApps Cloud Week

The conference is taking place April 4-8 and only requires a 2-hour commitment each day (apart from the labs). It is a virtual skilling event designed to teach attendees the practical applications of Dynamics 365 and Power Platform and prepare them for certification.
The agenda focuses on role-based certification tracks that are relevant to whatever position you pursue. The conference will feature instructor-led courses, labs, exam prep sessions, and both live and offline Q&A.
Attending this conference would enable you to build your skills and apply them to customer projects in a meaningful way that benefits both your team and your company.
Please check out the conference registration page: Business Applications Cloud Week for Partners. The conference is free to attend, with the only commitment being your time, as long as you are a Microsoft Partner.

My role in this event is supervising the labs, specifically the labs around the PL-300 course. If you are unfamiliar with the PL-300 exam, perhaps DA-100 rings a bell. The PL-300 certification replaces the DA-100 exam and introduces a set of changes. You can compare the skills tested here for PL-300 and here for DA-100. At a quick glance, the differences are as follows:

PL-300                                   | DA-100
Prepare the data (15-20%)                | Prepare the data (20-25%)
Model the data (30-35%)                  | Model the data (25-30%)
Visualize and analyze the data (25-30%)  | Visualize the data (20-25%)
Deploy and maintain assets (20-25%)      | Analyze the data (10-15%)
                                         | Deploy and maintain deliverables (10-15%)

Microsoft has not only updated the content but also moved the certification from one branch (DA) to another (PL). The DA branch contained only the DA-100 certification, as all other data-related certifications reside in the DP branch. Now it has moved in with the other Power Platform certifications in the PL branch.

Back to the labs!

In this week of Business Applications, I am supervising the labs that run as homework for the Power BI classes. These labs are similar to, but not the same as, the ones available on some of the Microsoft Learn Learning Paths. If you are not familiar with these new and powerful capabilities of the online learning experience with Microsoft Learn, you should definitely check them out.

The labs are highly specialized towards a specific learning goal and of great use when you want to fiddle about with a well-refined use case.

Setting up Azure Analysis Services database(s) refresh w/ Azure Automation

There are a lot of technical ways to achieve an updated database (called a model by many) in Azure Analysis Services. One of them is Azure Automation, which, among other things, allows you to orchestrate processes in Azure.

Automation capabilities (source: https://docs.microsoft.com/en-us/azure/automation/automation-intro)

One of the components of Azure Automation is the concept of a Runbook. A Runbook contains a script of some sort, e.g. PowerShell or a graphical representation, which can be scheduled or activated by a Webhook. A Webhook is an HTTP endpoint, which means you can activate the Runbook from almost any service, application and/or device. In fact, if you can do a POST to the HTTP endpoint, you are good to go.

So really, this comes down to how you create the Runbook, because once created, you can think up a gazillion scenarios to invoke the script. It could be Power Apps, Power Automate, Azure Data Factory, or a completely different process where you need to kick off an automation script.

To complete this guide, you will need the following services created:

  • An Azure Analysis Services server
  • An Azure Automation Account

For the Azure Analysis Services model, we can simply use the sample data option provided when creating a new model in Azure Analysis Services. This lets you select sample data, which creates an Adventure Works sample model.

  • Create new Model
  • Choose Sample Data Model
  • Adventure Works Model

Now that we have our Automation Account and model ready, we can go ahead and stitch everything together.

In order for us to run this unattended, we will need an App Registration in our Azure Active Directory (make sure it's in the same tenant). Microsoft has a guide here. You will need to record the Application ID (Client ID) as well as the Secret you have created. With this information, our first step is to create our Automation Account Credentials in the Shared Resources section of the Automation Account.
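If you prefer scripting to the portal, a minimal sketch of this step with the Az.Resources module could look like the following. The display name is a made-up example, and exactly how the generated secret is surfaced depends on your Az module version:

# A hedged sketch, assuming the Az.Resources module is installed
Connect-AzAccount -TenantId "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

# Creates an app registration plus service principal in the current tenant
$sp = New-AzADServicePrincipal -DisplayName "aas-refresh-sp"   # hypothetical name

$sp.AppId                            # record this as the Application ID (Client ID)
$sp.PasswordCredentials.SecretText   # in recent Az versions, this is the generated Secret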

Give the credentials a meaningful name (1), maybe even be bold and name them the same as you did when registering the App 😉. Use the Application ID (Client ID) as the user name (2) and the Secret as the password (3) – repeat for confirmation. Once these credentials have been set up, we can reference them from our Runbook, which is the next step.
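The same credential asset can also be created from PowerShell. A hedged sketch with the Az.Automation module, where the resource group, Automation Account, and credential names are assumptions:

# Build a PSCredential from the App Registration values and store it as a credential asset
$secret = ConvertTo-SecureString "<app-secret>" -AsPlainText -Force   # the Secret you recorded
$cred   = New-Object System.Management.Automation.PSCredential("<application-client-id>", $secret)
New-AzAutomationCredential -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -Name "AasRefreshCredential" `
    -Value $cred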

The next step is to generate the PowerShell script that we will schedule or call from the outside via a Webhook.
This is done by creating a new Runbook in the Automation Account.

Find the menu item Runbooks

Create a new Runbook, select a meaningful name, and select the Runbook Type, which in our case is PowerShell. Lastly, provide the correct version of PowerShell you will be using, and make sure the correct libraries are loaded; see how to manage the modules here.
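For reference, the Runbook itself can also be created with the Az.Automation module. A minimal sketch, assuming the same resource names as above:

# Create an empty PowerShell Runbook in the Automation Account
New-AzAutomationRunbook -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -Name "Refresh-AasDatabase" `
    -Type PowerShell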

And now to the actual PowerShell code.
We will begin by defining the parameters for the Runbook, which are DatabaseName, AnalysisServer and RefreshType. All three combined make a good starting point for a dynamic way to expose the option to refresh a model in Azure Analysis Services. The code looks like this:

param
(
    [Parameter (Mandatory = $false)]
    [String] $DatabaseName,
    [Parameter (Mandatory = $false)]
    [String] $AnalysisServer,
    [Parameter (Mandatory = $false)]
    [String] $RefreshType
)

This way, we can tell the Runbook from the outside which database on which server to refresh.
Then we will assign the tenant ID to a variable (this could arguably be set from a Runbook variable or parameter), and then we will assign the credentials we just created to another variable. Please replace #!#CredentialName#!# with the name you created the credentials under.
As soon as we have the credentials assigned, we can log in to Azure Analysis Services and call Invoke-ProcessASDatabase. Note that the refresh type has to match one of the definitions below.

# Get the values stored in the Assets
$TenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$Credential = Get-AutomationPSCredential -Name "#!#CredentialName#!#"

# Log in to Azure Analysis Services using the Azure AD Service Principal
Add-AzureAnalysisServicesAccount `
    -Credential $Credential `
    -ServicePrincipal `
    -TenantId $TenantId `
    -RolloutEnvironment "northeurope.asazure.windows.net"

# Perform the requested refresh type on the Azure Analysis Services database
Invoke-ProcessASDatabase `
    -Server $AnalysisServer `
    -DatabaseName $DatabaseName `
    -RefreshType $RefreshType

Refresh Type definitions (see detailed description here):

ProcessFull, ProcessAdd, ProcessUpdate, ProcessIndexes, ProcessData, ProcessDefault, ProcessClear, ProcessStructure, ProcessClearStructureOnly, ProcessScriptCache, ProcessRecalc, ProcessDefrag

Once we are at this stage, we can publish and/or test our Runbook by pressing Publish or opening the Test Pane. Note: you cannot run a Runbook that has not been published.
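Publishing and an ad-hoc run can be scripted as well. A hedged sketch, reusing the assumed resource names from above:

# Publish the draft, then queue a job with explicit parameter values
Publish-AzAutomationRunbook -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -Name "Refresh-AasDatabase"

Start-AzAutomationRunbook -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -Name "Refresh-AasDatabase" `
    -Parameters @{
        DatabaseName   = "adventureworks"                                      # sample model name
        AnalysisServer = "asazure://northeurope.asazure.windows.net/<server>"  # your server URI
        RefreshType    = "ProcessFull"
    }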

When published, we have several options to invoke the Runbook, either by Schedule or by Webhook.

The Webhook creates a URL which we can use in other applications to invoke the Runbook. The parameters are assigned when the Webhook is defined, which means you can have a unique URL for each set of parameters.
Note: you need to copy and store the URL generated when creating the Webhook – as the warning says, you cannot go back and retrieve it later.
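If you create the Webhook from PowerShell instead of the portal, the one-time URL is returned to you, making it easy to store right away. A sketch under the same assumed names:

# Create the Webhook with its parameter values baked in; capture the URL immediately
$webhook = New-AzAutomationWebhook -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -RunbookName "Refresh-AasDatabase" `
    -Name "RefreshAdventureWorksFull" `
    -IsEnabled $true `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Parameters @{
        DatabaseName   = "adventureworks"
        AnalysisServer = "asazure://northeurope.asazure.windows.net/<server>"
        RefreshType    = "ProcessFull"
    } `
    -Force
$webhook.WebhookURI   # store this securely; it cannot be retrieved later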

Creating a Webhook

The last step is to define the parameter values. Assign the name of the database and the server, as well as the refresh type you desire.

After the parameter assignment, you should end up with a final wizard page looking like this:

Once we click Create, the Webhook is live and usable.
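From here, anything that can send an HTTP POST can trigger the refresh. A minimal sketch from PowerShell, assuming the placeholder URL is replaced with the one you stored when creating the Webhook:

# Trigger the Runbook; the parameter values are the ones bound to this Webhook
$webhookUrl = "https://<region>.webhook.azure-automation.net/webhooks?token=<token>"
$response = Invoke-RestMethod -Method Post -Uri $webhookUrl
$response.JobIds   # the id(s) of the queued Runbook job(s)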

I hope this post will be of use, or at least of inspiration to someone out there, on the great things possible in Azure.

SqlSaturday is back in Town

SQLSaturday #963 - Denmark 2020

At last we can look forward to a new edition of SqlSaturday in Denmark on April 25th. As previously, the event will be hosted at Microsoft HQ in Lyngby, so the venue will be familiar to many of you who are returning attendees.

This year we are looking at sessions in all of the following categories:

  • Analytics and Visualization
  • Application & Database Development
  • BI Platform Architecture, Development and Administration
  • Cloud Application Development and Deployment
  • Enterprise Database Administration and Deployment

Currently we have received over 100 abstracts to select from, which is always a daunting task. Luckily Bent Nissen Froning (t|b|l), Claus Lehmann Munch (t|b|l), Just Thorning Blindbæk (t|b|l), and David Bojsen (t|l) are all very accomplished professionals who, I am sure, will be making the right choices.

On the day before the conference, the team is offering three (3) pre-cons with world-renowned professionals:

  • Rob Sewell
  • dbberater
  • Bent Nissen Froning
  • Asgeir

For more details, go to http://www.sqlsaturday.dk and book your seat today!

Intelligent Cloud Conference 2019

This year the schedule for the Intelligent Cloud Conference is at a level of its own. The committee behind the conference has managed to bring in Azure and data superstars from all over the globe. So if you are new to the business, I will try to round up some of the biggest topics that will be covered at the conference. If you are a veteran, you will know the impact with just a quick glance at the schedule.

Regardless of whether you are working with a Modern Data Warehouse approach, doing Big Data Analytics, or maybe just curious to know what the cloud has to offer in your area of interest, this conference will be the one you want to attend.

There will be three (3) Data Platform pre-cons on the 8th of April that will be of interest to you, especially if you are doing Power BI, Data Warehousing or Advanced Analytics.

Modern Data Warehouse: Simon Whiteley, Monday (09:00-17:00)

The number of data & analytics components available in Azure has exploded over the past couple of years – understanding which components should be in your tool-belt and what part each plays can be a daunting task, especially given the speed technology is advancing at. However, if you want to meet the challenges of the growing data landscape, you have to adopt distributed cloud architectures!

Advanced Analytics and AI: Leila Etaati, Monday (09:00-17:00)

This training is designed for data scientists and data analysts who want to do machine learning by writing R or Python code. A unique opportunity to learn from Leila Etaati.

Definitive Power BI Architecture: Reza Rad, Monday (09:00-17:00)

This workshop is designed for data architects or administrators who are designing the architecture for leveraging Power BI in a solution: someone who wants to understand how all the components of Power BI sit beside each other to build the whole solution. This training is designed for understanding the strategy of using Power BI rather than the development of it.

If you are into Azure stuff, we've got you covered as well; have a look here at the complete pre-con offering.

The keynote will be hosted by none other than Rohan Kumar, CVP of Azure Data, and this is really something to look forward to as well. I'm not sure I need to tell you this is rather big, but in case you're in doubt, check out some of the preparations going into it:

LinkedIn does not allow for links :/

After this brilliant start, we have two days of awesomeness lined up for you, regardless of whether you are into Azure or Data Platform. I know from the planning sessions that the Azure geeks (MVP #1, #2, #3 and #4 – hope I left no one out) were snickering a lot, like 4th graders, going over the submissions. All should be well with the Azure tracks. 👍

Looking at the Data Platform sessions, I really have to pinch myself. I am truly proud of the job done by Just and Claus, because I cannot recall a more power-packed program at a Nordic conference, ever.

We have incredible Data Platform names like

Matthew Roche – Expect the unexpected
James Serra – Get ready for vast knowledge
David Peter Hansen – Insights from the trenches
Marco Russo – Learn from the Master
Simon Whiteley – Drinking from the fire hose
Christian Wade – “Clicky Clicky, Draggy Droppy” – enough said
and the list goes on and on…

We are covering sessions on

Azure SQL Database
Containers (and SQL Server)
Power BI
Azure Analysis Services
Databricks
Azure IoT
Azure Data Factory (v2)
Stream Analytics
R, AI & Cognitive Services & ML
Cosmos DB

If this were a party, it would be the one party you don't want to miss…
Sign up here, now – thank me later 😉



Speaking at #SqlSaturday #Pittsburgh #770

On September 29th, I will be hosting two (2) sessions at the SqlSaturday event in Pittsburgh, Pennsylvania, US.

My first session will be on the basics of Power Query with some tips and tricks added to spice things up a bit. See details here.

My other session will be on Windows/Azure IoT and Raspberry Pi. For that session, I used to carry a breadboard with my own poor soldering applied, but for this event I've bought a GrovePi starter kit, which I am really excited to demonstrate as well. See more details here.

The coincidence of a SqlSaturday and the Pittsburgh Steelers playing in the same vicinity is something I first experienced almost 5 years ago, when I went to my first SqlSaturday in Cambridge (which is a smashing event, btw) and got to see my first Steelers game as they played the Minnesota Vikings at Wembley Stadium.

This time it will be at Heinz Field, taking on the Baltimore Ravens – a classic rivalry in the AFC North. I wouldn't be surprised if the final score was 2-0. But I am still very excited to go!