Setting up Azure Analysis Services database(s) refresh w/ Azure Automation

There are many ways to refresh a database (known to many as a model) in Azure Analysis Services. One of them is Azure Automation, which, among other things, lets you orchestrate processes in Azure.

Automation capabilities - src: https://docs.microsoft.com/en-us/azure/automation/automation-intro

One of the components of Azure Automation is the concept of a Runbook. A Runbook contains a script of some sort, e.g. PowerShell or a graphical representation, which can be scheduled or activated by a webhook. A webhook is an HTTP endpoint, which means you can activate the runbook from almost any service, application, and/or device. In fact, if you can do a POST to the HTTP endpoint, you are good to go.

So really, this comes down to how you create the runbook, because once created, you can think up a gazillion scenarios to invoke the script. It could be Power Apps, Power Automate, Azure Data Factory, or a completely different process where you need to kick off an automation script.
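Since the webhook is just an HTTP endpoint, triggering it can be sketched in a few lines. Here is a minimal Python sketch; the URL is a placeholder for the one Azure generates when you create the webhook later in this guide:

```python
import urllib.request

# Placeholder - substitute the URL Azure Automation generates for your webhook
WEBHOOK_URL = "https://example.webhook.azure-automation.net/webhooks?token=REPLACE_ME"

def build_invoke_request(url: str) -> urllib.request.Request:
    """Build the POST request that triggers the runbook.

    No request body is required when the runbook parameters are
    fixed at webhook-creation time.
    """
    return urllib.request.Request(url, data=b"", method="POST")

req = build_invoke_request(WEBHOOK_URL)
print(req.method)  # POST
# To actually fire it: urllib.request.urlopen(req)
```

Any client that can issue a POST (curl, Power Automate, a Data Factory web activity) follows the same pattern.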

To complete this guide, you will need an Azure Analysis Services server and an Azure Automation Account created.

For the Azure Analysis Services model, we can simply use the sample data option provided when creating a new model in Azure Analysis Services. Selecting the sample data creates an Adventure Works sample model.

Create new Model
Choose Sample Data Model
Adventure Works Model

Now that we have our Automation Account and model ready, we can go ahead and stitch everything together.

In order for us to run this unattended, we will need an App Registration in our Azure Active Directory (make sure it’s in the same tenant). Microsoft has a guide here. You will need to record the Application ID (Client ID) as well as the Secret you have created. With this information, our first step is to create our Automation Account Credentials in the Shared Resources section of the Automation Account.

Give the credentials a meaningful name (1), maybe even be bold and name them the same as you did when registering the App 😉. Use the Application ID (Client ID) as the user name (2), and the Secret as the password (3) – repeat for confirmation. Once these credentials have been set up, we can reference them from our Runbook, which is the next step.

The next step is to create the PowerShell script that we will schedule or call externally via a webhook.
This is done by creating a new Runbook in the Automation Account.

Find the menu item Runbooks

Create a new Runbook: select a meaningful name and select the Runbook Type, which in our case is PowerShell. Lastly, provide the correct version of PowerShell you will be using – and make sure the correct modules are loaded; see how to manage modules here.

And now to the actual Powershell code.
We will begin by defining the parameters for the Runbook: DatabaseName, AnalysisServer, and RefreshType. All three combined make a good starting point for a dynamic way to expose the option to refresh a model in Azure Analysis Services. The code looks like this:

param
(
    [Parameter (Mandatory = $false)]
    [String] $DatabaseName,
    [Parameter (Mandatory = $false)]
    [String] $AnalysisServer,
    [Parameter (Mandatory = $false)]
    [String] $RefreshType
)

This way, we can tell the Runbook from the outside which database on which server to refresh.
Then we will assign the tenant id to a variable (this could arguably be set from a Runbook variable or parameter), and assign the credentials we just created to another variable. Please replace #!#CredentialName#!# with the name you gave the credentials.
As soon as we have the credentials assigned, we can log in to the Azure Analysis Services instance and invoke Invoke-ProcessASDatabase. Note that the refresh type has to match one of the definitions listed below.

# Get the values stored in the Assets
$TenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$Credential = Get-AutomationPSCredential -Name "#!#CredentialName#!#"

# Log in to Azure Analysis Services using the Azure AD Service Principal
Add-AzureAnalysisServicesAccount `
    -Credential $Credential `
    -ServicePrincipal `
    -TenantId $TenantId `
    -RolloutEnvironment "northeurope.asazure.windows.net"

# Perform a Process Full on the Azure Analysis Services database
Invoke-ProcessASDatabase `
    -Server $AnalysisServer `
    -DatabaseName $DatabaseName `
    -RefreshType $RefreshType

Refresh Type definitions (see detailed description here):

ProcessFull, ProcessAdd, ProcessUpdate, ProcessIndexes, ProcessData, ProcessDefault, ProcessClear, ProcessStructure, ProcessClearStructureOnly, ProcessScriptCache, ProcessRecalc, ProcessDefrag

Once we are at this stage, we can publish and/or test our Runbook by pressing Publish or opening the Test Pane. Note: You cannot run a Runbook that is not published.

When published, we have several options to invoke the Runbook, either by Schedule or by Webhook.

The Webhook creates a URL which we can use from other applications to invoke the Runbook. The parameter values are assigned when the Webhook is defined, which means you can have a unique URL for each set of parameter values.
Note: you need to copy and store the URL generated when creating the Webhook – as the warning says, you cannot go back and retrieve it later.

Creating a Webhook

The last step is to define the parameter values. Assign the name of the database and the server, as well as the refresh type you desire.

After the parameter assignment, you should end up with a final wizard page looking like this:

Once we click Create, the Webhook is live and usable.
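When the POST succeeds, Azure Automation responds with a small JSON body listing the id(s) of the runbook job(s) it started, which you can use to correlate with the job logs in the Automation Account. A minimal sketch of handling that response in Python (the sample payload mirrors the documented `JobIds` shape):

```python
import json

def extract_job_ids(response_body: str) -> list:
    """Pull the started job ids out of an Azure Automation webhook response.

    A successful invocation returns a payload like {"JobIds": ["<guid>"]},
    one id per runbook job started.
    """
    return json.loads(response_body).get("JobIds", [])

# Example of a successful webhook response body
sample = '{"JobIds":["00000000-0000-0000-0000-000000000000"]}'
print(extract_job_ids(sample))  # ['00000000-0000-0000-0000-000000000000']
```

Holding on to the job id makes it easy to check afterwards whether the refresh actually completed.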

I hope this post will be of use, or at least of inspiration to someone out there, on the great things possible in Azure.


SqlSaturday is back in Town

SQLSaturday #963 - Denmark 2020

At last we can look forward to a new edition of SqlSaturday in Denmark on April 25th. As previously, the event will be hosted at Microsoft HQ in Lyngby, so the venue will be familiar to the many of you who are returning attendees.

This year we are looking at sessions in all of the following categories:

  • Analytics and Visualization
  • Application & Database Development
  • BI Platform Architecture, Development and Administration
  • Cloud Application Development and Deployment
  • Enterprise Database Administration and Deployment

Currently we have received over 100 abstracts to select from, which is always a daunting task. Luckily Bent Nissen Froning (t|b|l), Claus Lehmann Munch (t|b|l), Just Thorning Blindbæk (t|b|l), and David Bojsen (t|l) are all very accomplished professionals who I am sure will be making the right choices.

On the day before the conference, the team is offering three (3) pre-cons with world renowned professionals on:

  • Rob Sewell
  • dbBerater
  • Bent (Nissen Pedersen) Nissen Froning
  • Asgeir (MVP)

For more details, go to http://www.sqlsaturday.dk and book your seat today!


Intelligent Cloud Conference 2019

This year the schedule for the Intelligent Cloud Conference is at a level of its own. The committee behind the conference has managed to bring in Azure and Data superstars from all over the globe. So if you are new to the business, I will try to round up some of the biggest topics that will be covered at the conference. If you are a veteran, you will know the impact with just a quick glance at the schedule.

Whether you are working with a Modern Data Warehouse approach, doing Big Data Analytics, or just curious to know what the cloud has to offer in your area of interest, this is the conference you’ll want to attend.

There will be three (3) Data Platform pre-cons on the 8th of April that will be of interest to you, especially if you are doing Power BI, Data Warehousing, or Advanced Analytics.

Modern Data Warehouse: Simon Whiteley, Monday (09:00-17:00)

The number of data & analytics components available in Azure has exploded over the past couple of years – understanding which components should be in your tool-belt and what part each plays can be a daunting task, especially given the speed technology is advancing at. However, if you want to meet the challenges of the growing data landscape, you have to adopt distributed cloud architectures!

Advanced Analytics and AI: Leila Etaati, Monday (09:00-17:00)

This training is designed for data scientists and data analysts who want to do machine learning by writing R or Python code. A unique opportunity to learn from Leila Etaati.

Definitive Power BI Architecture: Reza Rad, Monday (09:00-17:00)

This workshop is designed for data architects or administrators who are designing the architecture for leveraging Power BI in a solution – someone who wants to understand how all the components of Power BI sit beside each other to build the whole solution. This training is about the strategy of using Power BI rather than the development of it.

If you are into Azure stuff, we’ve got you covered as well, have a look here at the complete pre-con offer.

The keynote will be hosted by none other than Rohan Kumar, CVP of Azure Data, which is really something to look forward to as well. I’m not sure I need to tell you this is rather big, but in case you’re in doubt, check out some of the preparations going into it:

LinkedIn does not allow for links :/

After this brilliant start, we have two days of awesomeness lined up for you, whether you are into Azure or the Data Platform. I know from the planning sessions that the Azure geeks (MVP #1, #2, #3 and #4 – I hope I left no one out) were snickering a lot, like 4th graders, going over the submissions. All should be well with the Azure tracks. 👍

Looking at the Data Platform sessions, I really have to pinch myself. I am truly proud of the job done by Just and Claus, because I cannot recall a more power-packed program at a Nordic conference, ever.

We have incredible Data Platform names like

Matthew Roche – Expect the unexpected
James Serra – Get ready for vast knowledge
David Peter Hansen – Insights from the trenches
Marco Russo – Learn from the Master
Simon Whiteley – Drinking from the fire hose
Christian Wade – “Clicky Clicky, Draggy Droppy” – enough said
and the list goes on and on…

We are covering sessions on

Azure SQL Database
Containers (and SQL Server)
Power BI
Azure Analysis Services
Databricks
Azure IoT
Azure Data Factory (v2)
Stream Analytics
R, AI & Cognitive Services & ML
Cosmos DB

If this was a party, it’s the one party you don’t want to miss…
Sign up here, now – thank me later 😉




Speaking at #SqlSaturday #Pittsburgh #770

On September 29th, I will be hosting two (2) sessions at the SqlSaturday event in Pittsburgh, Pennsylvania, US.

My first session will be on the basics of Power Query with some tips and tricks added to spice things up a bit. See details here.

My other session will be on Windows/Azure IoT and Raspberry Pi. For that session, I used to carry a breadboard with my own poor soldering applied, but for this event I’ve bought a Grove Pi starter kit which I am really excited to demonstrate as well. See more details here.

The coincidence of a SqlSaturday and the Pittsburgh Steelers playing in the same vicinity is something I first experienced almost 5 years ago, when I went to my first SqlSaturday in Cambridge (which is a smashing event, btw) and got to see my first Steelers game as they played the Minnesota Vikings at Wembley Stadium.

This time it will be at Heinz Field, taking on the Baltimore Ravens – a classic arch-rivalry in the AFC North. I wouldn’t be surprised if the final score was 2-0. But I’m still very excited to go!


Speaking at the Power BI World Tour in Copenhagen

In just a little over a week, the Power BI World Tour will be stopping by Copenhagen, Denmark – more precisely, Microsoft HQ in Lyngby. Copenhagen was luckily re-elected to host the World Tour again this year, which I think will be very beneficial to the local Power BI adoption and community, to say the least.

Copenhagen

Last time around I was honoured to have one of my sessions selected. This year I get to have two sessions. I am super excited about that!

My first session, on Tuesday the 11th, will be about Power BI Best Practices. From the trenches of some of our own projects, I have gathered a list of things to do in a project to make your life easier.

My other session will be an Introduction to Power BI and Power Query (M). The query language is one of the few things of late that has blown my mind in terms of capability and versatility. I will show you how to get started with the basics.

I am so looking forward to spending a couple of days with other Power BI Professionals, foreign and domestic!
