Power BI Community Tour

(Originally posted in Danish.)

In a little under a month (25/4-27/4), the Power BI bus rolls out and makes its first stop on the Power BI Community Tour 2022. More precisely, we begin in Lyngby, drive on to Odense the following day, and wrap up the Tour in Aarhus. So depending on where you are in the country, there will be plenty of opportunity to take part.

At each stop we will serve up an introduction and best practices within the different elements of Power BI. With talks on Introduction to Power BI, Data Loading & Mashup, Data Modeling & DAX, Data Visualization, and Distribution & Sharing, all corners will be covered.

The talks are aimed at beginners or lightly experienced users of Power BI, giving you a more confident start to your journey with Power BI.

  • Have you used Power BI, but want to know how it all fits together?
  • Have you imported some data into Power BI, but want to know how to organize your tables?
  • Have you built a Power BI report, but want to know how best to visualize the data?
  • Have you developed some reports, but want to know how to share them with your colleagues?
  • Have you never used Power BI, but would like to know why it is one of the most popular reporting and self-service BI tools?

If you answer yes to one or more of these questions, the Power BI Community Tour is for you. If not, please pass this information on to relevant colleagues!

Sign up here: https://lnkd.in/eVzcBMvp

A big thank you to JDM, Kapacity, Microsoft and Seges for providing venues and catering.


Setting up Azure Analysis Services database(s) refresh w/ Azure Automation

There are many technical ways to keep a database (to many known as a model) in Azure Analysis Services refreshed. One of them is Azure Automation, which, among other things, allows you to orchestrate processes in Azure.

Automation capabilities - src: https://docs.microsoft.com/en-us/azure/automation/automation-intro

One of the components of Azure Automation is the concept of a Runbook. A Runbook contains a script, e.g. PowerShell or a graphical representation, and can be scheduled or triggered by a webhook. A webhook is an HTTP endpoint, which means you can trigger the runbook from almost any service, application and/or device. In fact, if you can do a POST to the HTTP endpoint, you are good to go.

So really, this comes down to how you create the runbook, because once it is created, you can think up a gazillion scenarios to invoke the script. It could be Power Apps, Power Automate, Azure Data Factory or a completely different process where you need to kick off an automation script.
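As a minimal sketch of what "invoking" means here: any client that can POST can trigger the runbook. The URL below is a placeholder for the one Azure generates when you create the webhook; the response shape is the standard Azure Automation webhook reply.

```powershell
# Placeholder URL - substitute the one generated when the webhook is created
$webhookUrl = "https://<region>.webhook.azure-automation.net/webhooks?token=<token>"

# A single POST starts the runbook; here from PowerShell
$response = Invoke-RestMethod -Method Post -Uri $webhookUrl

# Azure Automation replies with the id(s) of the job(s) it started
$response.JobIds
```

The same POST could just as easily come from Power Automate's HTTP action or an Azure Data Factory Web activity.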

To complete this guide, you will need the following services created:

  • An Azure Automation Account
  • An Azure Analysis Services server

For the Azure Analysis Services model we can simply use the sample data option provided when creating a new model in Azure Analysis Services, which creates an Adventure Works sample model.

Create new Model
Choose Sample Data Model
Adventure Works Model

Now that we have our Automation Account and model ready, we can go ahead and stitch everything together.

In order to run this unattended, we will need an App Registration in our Azure Active Directory (make sure it’s in the same tenant). Microsoft has a guide here. You will need to record the Application ID (Client ID) as well as the Secret you created. With this information, our first step is to create our Automation Account Credentials in the Shared Resources section of the Automation Account.

Give the credentials a meaningful name (1), maybe even be bold and name them the same as you did when registering the App 😉. Use the Application ID (Client ID) as the user name (2) and finally the Secret as the password (3) – repeated for confirmation. Once these credentials have been set up, we can reference them from our Runbook, which is the next step.

The next step is to generate the PowerShell script that we will schedule or call from the outside via a webhook.
This is done by creating a new Runbook in the Automation Account.

Find the menu item Runbooks

Create a new Runbook: select a meaningful name and select the Runbook Type, which in our case is PowerShell. Lastly, provide the correct version of PowerShell you will be using – and make sure the correct modules are loaded; see how to manage modules here.
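A note on modules, assuming the script used in this post: the two cmdlets it calls live in separate gallery modules, and both must be imported into the Automation Account before the runbook can run.

```powershell
# Modules assumed by the runbook script - import them under the
# Automation Account's Modules section (or install locally for testing):
#   SqlServer              -> provides Invoke-ProcessASDatabase
#   Azure.AnalysisServices -> provides Add-AzureAnalysisServicesAccount
Import-Module SqlServer
Import-Module Azure.AnalysisServices
```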

And now to the actual PowerShell code.
We will begin by defining the parameters for the Runbook: DatabaseName, AnalysisServer and RefreshType. All three combined make a good starting point for dynamically exposing the option to refresh a model in Azure Analysis Services. The code looks like this:

param
(
    [Parameter (Mandatory = $false)]
    [String] $DatabaseName,
    [Parameter (Mandatory = $false)]
    [String] $AnalysisServer,
    [Parameter (Mandatory = $false)]
    [String] $RefreshType
)

This way, we can tell the Runbook from the outside which database on which server to refresh.
Then we assign the tenant ID to a variable (this could arguably be set from a Runbook variable or parameter) and assign the credentials we just created to another variable. Please replace #!#CredentialName#!# with the name you created the credentials under.
Once the credentials are assigned, we can log in to the Azure Analysis Services instance and call Invoke-ProcessASDatabase. Note that the refresh type has to match one of the definitions below.

# Get the values stored in the Assets
$TenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$Credential = Get-AutomationPSCredential -Name "#!#CredentialName#!#"

# Log in to Azure Analysis Services using the Azure AD Service Principal
Add-AzureAnalysisServicesAccount `
    -Credential $Credential `
    -ServicePrincipal `
    -TenantId $TenantId `
    -RolloutEnvironment "northeurope.asazure.windows.net"

# Process the Azure Analysis Services database with the requested refresh type
Invoke-ProcessASDatabase `
    -Server $AnalysisServer `
    -DatabaseName $DatabaseName `
    -RefreshType $RefreshType

Refresh Type definitions (see detailed description here):

ProcessFull, ProcessAdd, ProcessUpdate, ProcessIndexes, ProcessData, ProcessDefault, ProcessClear, ProcessStructure, ProcessClearStructureOnly, ProcessScriptCache, ProcessRecalc, ProcessDefrag

Once we are at this stage, we can publish and/or test our Runbook by pressing Publish or opening the Test Pane. Note: You cannot run a Runbook that is not published.

When published, we have several options to invoke the Runbook, either by Schedule or by Webhook.

The Webhook creates a URL which we can use in other applications to invoke the Runbook. The parameter values are assigned once the Webhook is defined, which means you can have a unique URL for each set of parameter values.
Note: you need to copy and store the URL generated when creating the Webhook – as the warning says, you cannot go back and retrieve it later.

Creating a Webhook

Last step is to define the parameter values. Assign the name of the Database and the Server as well as the Refresh Type you desire.

After the parameter assignment, you should end up with a final wizard page looking like this:

Once we click Create, the Webhook is live and usable.

I hope this post will be of use, or at least of inspiration, to someone out there, on the great things possible in Azure.


SqlSaturday is back in Town

SQLSaturday #963 - Denmark 2020

At last we can look forward to having a new edition of SqlSaturday in Denmark on April 25th. As previously, the event will be hosted at the Microsoft HQ in Lyngby, so the venue will be familiar to the many of you who are returning attendees.

This year we are looking at sessions in all of the following categories:

  • Analytics and Visualization
  • Application & Database Development
  • BI Platform Architecture, Development and Administration
  • Cloud Application Development and Deployment
  • Enterprise Database Administration and Deployment

Currently we have received over 100 abstracts to select from, which is always a daunting task. Luckily Bent Nissen Froning (t|b|l), Claus Lehmann Munch (t|b|l), Just Thorning Blindbæk (t|b|l), David Bojsen (t|l) are all very accomplished professionals who will be making the right choices, I am sure.

On the day before the conference, the team is offering three (3) pre-cons with world renowned professionals on:

Pre-con presenters: Rob Sewell, dbberater, Bent Nissen Froning, and Asgeir.

For more details, go to http://www.sqlsaturday.dk and book your seat today!


Extracting SSAS Tabular Models w/ PowerShell

As a response to a comment on a previous blog post on how to extract SSAS Multidimensional [MD] databases with PowerShell, I decided to write a new blog post addressing the tabular version [Tab].

The main difference when working with MD and Tab programmatically is that MD is represented by XML for Analysis while Tab uses JSON. In Management Studio this makes no difference, however, as you paste both XMLA and JSON using the same query type, XMLA (I wonder when/if that will change?).

Obviously, the two technologies MD and Tab are vastly different in almost every other aspect as well, but for the scope of this exercise, we will keep it at that.
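To make the JSON side concrete: a Tab database scripts out as a TMSL command wrapping the model definition. A heavily trimmed, illustrative createOrReplace (names and values made up for the sketch) looks roughly like this:

```json
{
  "createOrReplace": {
    "object": {
      "database": "AdventureWorks"
    },
    "database": {
      "name": "AdventureWorks",
      "compatibilityLevel": 1400,
      "model": {
        "culture": "en-US",
        "tables": []
      }
    }
  }
}
```

The MD equivalent would be an XMLA Create element; both are pasted into the same XMLA query window in Management Studio.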

Just as in the previous example, we will load assemblies in PowerShell and leverage the functionality the product team has provided. Analysis Services comes with a library of objects to programmatically access and manage an Analysis Services instance.

The namespace for MD:
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices?redirectedfrom=MSDN&view=analysisservices-dotnet

The namespace for Tab:
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular?view=analysisservices-dotnet

In this documentation, you can dig into the details of the available options. All of this is accessible from both C# and PowerShell.

Now, back to the problem at hand. We want to extract the models from one or more servers, to deploy them to another (single) server or simply persist them locally. To do this, we need to load the Tabular version of the assembly, which is the first difference from the original script. Next, we need to leverage different functionality within the assembly to export the JSON.

The script in all its simplicity 🙂

#Load the Tabular version of the assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular") >$NULL

#Add a comma separated list of servers here
$SourceServers = @( "<SOURCE SERVERS HERE>" ); #Source
#Add a single server here
$TargetServer = "<TARGET SERVER HERE>"; #Target

Clear-Host;

#Uncomment to deploy to target server
#$target = New-Object Microsoft.AnalysisServices.Tabular.Server
#$target.Connect($TargetServer)

#Loop servers
ForEach( $srv in $SourceServers ) {

    #Connect to current server
    $server = New-Object Microsoft.AnalysisServices.Tabular.Server
    $server.Connect($srv)

    #Loop all databases on current server
    ForEach( $database in $server.Databases ) {

        #Generate Create Script - Other options are available, see https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.jsonscripter?view=analysisservices-dotnet
        $json = [Microsoft.AnalysisServices.Tabular.JsonScripter]::ScriptCreate($database, $false)

        #Create path - one file per database, so nothing gets overwritten
        $Path = "<INSERT DUMP LOCATION>" + $database.Name + ".json";

        #Export the model to file
        $json | Out-File -FilePath $Path

        #Uncomment to deploy to target server
        #$target.Execute($json);
    }
}


Intelligent Cloud Conference 2019

This year the schedule for the Intelligent Cloud Conference is on a level of its own. The committee behind the conference has managed to bring in Azure and data superstars from all over the globe. So if you are new to the business, I will try to round up some of the biggest topics that will be covered at the conference. If you are a veteran, you will know the impact with just a quick glance at the schedule.

Whether you are working with a Modern Data Warehouse approach, doing Big Data Analytics, or just curious about what the cloud has to offer in your area of interest, this is the conference you will want to attend.

There will be three (3) Data Platform pre-cons on the 8th of April that will be of interest to you, especially if you are doing Power BI, Data Warehousing or Advanced Analytics.

Modern Data Warehouse: Simon Whiteley, Monday (09:00-17:00)

The number of data & analytics components available in Azure has exploded over the past couple of years – understanding which components should be in your tool-belt and what part each plays can be a daunting task, especially given the speed technology is advancing at. However, if you want to meet the challenges of the growing data landscape, you have to adopt distributed cloud architectures!

Advanced Analytics and AI: Leila Etaati, Monday (09:00-17:00)

This training is designed for data scientists and data analysts who want to do machine learning by writing R or Python code. A unique opportunity to learn from Leila Etaati.

Definitive Power BI Architecture: Reza Rad, Monday (09:00-17:00)

This workshop is designed for the data architect or administrator who is designing the architecture for leveraging Power BI in a solution – someone who wants to understand how all the components of Power BI sit beside each other to build the whole solution. This training is about the strategy of using Power BI rather than the development side of it.

If you are into Azure stuff, we’ve got you covered as well, have a look here at the complete pre-con offer.

The keynote will be hosted by none other than Rohan Kumar, CVP of Azure Data, and that is really something to look forward to as well. I’m not sure I need to tell you this is rather big, but in case you’re in doubt, check out some of the preparations going into it:

LinkedIn does not allow for links :/

After this brilliant start we have two days of awesomeness lined up for you, regardless if you are into Azure or Data Platform. I know from the planning sessions, that the Azure geeks (MVP #1, #2, #3 and #4 – Hope I left no one out) were snickering a lot, like 4th graders, going over the submissions. All should be well with the Azure tracks. 👍

Looking at the Data Platform sessions, I really have to pinch myself. I am truly proud of the job done by Just and Claus, because I cannot recall a more power-packed program at a Nordic conference, ever.

We have incredible Data Platform names like

  • Matthew Roche – Expect the unexpected
  • James Serra – Get ready for vast knowledge
  • David Peter Hansen – Insights from the trenches
  • Marco Russo – Learn from the Master
  • Simon Whiteley – Drinking from the fire hose
  • Christian Wade – “Clicky Clicky, Draggy Droppy” – enough said

and the list goes on and on…

We are covering sessions on

  • Azure SQL Database
  • Containers (and SQL Server)
  • Power BI
  • Azure Analysis Services
  • Databricks
  • Azure IoT
  • Azure Data Factory (v2)
  • Stream Analytics
  • R, AI & Cognitive Services & ML
  • Cosmos DB

If this was a party, it’s the one party you don’t want to miss…
Sign up here, now – thank me later 😉


