Power BI Community Tour

Blog post in Danish 🙂

In a little under a month (25/4–27/4), the Power BI bus rolls out and makes its first stop on the Power BI Community Tour 2022. More precisely, we start in Lyngby, drive on to Odense the following day, and round off the Tour in Aarhus. So depending on where in the country you are, there will be a good opportunity to attend.

At each stop, we will serve up introductions and best practices within the different elements of Power BI. With talks on Introduction to Power BI, Data Loading & Mashup, Data Modeling & DAX, Data Visualization, and Distribution & Sharing, all corners will be covered.

The talks are aimed at beginners or only lightly experienced users of Power BI, and they will give you a more confident start to your journey with Power BI.

  • Have you used Power BI, but still need to know how it all fits together?
  • Have you imported some data into Power BI, but still need to know how to organize your tables?
  • Have you built a Power BI report, but still need to know how best to visualize the data?
  • Have you developed some reports, but still need to know how to share them with your colleagues?
  • Have you never used Power BI, but would like to know why it is one of the most popular reporting and self-service BI tools?

If you answer yes to one or more of these questions, the Power BI Community Tour is for you. If not – then please pass this information on to relevant colleagues!

Sign up here: https://lnkd.in/eVzcBMvp

A big thank you to JDM, Kapacity, Microsoft, and Seges for providing venues and catering.


Setting up Azure Analysis Services database(s) refresh w/ Azure Automation

There are many technical ways to achieve an updated database (to many known as a model) in Azure Analysis Services. One of them is Azure Automation, which, among other things, allows you to orchestrate processes in Azure.

Automation capabilities - src: https://docs.microsoft.com/en-us/azure/automation/automation-intro

One of the components of Azure Automation is the concept of a Runbook. A Runbook contains some sort of script, e.g. PowerShell or a graphical representation, which can be scheduled or activated by a Webhook. A webhook is an HTTP endpoint, which means you can activate the runbook from almost any service, application and/or device. In fact, if you can do a POST to the HTTP endpoint, you are good to go.

So really, this comes down to how you create the runbook, because once created, you can think up a gazillion scenarios to invoke the script. Could be Power Apps, could be Power Automate, could be Azure Data Factory or a completely different process where you need to kick off an automation script.
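
As a small illustration of how low that bar is, here is a minimal sketch of invoking such a webhook from PowerShell. The URL is a placeholder for the one generated when the webhook is created, which we will get to later in this post:

$webhookUrl = "https://<REGION>.webhook.azure-automation.net/webhooks?token=<TOKEN>" # placeholder
# POST to the webhook endpoint; no body is required to simply trigger the runbook
$response = Invoke-RestMethod -Method Post -Uri $webhookUrl
# The response should contain the id(s) of the runbook job(s) that were queued
$response.JobIds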

To complete this guide, you will need the following services created:

  • An Azure Automation Account
  • An Azure Analysis Services server (with a model)

For the Azure Analysis Services model, we can simply use the sample data option provided when creating a new model in Azure Analysis Services. Selecting the sample data creates an Adventure Works sample model.

Create new Model
Choose Sample Data Model
Adventure Works Model

Now that we have our Automation Account and model ready, we can go ahead and stitch everything together.

In order for us to run this unattended, we will need an App Registration in our Azure Active Directory (make sure it’s in the same tenant). Microsoft has a guide here. You will need to record the Application ID (Client ID) and also the Secret you have created. With this information, our first step is to create our Automation Account Credentials in the Shared Resources section of the Automation Account.

Give the credentials a meaningful name (1) – maybe even be bold and name them the same as you did when registering the App 😉. Use the Application ID (Client ID) as the user name (2), and finally the Secret as the password (3) – repeat for confirmation. Once these credentials have been set up, we can reference them from our Runbook, which is the next step.
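
If you prefer scripting over clicking, the same shared credential can, to my knowledge, also be created with the Az.Automation module. A minimal sketch, where the resource group, account and credential names are hypothetical placeholders:

# Build a PSCredential from the App Registration's Client ID and Secret
$secret = ConvertTo-SecureString "<APP SECRET HERE>" -AsPlainText -Force
$appCredential = New-Object System.Management.Automation.PSCredential("<APPLICATION (CLIENT) ID HERE>", $secret)

# Store it as a shared credential in the Automation Account
New-AzAutomationCredential `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "MyAppRegistration" `
    -Value $appCredential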

The next step is to generate the PowerShell script that we will schedule or call from the outside via a webhook.
This is done by creating a new Runbook in the Automation Account.

Find the menu item Runbooks

Create a new Runbook: give it a meaningful name and select the Runbook Type, which in our case is PowerShell. Lastly, provide the correct version of PowerShell you will be using – and make sure the correct libraries are loaded; see how to manage the modules here.
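
For completeness, the Runbook shell can also be created with the Az.Automation module. A minimal sketch, where the resource group, account and runbook names are hypothetical placeholders (I will reuse the runbook name Refresh-AsDatabase in the sketches below):

# Create an empty PowerShell runbook in the Automation Account
New-AzAutomationRunbook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "Refresh-AsDatabase" `
    -Type "PowerShell"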

And now to the actual PowerShell code.
We will begin by defining the parameters for the Runbook: DatabaseName, AnalysisServer and RefreshType. All three combined make a good starting point for a dynamic way to expose the option to refresh a model in Azure Analysis Services. The code looks like this:

param
(
    # Name of the Azure Analysis Services database (model) to refresh
    [Parameter (Mandatory = $false)]
    [String] $DatabaseName,
    # Server name, e.g. asazure://northeurope.asazure.windows.net/<SERVER NAME>
    [Parameter (Mandatory = $false)]
    [String] $AnalysisServer,
    # One of the refresh types listed below, e.g. ProcessFull
    [Parameter (Mandatory = $false)]
    [String] $RefreshType
)

This way, we can tell the Runbook from the outside which database on which server to refresh.
Then we assign the tenant id to a variable (this could arguably be set from a Runbook variable or parameter), and assign the credentials we just created to another variable. Please replace #!#CredentialName#!# with the name you created the credentials under.
As soon as we have the credentials assigned, we can log in to the Azure Analysis Services instance and perform the ProcessASDatabase method. Note that the refresh type has to match one of the definitions below.

# Get the values stored in the Assets
$TenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$Credential = Get-AutomationPSCredential -Name "#!#CredentialName#!#"

# Log in to Azure Analysis Services using the Azure AD Service Principal
Add-AzureAnalysisServicesAccount `
    -Credential $Credential `
    -ServicePrincipal `
    -TenantId $TenantId `
    -RolloutEnvironment "northeurope.asazure.windows.net"

# Perform a Process Full on the Azure Analysis Services database
Invoke-ProcessASDatabase `
    -Server $AnalysisServer `
    -DatabaseName $DatabaseName `
    -RefreshType $RefreshType

Refresh Type definitions (see detailed description here):

ProcessFull, ProcessAdd, ProcessUpdate, ProcessIndexes, ProcessData, ProcessDefault, ProcessClear, ProcessStructure, ProcessClearStructureOnly, ProcessScriptCache, ProcessRecalc, ProcessDefrag
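
Since the RefreshType value has to match one of these exactly, you could optionally harden the Runbook parameter with a ValidateSet attribute, so a typo fails fast instead of at processing time. A sketch of just that parameter:

param
(
    # Restrict the refresh type to the valid definitions listed above
    [Parameter (Mandatory = $false)]
    [ValidateSet("ProcessFull", "ProcessAdd", "ProcessUpdate", "ProcessIndexes",
                 "ProcessData", "ProcessDefault", "ProcessClear", "ProcessStructure",
                 "ProcessClearStructureOnly", "ProcessScriptCache", "ProcessRecalc", "ProcessDefrag")]
    [String] $RefreshType = "ProcessDefault"
)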

Once we are at this stage, we can publish and/or test our Runbook by pressing Publish or opening the Test Pane. Note: You cannot run a Runbook that is not published.
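
If you deploy through scripts, publishing can, as far as I know, also be done with the Az.Automation module. A minimal sketch, reusing the hypothetical names from above:

# Publish the draft runbook so it can be run, scheduled or called via webhook
Publish-AzAutomationRunbook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "Refresh-AsDatabase"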

When published, we have several options to invoke the Runbook, either by Schedule or by Webhook.

The Webhook creates a URL which we can use from other applications to invoke the Runbook. The parameter values are assigned when the Webhook is defined, which means you can have a unique URL for each set of parameter values you need.
Note: you need to copy and store the URL generated when creating the Webhook – as the warning says, you cannot go back and retrieve it later.
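
The webhook itself can also be created with the Az.Automation module, which makes the one-time nature of the URL very explicit. A minimal sketch, again with hypothetical resource names:

# Create the webhook with a fixed set of runbook parameter values
$webhook = New-AzAutomationWebhook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -RunbookName "Refresh-AsDatabase" `
    -Name "RefreshAdventureWorks" `
    -IsEnabled $true `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Parameters @{
        DatabaseName   = "adventureworks"
        AnalysisServer = "asazure://northeurope.asazure.windows.net/<SERVER NAME>"
        RefreshType    = "ProcessFull"
    } `
    -Force

# Store this value now – it cannot be retrieved again later
$webhook.WebhookURI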

Creating a Webhook

The last step is to define the parameter values. Assign the name of the database and the server, as well as the refresh type you desire.

After the parameter assignment, you should end up with a final wizard page looking like this:

Once we click Create, the Webhook is live and usable.

I hope this post will be of use, or at least of inspiration, to someone out there, on the great things possible in Azure.


Extracting SSAS Tabular Models w/ PowerShell

As a response to a comment on a previous blog post on how to extract SSAS Multidimensional [MD] databases with PowerShell, I decided to write a new blog post to address the Tabular version [Tab].

The main difference when working with MD and Tab programmatically is that MD is represented by XML for Analysis, while Tab uses JSON. In Management Studio this makes no difference, however, as you paste XMLA and JSON using the same query type: XMLA (I wonder when/if that will change?).

Obviously, the two technologies are vastly different in almost every other aspect as well, but for the scope of this exercise, we will leave it at that.

Just as in the previous example, we will use the ability to load assemblies in PowerShell and leverage the functionality the product team has provided. Analysis Services ships with a library of objects to programmatically access and manage an Analysis Services instance.

The namespace for MD:
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices?redirectedfrom=MSDN&view=analysisservices-dotnet

The namespace for Tab:
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular?view=analysisservices-dotnet

In this documentation, you can dig into the details of the options available. All of this is accessible from both C# and PowerShell.

Now, back to the problem at hand. We want to extract the models from one or more servers, to deploy them to another (single) server or even just to persist them locally. To do this, we need to load the Tabular version of the assembly, which is the first difference from the original script. Next, we need to leverage different functionality within the assembly to export the JSON.

The script in all its simplicity 🙂

#Load the Tabular version of the assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular") >$NULL

#Add a comma separated list of servers here
$SourceServers = @( "<SOURCE SERVERS HERE>" ); #Source
#Add a single server here
$TargetServerName = "<TARGET SERVER HERE>"; #Target

Clear-Host;

#Uncomment to deploy to target server
#$TargetServer = New-Object Microsoft.AnalysisServices.Tabular.Server
#$TargetServer.Connect($TargetServerName);

#Loop servers
ForEach( $srv in $SourceServers ) {

    #Connect to current server
    $server = New-Object Microsoft.AnalysisServices.Tabular.Server
    $server.Connect($srv)

    #Loop all databases on current server
    ForEach( $database in $server.Databases ) {

        #Generate Create Script - Other options are available, see https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.jsonscripter?view=analysisservices-dotnet
        $json = [Microsoft.AnalysisServices.Tabular.JsonScripter]::ScriptCreate($database, $false)

        #Create Path - include the database name, so each model gets its own file
        $Path = "<INSERT DUMP LOCATION HERE>" + $database.Name + ".json";

        #Export the model to file
        $json | Out-File -FilePath $Path

        #Uncomment to deploy to target server
        #$TargetServer.Execute($json);
    }
}
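
Once the files are on disk, deploying one of them to a target instance can be done by executing the TMSL script directly. A minimal sketch using Invoke-ASCmd from the SqlServer module (assuming that module is installed; the path and server name are placeholders):

#Read the exported TMSL script and execute it against the target server
Import-Module SqlServer
$tmsl = Get-Content -Path "<INSERT DUMP LOCATION AND FILE NAME>.json" -Raw
Invoke-ASCmd -Server "<TARGET SERVER HERE>" -Query $tmsl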


Speaking at #SqlSaturday #Pittsburgh #770

On September 29th, I will be hosting two (2) sessions at the SqlSaturday event in Pittsburgh, Pennsylvania, US.

My first session will be on the basics of Power Query with some tips and tricks added to spice things up a bit. See details here.

My other session will be on Windows/Azure IoT and Raspberry Pi. For that session, I used to carry a breadboard with my own poor soldering applied, but for this event I’ve bought a GrovePi starter kit which I am really excited to demonstrate as well. See more details here.

The coincidence of a SqlSaturday and the Pittsburgh Steelers playing in the same vicinity is something I first experienced almost 5 years ago, when I went to my first SqlSaturday in Cambridge (which is a smashing event, btw) and got to see my first Steelers game as they played the Minnesota Vikings at Wembley Stadium.

This time it will be at Heinz Field, taking on the Baltimore Ravens – a classic arch-rivalry in the AFC North. I wouldn’t be surprised if the final score was 2-0. But still very excited to go!


#TSQL2SDAY #101 Round-Up : My Essential SQL Server Tools

On Tuesday the 3rd of this month, I invited people in the SQL Server community to share which tools are essential to their daily work. I was really overwhelmed by the number of stories the topic triggered. 22 people in total took the time to write down and share which tools they use for their work chores.
Going through 22 posts and aggregating them has taken more time than I had hoped for, since my trusted laptop broke down – blinking codes are alive and well, I tell you!

Going through the lot, I found some similarities among the posts and have categorized them accordingly. But first off, a BIG thank you to all who participated!

Without further ado, here goes.

Relational Heavy Lifting

Kamil Nowinski (b|l|t) takes us through the classic stuff, I mean, the real classic stuff – some would call it vintage – by showing how Total Commander still has a place in the tool belt, this century 😉

Matthew McGiffen (b|l) shows how to set up a Central Management Server, in order to execute queries against multiple instances in ad-hoc scenarios, seamlessly. Very nice tip. Matthew also did a second post, lining up multiple tools he’s written about in the past, nicely aggregated in this post.

Jess Pomfret (b|l|t) does a really nice post on how PowerShell and dbatools have changed her way of working. Jess even provides some useful snippets to get you going. I share the same enthusiasm for PowerShell as Jess does, and was very pleased to see homage paid to dbatools – incredible tool. Best of luck on your speaking adventures!

Marek Masko (b|l|t) has a huge post on classic DBA tools as well as a pleasant surprise on testing using tSQLt. There are also some good pointers to free community scripts and tools. Great read!

Tracy Boggiano (b|l|t) covers dbatools, a specific PowerShell command and a T-SQL stored procedure, but also Telegraf and VS Code.

Dan Clemmens (b|l|t) goes all in on DBA tools for statistics, execution plans and tracing, even including the legendary diagnostic scripts from Glenn Berry.

Steve Jones (b|l|t) has a huge list of free and paid tools, from SQL Server centric tools to a good deal of process related tools – i.e. DevOps and such.
Steve also manages to sneak in a reminder about the password thingy magicky that, according to domain expert Troy Hunt, we should all rely on, be it pwsafe or any other tool like that.

Doug Purnell (b|l|t) is short and concise in his praise of Ola Hallengren’s maintenance scripts and Jared Zagelbaum’s extension of those in PowerShell.

Warren Estes (b|l) is praising the usual suspects in the DBA field, but adds a couple of interesting options for productivity and benchmarking/testing and also rounds up a couple of SentryOne products.

Devon Leann Ramirez (b|l|t) is offering a thorough introduction to their free Plan Explorer offering. Devon also makes a good point of marking the company’s presence in the community. If you want the quick tour, head over to Vimeo.

Rob Farley (b|l|t) talks about two things I really hold dear; Coffee… and I forgot the other thing. No really, Rob has an excellent blog post on Live Query Stats (LQS), and what some of the use cases are for that feature/tool. There are more ways of using LQS than I had thought about – thanks for sharing!

Riley Major (b|l|t) shares his story on how he works with Management Studio and how the tool could be improved to further support a common way of working. Besides the tips on SSMS, Riley also lists his favorite (and not so favorite) tools.

The BI Power in Power BI

James McGillivray (b|l|t) is first and foremost writing about my trusted travel mate, the Kindle (app), as his favored tool of the trade. Besides the treasure trove that books can be, James also has some pointers to daxformatter and a theme generator, which is pretty hefty!

Community Zone

Jo Douglas (b|l|t) argues that the most important tool for any professional is networking and community, and it’s hard not to agree completely. Jo also offers some great pointers on where to begin this journey.

Jason Brimhall (b|l|t) brings up the aspect of blogging itself as a great tool of the trade, and I have to agree here; I couldn’t have stated it more clearly than Jason:

Blogging helps you become a better technical person.

Googlefoo is also described in Jason’s blog for this party, and he manages to sneak in a reference to his extensive work and blogging on Extended Events, which in itself is an awesome contribution.

Reid DeWolfe (b|l|t) offers a quick write up on classic DBA must haves; SQL Prompt, Plan Explorer and GitHub/SourceTree. Reid also describes some of the main benefits of the tools.

Other

Garland MacNeil (b|t) brings another perspective into the party, by writing from a borrowed laptop – not sure it was intentional, but I guess the exercise is very rich in terms of learning. I know others have been there too:

https://twitter.com/_AlexYates_/status/986137414723866625

Chrissy LeMaire (b|l|t) has, surprisingly enough, not written about dbatools – and if you believe that, you may call me Bill 🙂
In Chrissy’s blog post you’ll find a great list of auxiliary tools for all the things you do around programming: screenshot/image handling, code repositories, clipboard management and video editing tools.

Josh (b) gives us the DevOps perspective of a DBA/database developer in a not so uncommon scenario – well, I think we’ve all been there at some point. Some prominent 3rd party tooling is getting some ❤

The Other Tools

Catherine Wilhelmsen (b|l|t) offers a completely different and refreshing view on tools that were completely new to me, at least. Going from database modeling to data generators to time keeping tools and beyond.

Finally, No Tools

Hugo Kornelis (b|l|t) makes a good argument on not becoming addicted/dependent on the presence of certain tools in order to perform your job. I guess this applies in particular, when you’re a consultant and can’t always BYOD. Apart from that Hugo really likes SQL Prompt and Plan Explorer 😉


The Tools Mentioned (in no particular order)

dbatools, PowerShell, dbareports, SQL Server Management Studio, Redgate SQL Compare, Minionware, SentryOne Plan Explorer, dbachecks, SQL Operations Studio, SQL Database Modeler, Dynamic Restore Script, Scooter Software Beyond Compare, Redgate DLM Dashboard, Ola Hallengren’s maintenance scripts, Trello, SQL Server Data Tools, Password Safe, Sublime Text, Notepad++, Redgate SQL Prompt, Mockaroo, Dropbox, Visual Studio Code, SQLCover, Sourcetree, SQLNexus, Coblis, tSQLt, Advanced Theme Generator (Power BI), DAX Formatter, RStudio, ScaleSQL ClearTrace, PSSDiag, Devart’s dbForge, Toggl, Grammarly, SCD Merge Wizard, Statistics Parser, Adam Machanic’s sp_whoisactive, WinMerge, Mythicsoft Agent Ransack, Redgate SQL Search, Glenn Berry’s Diagnostic Scripts
