SqlSaturday Cambridge, #SqlSat228, recap

SQL Saturday

This past weekend was one of the most exciting weekends so far this year. In case my wife is reading: this is of course in a different league, and nothing like when you gave birth to our third child just four months ago 🙂

In the summer of 2013, it was announced that the Minnesota Vikings would host the Pittsburgh Steelers at Wembley Stadium in London, UK. I was really excited: the opportunity to actually see the team I root for live would be within reach, with them playing a venue this close. I quickly realised that a SQL Saturday would be held in Cambridge just the day before. As I described in my post here on getting the mail from Mark Broadbent (b|t), it would require all of my energy channeled into a single burst to be able to attend both events.

But here I am, having just had an amazing weekend in both Cambridge and London. I flew from Malmö in Sweden to Stansted and took the train from there, arriving just in time for the speaker dinner. The venue was amazing, and I think everyone was a bit humbled by the surroundings in the great dining hall. That feeling faded as more wine was served and the chatter grew louder 🙂

Speaker Dinner

I got up first thing Saturday morning to get ready for the first session, an In-Memory OLTP introduction by Anthony Saxby (t). I didn't get up as early as some of the other attendees, Rob Sewell (b|t) reporting he got up at 04:00, but it still felt early. After the first session, I just wanted to run through my own session, which wasn't until the afternoon. But better safe than sorry. So when I tried to plug my laptop into the adapter I brought, I found out it didn't fit. Since I bought the adapter, I've gotten a new laptop with a new plug that didn't match. "OK, you've got time," I thought, and went through my bags to find some kind of pliers to make room for the new plug. Of course I didn't bring any pliers! I was flying Ryanair, where you are only allowed to carry 10 kg of hand luggage. After thinking it through, I went to the reception at Queens' College and asked for any tool that would make adapter and plug fit. The guy behind the counter brought the full toolkit, so I finally got it fixed and went back to my room to prepare. I had enough time to get one test run done.

My presentation went really well, at least I myself think so 🙂 Except for the guy snoring in the front row, people seemed to listen to what I was trying to bring forward. Front-row guy was excused, as this was the second-to-last session of the day. After the session, a couple of attendees approached with some questions and shared their own experience with AMO. It's always nice to chat on a subject where you get the distinct feeling that the other party has put heart and soul into it.

Around 17:00 all the attendees, speakers and volunteers gathered in the big auditorium for some prizes 🙂
There were actually some really great prizes among the lot: a Nokia Lumia 1020, all PASS Summit 2012 sessions on disc, great books and other really cool stuff. Last but not least, I should mention Buck Woody, who did a fantastic job of bringing a game show atmosphere into the room. Apparently the trip was worth it for Rob, since he left with one of the prizes!

Buck Woody - Buck

The evening offered a really late-night snack at Zizzi (French people giggle every time they go by) in the company of several of the volunteers, speakers and attendees. Here's a shot from the end of the table.

Dinner Saturday Night

And on that final note, a great event ended. A huge amount of work was invested by Mark and the rest of the team. Kudos on a job well done!


Next morning, it was off to London to get ready for the game at Wembley. A great game, though not the result I'd hoped for, and an appalling display of fan care by the players. I went straight to the hotel, then off to Wembley Stadium. I had bought a ticket on eBay and was kinda nervous that it could turn out to be fake. It wasn't 😁

Wembley Stadium

Since the Minnesota Vikings got the better of the Steelers, you could imagine some sort of disappointment would show on the players. But the way Big Ben, Coach Tomlin and Brett Keisel all went right past the fans, without even looking in their direction, was way below the professional level I would have expected. Troy Polamalu was the only player I saw who actually engaged with the fans. Big thumbs up for that 🙂
Troy Polamalu #43
So, all in all the Sunday ended well. I got to see some of the top players of the league and, last but not least, a great stadium.
Wembley by Night

Using PowerShell to view Partition Slice Information in Excel


In a blog post written by Chris Webb (b|t) years ago, Chris shows how to query ASSP.DiscoverPartitionSlices(…). I stumbled upon the post while investigating solutions to problems with partition slicing. But since there were about 20 cubes, each containing several measure groups, I decided to let PowerShell do the bulk of the work.

Data collection

I created a script that loops through all databases on a given server, loops through all measure groups in every cube and records the result of calling ASSP.DiscoverPartitionSlices(…). The result is then dumped in a .CSV file at a location of your choosing, and the data is ready to be explored.

#Import assemblies
[System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.AnalysisServices' ) | Out-Null;
[System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.AnalysisServices.AdomdClient' ) | Out-Null;

#Set the server to explore
$ServerName = "";

#Instantiate and connect
$Server = New-Object( 'Microsoft.AnalysisServices.Server' );
$Server.Connect( $ServerName );
$StringBuilder = New-Object( 'System.Text.StringBuilder' );

#Loop the databases
foreach( $Database in $Server.Databases ) {
    #Create and open a connection to the current database
    $Con = New-Object( 'Microsoft.AnalysisServices.AdomdClient.AdomdConnection' );
    $Con.ConnectionString = [string]::Format( "Provider=MSOLAP.4;Data Source={0};Initial Catalog={1};Connect Timeout=0", $ServerName, $Database.Name );
    $Con.Open();
    #Loop the cubes
    foreach( $Cube in $Database.Cubes ) {
        #Loop the measure groups
        foreach( $MeasureGroup in $Cube.MeasureGroups ) {
            #Only query processed measure groups
            if( $MeasureGroup.State -eq "Processed" ) {
                #Create adapter and command
                $Adapter = New-Object( 'Microsoft.AnalysisServices.AdomdClient.AdomdDataAdapter' );
                $Cmd = New-Object( 'Microsoft.AnalysisServices.AdomdClient.AdomdCommand' );
                $Cmd.Connection = $Con;
                $Cmd.CommandText = "call assp.DiscoverPartitionSlices('{0}', '{1}')" -f $Cube.Name, $MeasureGroup.Name;
                $Adapter.SelectCommand = $Cmd;
                $DataTable = New-Object( 'System.Data.DataTable' );
                #Fill data table (kill any output, can be verbose)
                $Adapter.Fill( $DataTable ) | Out-Null;
                #Loop the rows of the result
                foreach( $Row in $DataTable.Rows ) {
                    $line = "";
                    #Loop each "cell" in the current row
                    foreach( $Cell in $Row.ItemArray ) {
                        $line += "," + $Cell;
                    }
                    #Remove prefixed comma
                    $line = $line.SubString( 1, $line.Length - 1 );
                    #Append line to collection
                    $StringBuilder.AppendLine( $line ) | Out-Null;
                }
            }
        }
    }
    $Con.Close();
}
#Write content of the string collection to a file
[System.IO.File]::WriteAllText( "<INSERT PATH HERE>", $StringBuilder.ToString() );

Data preparation

As in Chris’ blog post, I’d like the data to be shown in my preferred data visualization tool: Excel. There are no range-bar chart types in Excel, but here’s a trick to make it look like one. Once the script has produced the data file, we need to open it with Excel. In here, we can manipulate the output to better suit our needs. Of course, this could have been done programmatically, but that isn’t the point of this exercise, so some manual labour is involved. First we need to convert the comma-separated text lines into columns. Excel has a feature for this on the Data ribbon, where you’ll find a button labeled “Text to Columns”.

Data Ribbon

When the column containing the text lines is highlighted and we press this button, the data will automatically be split into rows and columns. Once this is done, we can load the data into PowerPivot by pressing “Create Linked Table” in the PowerPivot ribbon.

PowerPivot Ribbon

In PowerPivot, we can add a column that calculates the difference between the min count and the max count using the following formula (replace table name accordingly):

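The formula image from the original post is missing here; a minimal sketch of such a calculated column, assuming the ASSP result columns have been named [MinDataId] and [MaxDataId] (the actual column names in your output may differ), could be:

```
= [MaxDataId] - [MinDataId] + 1
```

Add this as a new calculated column in the linked PowerPivot table; it yields the width of each partition's DataId range, which is what the range-bar trick will plot.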

The +1 is to ensure that when min == max, there is actually something to see on the chart, not just a blank (which would indicate something entirely different).
NB! Hence the DataId results in PowerPivot cannot be directly used to track dimension members!

Ready, Set, Browse

Using these results, you can very quickly browse and identify whether there are any anomalies in the distribution that you could take advantage of. Following this guide on how to create range-bar charts in Excel, we are able to display the data in a very illustrative and intuitive way.
I have added some examples from some of the attributes.

DataId Distribution Example


That being said, it is important to remember that because you have no explicit control over what DataIds are assigned, you could end up causing havoc. DataIds are assigned in no particular order, so the range of years {2010, 2011, 2012, 2013} could have the corresponding DataIds {10, 13, 11, 12}.

I have attached the PowerShell script: Partition Slice View Powershell Script

Using PowerShell to create Local Users from AW Customers

As part of my upcoming presentation at SqlSaturday Cambridge, I will probably need a lot of users to demo some aspects of security in Analysis Services. Therefore I have created a PowerShell script, my first actually, to generate a bunch of users. The first source for a lot of users that came to mind was the company AD, but that probably wouldn’t be given the green light if I asked. So what to do? Adventure Works to the rescue!

The Customer dimension in the Adventure Works database has roughly 18,000 entries. This would make a nice pile of users for a demo. So connecting to the database, selecting the distinct customers and adding them as local users can actually be done rather quickly through PowerShell. The run time is around 10 minutes, whether creating or deleting.

The script looks like this:

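The script itself is attached below; the image of it is missing here. As a rough sketch of the approach (not the attached script itself, and assuming a local SQL Server instance with an AdventureWorksDW database, run from an elevated prompt; the connection string, table and password are placeholders), it could look something like this:

```
#Connect to the Adventure Works database and fetch distinct customer names
$SqlCon = New-Object System.Data.SqlClient.SqlConnection( "Server=localhost;Database=AdventureWorksDW2012;Integrated Security=True" );
$SqlCon.Open();
$Cmd = $SqlCon.CreateCommand();
$Cmd.CommandText = "SELECT DISTINCT FirstName, LastName FROM dbo.DimCustomer";
$Reader = $Cmd.ExecuteReader();

#Use the ADSI WinNT provider to create one local user per customer
$Computer = [ADSI]"WinNT://$env:COMPUTERNAME";
while( $Reader.Read() ) {
    $UserName = "{0}{1}" -f $Reader["FirstName"], $Reader["LastName"];
    $User = $Computer.Create( "User", $UserName );
    $User.SetPassword( "P@ssw0rd!" );
    $User.SetInfo();
}
$Reader.Close();
$SqlCon.Close();
```

Deleting the users again is the mirror image: loop the same name list and call $Computer.Delete( "User", $UserName ) instead.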

Just to take a peek at what this script generates, here is a screenshot:


I have attached both a create and a delete script: PS_Create Local Users From AW

As this is my first script, I’d be happy to hear if there is a better way of doing this!?

Same members declared in both Sub Select and Where clause of same MDX

What to make of it?

Currently I am conducting a review of all existing solutions. These solutions have seen their share of developers over the years, hence there are several “trademarks” to be found. One such trademark threw me off a bit. It seemed as if the same method had been applied to all queries in the solution: for every expression in the WHERE clause, there would be a similar sub select statement. So if there was a slice by [Product].[Category].&[4], there would be a sub select, e.g. ( SELECT [Product].[Category].&[4] ON COLUMNS FROM… )

I hadn’t seen this construct before, so I felt challenged to find out whether it had some purpose or was simply poorly written MDX. I acknowledge that sometimes development is rushed and you leave behind unwanted code. But this looked as if it was written to stay. There was almost no complexity to the MDX, other than the odd construct… An example of a statement, converted into Adventure Works lingo, would look something like this:

SELECT
    {[Measures].[Internet Sales Amount]} ON COLUMNS
    , NONEMPTY( {[Promotion].[Promotion].[Promotion]} ) ON ROWS
FROM
    ( SELECT [Scenario].[Scenario].&[1] ON COLUMNS FROM (
        SELECT [Sales Territory].[Sales Territory].[Group].&[Europe] ON COLUMNS FROM (
            SELECT [Product].[Category].&[4] ON COLUMNS FROM
                [Adventure Works] ) ) )
WHERE
    ( [Scenario].[Scenario].&[1]
    , [Sales Territory].[Sales Territory].[Group].&[Europe]
    , [Product].[Category].&[4] )

I looked up sub cubes in Analysis Services 2008 Unleashed, a resource I would highly recommend if you don’t already know of it. In this book, a sub cube is described as an arbitrarily shaped, discrete and limited sub space of the cube. This is illustrated by the following query (Query 1) creating the following sub space (Figure 1, grey area):

Sub Cube Figure 3
Query 1
Sub Cube Figure 2
Figure 1

One of the important things to notice about sub cubes is that the default member can change: if the “normal” default member is not part of the sub space, the first member of the sub space becomes the default member.
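To make the default-member behaviour concrete, here is a small sketch against Adventure Works (the member keys &[1] and &[3] are assumptions for illustration) using CREATE SUBCUBE, which restricts the cube space for the rest of the session:

```
-- Limit the session's cube space to two product categories
CREATE SUBCUBE [Adventure Works] AS
    SELECT { [Product].[Category].&[1], [Product].[Category].&[3] } ON COLUMNS
    FROM [Adventure Works];

-- [All Products] lies outside the sub space, so the default member of
-- [Product].[Category] is now the first member of the sub space, and
-- this query no longer returns the grand total
SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works];

-- Restore the full cube space
DROP SUBCUBE [Adventure Works];
```

Note this is the CREATE SUBCUBE flavour of sub cubes; a sub select in a query's FROM clause behaves slightly differently, which is part of why the construct above is worth questioning.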

I was reconfirmed in most of my knowledge on sub cubes when I dug into the documentation on the subject, and nothing would lead me to believe that the construct was anything other than a misunderstanding. So I tweeted a question using the hashtag #ssashelp, and Brent Ozar (blog|twitter), Chris Webb (blog|twitter) and Robert L Davis (blog|twitter) replied, after twisting some arms 🙂 This led to this blog post, describing the issue in just a little more than 140 characters… So let’s see what replies are in store! To my best knowledge, the first query in this blog post is more confusing than brilliant.

SQL Server Quick Tip 26 – 30