Summarizing my #Summit15

Finally I am able to coherently write down my takeaways from this year's PASS Summit. I came a long way and endured all kinds of dangers on the way to Seattle. Not sure if the person in the seat before me had been shot, but there was a distinct hole in the window. WTF, Lufthansa!:

Lufthansa Hole in the Window

Gun-crazed Americans? Or was it ze Germans?

As always, at least for me, the schedule changes as soon as the summit begins. This happened this time around as well. I had planned on seeing a bunch of sessions on various topics, but as game day came along, my choices changed. On Day 1, only Dimensional Modeling Design Patterns: Beyond the Basics by Jason Horner (b|t) came out strong. Jason is an awesome presenter, so if you ever get the chance, go for it! I knew most of the stuff in advance, but sometimes it's nice to have reassurance that what you think is the proper road ahead actually is. Jason's session gave me just that.

SQL Server on Azure Virtual Machines: Features, Best Practices & Roadmap by Luis Vargas was a bit disappointing, as there were almost no best practices tied into the presentation. Luis ran out of time and had to race through the final 4-5 slides of the deck, leaving no time to go into detail.

Finally, wrapping up Day 1 was Cortana Analytics Deep Dive: Big Data Stores by Rajesh Dadhia and Matt Usher, which started out as a very solid presentation. But. The so-called demonstration of importing Azure Data Lake data into Azure SQL Data Warehouse through Azure Data Lake Analytics was not a demo, at least not in my opinion. Presenting a bunch of screen dumps (with errors occurring) is not a demo.
As soon as the last session ended, 40+ Danish attendees went to dine at the Pike Brewing Company. Jørgen Guldmann (b|l) captured the night out on his blog. Good times.
Day 2 ended up being all about my Lightning Talk, which I think went well. No tomatoes, no rotten eggs, so I guess it wasn't that bad :)
Day 3 began with some U-SQL Query Execution and Performance Tuning by Ed Triou, which didn't end too well, as the speaker decided to go 13 minutes over time, leaving only two (2) minutes for Jason Thomas (b|l|t) to get ready for Memoirs of Building a 150 GB (and Growing) SSAS Tabular Model. That session was surprisingly more about hardware than about architecture, which is what I had somehow set my mind on. Nevertheless, Jason delivered, as usual, a cool, calm and collected session. Good job!

 

Right after, in the same room, came Bill Anton with Analysis Services: Show Me Where It Hurts. If you work with Analysis Services and do not yet have logging and performance counter collection in place, have a look at the session description, the downloads and Bill's blog; that's how you get up and running quickly.
I spent the afternoon attending Mike Diehl's Agile Analytics: Making Progress Visible, which, again, was one of those sessions that provided the reassurance of being on the right track. No groundbreaking news, but that's OK on many levels. Mike did a great job presenting, and I think his message is very important to the community. Work smarter, not harder.
I don't really know how to describe the last session of the day: How to Build a Virtual Test Lab for SQL Server by Ed Leighton-Dick and David Klee. The session began really strong (read: funny) with this video, which I think is both hilarious and scary at the same time.

But after that, it became a showcase of who was the biggest hardware geek at home. I realize that being a geek is, by now, socially accepted. But I honestly thought we (yes, I just labeled myself a geek there) had moved on from "I'll show you how much of a geek I am by displaying the size of my soldering station". Sigh.
I was hoping to get some insight, from undisputed authorities in the field, into how to set up your virtual dev/test environment. Instead we ended up hearing too much about how much old spare hardware would fetch on eBay. Again, I don't know what to say…

This year in Seattle, I had time to visit the Space Needle, where I played around making this panoramic photo:
Seattle Skyline

Oh yeah, final pro tip: when dining at Local 360, ask for the price before ordering. These two babies cost me $50, and I was expecting half that…

Expensive Short Ribs


PASS Summit 2015 – Personal Schedule

Only 20 hours until my flight takes off, CPH -> FRA -> SEA. But who's counting? :)
This year is no different from any other year at PASS Summit (saying that as a 2nd-year veteran). The time spent building your schedule is probably the most frustrating part leading up to the conference. This goes for SQL Bits as well, another well-planned SQL Server event.

This year I opted out of the pre-cons for various reasons, mainly because I was very disappointed last year in terms of cost vs. output. Although I must say a couple of them looked really interesting, especially The Complete Primer to SQL Server Virtualization by David Klee, Argenis Fernandez and Jimmy May; Optimize “All Data” with a Modern Data Warehouse Solution by Bradley Ball and Josh Luedeman; and not least The Enterprise Scripting Workshop by Sean McCown and Jennifer McCown. But having three kids aged eight, six and two makes my trip to Seattle short, this year at least. I had also planned on attending the comeback of Big Ben in Pittsburgh, hosting Cincinnati, but hopefully they can manage without me (fingers crossed).

So, in short, my sessions of interest will be:

Wednesday:

Thursday:

Friday:

As with any schedule, this is subject to change. I certainly always find myself making "game time" decisions influenced by whomever I am talking to between sessions. Often there is some aspect that others point out which influences the final call. Sometimes I even "miss" sessions just to socialize and extend my network. There is always someone interesting to talk to, no matter where you turn at PASS Summit.

Hope to see you there!


Using PowerShell to Discover Temp/Log/Backup and Data Directories of SSAS

Objective: Locate all folders related to SQL Server Analysis Services (SSAS) on a given list of servers.

During a massive cleanup of some of our environments, I was asked to point out all the possible locations of SSAS-related files on a list of 10+ servers. The main reason this isn't a trivial task you can do blindfolded is that, at some point in time, someone decided to change the default value in the SSAS configuration. This change led to new cubes being deployed to a different location than the existing ones. Also contributing to the mess was a dubious naming standard, which makes guessing the folder names difficult.

My immediate strategy was to check if I could use Analysis Management Objects (AMO) to get the current values from the configuration of each instance. I happen to be very fond of the AMO API, so this was my first instinct. Unfortunately there were several drawbacks and shortcomings to this approach, as only the Data directory is exposed, and I would have no way of telling the historical values of each property (TempDir, LogDir, BackupDir and DataDir).
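For reference, a minimal sketch of the AMO route could look like this, assuming the Microsoft.AnalysisServices assembly is available on the machine running it and using a placeholder instance name:

# Minimal AMO sketch: read the current DataDir of one SSAS instance.
# "MyServer\MyInstance" is a placeholder; replace with a real instance name.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$ssas = New-Object Microsoft.AnalysisServices.Server
$ssas.Connect("MyServer\MyInstance")

# Only the current value is available; there is no history of earlier settings.
$dataDir = ($ssas.ServerProperties | Where-Object { $_.Name -eq "DataDir" }).Value
Write-Output "DataDir on $($ssas.Name): $dataDir"

$ssas.Disconnect()

That gets you the current value of a single property on a single instance, but tells you nothing about where files from earlier configurations might still be sitting.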

Asking on Twitter (using the hashtag #ssashelp) whether there was some clever way of getting this information returned only a few responses, from Bill Anton (b|l|t) and Chris Webb (b|l|t), but nothing directly solving the issue at hand. Bill shared this link on looking up Instance Configuration via PowerShell, and Chris shared this one on Using MDX to browse the File System.

I think the API should at least expose the current values of all possible configuration properties on an SSAS instance, and I have filed a Microsoft Connect ticket for this. But since that is not currently available, the only option was to script it myself.
The script produces a CSV file containing a list of servers with the corresponding paths to all folders containing files of the specified extensions. In this example I have chosen to go for .abf (Analysis Services backup file), .dstore (compressed store, a decoding table for SSAS) and .trc (SSAS trace file). The script looks as follows:
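The original script is not reproduced here, but a minimal sketch of the approach, assuming the administrative shares (\\server\c$ and so on) are reachable and that $Servers and $Path are adjusted to your environment, could look like this:

# Sketch: find folders holding SSAS-related files (.abf, .dstore, .trc) on a list of servers.
# Adjust $Servers and $Path before running; assumes the admin shares (\\server\drive$) are reachable.
$Servers    = @("SERVER01", "SERVER02")          # servers to scan
$Path       = "C:\Temp\SsasFolders.csv"          # output CSV
$Extensions = "*.abf", "*.dstore", "*.trc"       # SSAS backup, decoding store and trace files

$result = foreach ($server in $Servers) {
    # Enumerate all local, fixed logical drives on the remote server (DriveType 3).
    $drives = Get-WmiObject -Class Win32_LogicalDisk -ComputerName $server -Filter "DriveType = 3"

    foreach ($drive in $drives) {
        # Translate e.g. D: into the corresponding admin share \\SERVER01\d$
        $root = "\\$server\$($drive.DeviceID.Replace(':', '$'))"

        Get-ChildItem -Path $root -Recurse -Include $Extensions -ErrorAction SilentlyContinue |
            Select-Object -ExpandProperty DirectoryName -Unique |
            ForEach-Object {
                [PSCustomObject]@{ Server = $server; Folder = $_ }
            }
    }
}

# One row per server/folder combination.
$result | Sort-Object Server, Folder -Unique | Export-Csv -Path $Path -NoTypeInformation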

 

Obviously this script comes with the normal DON’T JUST COPY/PASTE INTO PRODUCTION disclaimer. Please read through the script and adapt it for your own use. $Path and $Servers should be set accordingly.

Lastly, you need to be aware of security. Your account may not be granted the proper permissions to remotely crawl the files and folders, so please make sure this is addressed in advance.

Note also that this script scans all logical drives on all listed servers, which can take some time.


[Issue] – SQL Server Installer

Helping out a colleague, I was asked to install features from SQL Server 2008 R2 on an existing box. That sounded like something I could do with both hands tied and a blindfold on. But boy, was I in for a surprise.

The server, which already had some features (mainly SSAS) from SQL Server 2012 SP2 installed, was to run SSIS packages from a SQL Server 2008 R2 instance. Don't ask why, just accept it :)

Initially I downloaded an ISO file from MSDN containing SQL Server 2008 R2. By the way, if you ever need to mount an ISO and the feature is not built into the OS, as in my case with Windows Server 2008, visit SlySoft and download Virtual CloneDrive (free).
Running the installation from that media resulted in an unusual error, one I hadn't seen before, and to be honest, I don't get why this would ever be a problem. The error message tells you that a cached file is missing and that you have to copy the original, renaming it in the process, to C:\Windows\Installer (which is a hidden folder). Let's repeat that: a cached file is missing… WTF!

SQL Server 2012 Install Error

Note that the cached file in question is related to the SQL Server 2012 installation, but this was SQL Server 2008 R2…

Following the link in the error message, I was directed to a page that describes this error in detail for a good number of SQL Server versions (2005, 2008, 2008 R2 and 2012). The KB provides a script that lets you identify which files are actually missing. The output format of this script is questionable; some would say chatty. So to get to the details that actually matter, a bit of regular expression comes in handy. This may well be one of the simplest regular expressions I have ever written, and I think everyone can understand it without further introduction:
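The expression itself is not reproduced here, but based on the description it amounts to matching lines that start with "Copy ". As a sketch in PowerShell, assuming the KB script's output has been saved to a text file (the file name is a placeholder):

# Sketch: keep only the lines of the KB script output that start with "Copy ".
# The input file name below is a placeholder.
Get-Content -Path "C:\Temp\KbScriptOutput.txt" |
    Select-String -Pattern '^Copy .*' |
    ForEach-Object { $_.Line }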

This little beauty will pull out all the rows that begin with "Copy " followed by any number of characters. The result can be pasted directly into a .txt file, which can then be renamed to have .bat as its extension. Please note that you may need to do additional cleansing of the paths in the file before running it, and the install media needs to be mounted on the box in question. One final note: run the command prompt in elevated mode and then kick off the batch job. This will copy the missing files to the Windows\Installer folder.

Going through the output of the batch job, I could verify that all files had been copied successfully. So you can imagine my surprise when I ran the installer once again and received the error message below:

SQL Server Error

Ohhhkay… As the attentive reader you are, you will by now have noticed the reference to the same KB as above.

Alrighty Then

I had to find the file on the SQL Server install media, copy it to my desktop, rename it and then move that file into the Windows\Installer directory.
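In PowerShell terms, that manual step boils down to something like the following, where both the source path and the target file name are placeholders rather than the actual values from the error message:

# Sketch: copy the package from the mounted install media into the Installer cache,
# giving it the cached file name the error message asks for. Paths are placeholders.
Copy-Item -Path "D:\x64\Setup\SomeFeature.msi" -Destination "C:\Windows\Installer\abc123de.msi"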

Back in installation mode, I kicked off the installation once more, selecting all the required features (nothing is selected by default) and got the install going, only to be met by a new error message:

SQL Server Error

If you see keyboards flying, you know why…

And yes, computer screens used to look like that 😉


[Issue] – The Mystery of the Hidden Column

Yesterday I was trying to set up a new Power BI dashboard for our SQL Server Analysis Services logging data.
We are running traces on most of our servers in order to keep up with what is going on, mainly because all other access to the boxes is restricted.

Now, with the data in a SQL Server database, the connection to Power BI Desktop is of course supported out of the box; no issues here whatsoever, one should think. I did, however, come across a particular view on top of a table that just didn't compute in the Power BI Desktop Designer. As depicted below, it was simply not possible to load or edit the view in question. Microsoft is looking into this.

View Error

I managed to get around the view issue by persisting the data in another table, consolidating and filtering it so that only the relevant data ends up there.

Loading the data from the table into the Power BI Desktop Designer is easy: you just point to the server, pick the table, and you are presented with two options, load or edit. In this case I opted to load the data. I started playing around with it and quickly discovered that I was missing some attributes in the data. For one thing, I needed to break my date down into day, month and year (I would have loved some basic week functionality as well, but maybe later, eh?).
This functionality is actually built into the tool, so you can generate those extra columns by selecting "Add Column" in the ribbon, selecting the desired date column (if you have more than one) and choosing one of the many options in the date conversion picker:

Month Date Selector

– apologies for the screenshot in Danish.

The above selection will yield the column you see in the same screenshot, named [Month No], and as you can see, it returns a number. This is not very end-user friendly, so a little conversion is in order.
The normal Excel IF(test, success, fail) does not work in the Power BI Desktop Designer; we need to write it a little differently. This is how I convert the month number into text:
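The exact expression is only visible in the screenshot below, but in the Power Query formula language it looks roughly like this, assuming the generated column is named [Month No]:

if [Month No] = 1 then "January"
else if [Month No] = 2 then "February"
else if [Month No] = 3 then "March"
else if [Month No] = 4 then "April"
else if [Month No] = 5 then "May"
else if [Month No] = 6 then "June"
else if [Month No] = 7 then "July"
else if [Month No] = 8 then "August"
else if [Month No] = 9 then "September"
else if [Month No] = 10 then "October"
else if [Month No] = 11 then "November"
else "December"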

as a custom column:

User Defined Column Month

Please note the little green check mark indicating everything's Dandy (b|l|t)… no, not that Dandy…

But to my great surprise, the column does not show up in the data source view (or whatever they call it in the Power BI Desktop Designer). Look for [Month Name], as I renamed the column from [Month] in a later step.

Data Source View
WTH is my column?
