T-SQL Tuesday #85 – SSAS (MD) Backup Galore

This month's T-SQL Tuesday is hosted by Kenneth Fisher (b|l|t) and the invitation can be found by following this link.

T-SQL Tuesday was started by Adam Machanic (b|t), and this is the SQL Server community’s monthly blog party, where everyone is invited to write about a single common topic. This time, the topic is Backup (and Recovery).

Admittedly, I was a bit challenged this past week, as I decided to update my WordPress installation to the newest version, 4.7. This, however, caused the admin module of my site to become unavailable. After creating a support ticket with Unoeuro.com I was guided to a blog post by Vashistha Pathak (b|t), which in the end led me to the solution, and I fixed it without the need of a backup. Phew… It was kind of nerve-wracking not knowing if my site was backed up – fortunately I wouldn't have lost a single character, according to Unoeuro support. That was quite reassuring, and somehow coincidentally very fitting for this month's topic, no?

My approach for this blog post is to accept Kenneth's challenge from this tweet, where Kenneth triggered a thought stream using the word 'keys':

I immediately came to think of the SSRS encryption key backup, and I knew by then that I had to try to cover the basics for all the BI components: SSAS, SSIS and SSRS. But honestly, I haven't been able to set aside enough time to cover them all. So without further ado, here is SSAS Backup Galore (the others will come in good time):

SSAS (MD)

For Analysis Services databases there are two (2) fairly straightforward options. However, you may want to discuss up front how much effort you want to spend setting this up. Consider that you already have a process for deploying updated cubes into production; in which scenarios would it be acceptable for the business to wait for a normal deployment to bring the numbers back online?
In Brent Ozar's (b|l|t) First Responder Kit there is a document where IT and the business jointly state how quickly they would like to recover from different categories of "down time", e.g. Corruption, "Oops" Queries and so on. That is still a good exercise to do, with both sides at the table, even though the document operates at a higher level.

Prerequisites
  • You must have administrative permissions on the Analysis Services instance or Full Control (Administrator) permissions on the database you are backing up.
  • Restore location must be an Analysis Services instance that is the same version, or a newer version, as the instance from which the backup was taken.
  • Restore location must be the same server type.
  • SSAS does not provide options for differential or partial backups of any kind, nor log backups (should you be wandering in from the DBA world).
  • An SSAS backup contains only one (1) database, so there is no option to create backup sets.
  • SSAS can only have one fixed backup location at a time; it's configurable, but not meant to be changed for each database.
GUI

We begin by connecting to the server where the database resides. Via SQL Server Management Studio (SSMS) you then expand the Databases node.
In my example here, I will be using a database named 'Test Database'.

When right-clicking on the selected database, we get the option to Back Up.

This kicks off a wizard where we need to select a few options for our backup.
First of all, we can edit or designate the name of the backup file. One thing to note with regard to backups on SSAS is that only one designated backup folder can be active at a time. In the server options dialog it's possible to change the destination for backups, but only after adding the new folder location to the AllowedBrowsingFolders property, which is considered an advanced setting, so remember to tick the option to show advanced properties in order to set the new location. Also, you need to do this by opening the dialog twice: first, add the new folder to AllowedBrowsingFolders, then close and reopen; second, change BackupDir to the newly added folder.
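If you prefer scripting over clicking, the same change can be made through AMO from PowerShell. Below is a minimal sketch, assuming an instance named .\SQL2016DEV and a new backup folder D:\SSASBackup (both placeholders for illustration):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL
$Server = New-Object Microsoft.AnalysisServices.Server
$Server.Connect(".\SQL2016DEV")

# 1st: add the new folder to AllowedBrowsingFolders (a pipe-delimited list)
$Allowed = $Server.ServerProperties["AllowedBrowsingFolders"]
$Allowed.Value = $Allowed.Value + "|D:\SSASBackup"
$Server.Update()

# 2nd: point BackupDir at the newly allowed folder
$Server.ServerProperties["BackupDir"].Value = "D:\SSASBackup"
$Server.Update()
$Server.Disconnect()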

My settings are just the standard ones; yours should differ in the sense that you should put your Data, Backup and Log folders on different disks for optimal performance.

Well, this is why you only get to type a name for the file in the next dialog, and not the complete path and file name. You also get to select whether the backup may overwrite an existing file of the same name; tick the option if you want to allow this.

The two other options I will cover in this post are Apply Compression and Encrypt backup file.

I have yet to meet a setup where applying compression was not an option. Obviously this has a penalty cost on CPU while executing the backup, and it will affect the rest of the tasks running on the server (even if you have your data and backup directories on different drives). But in my experience, the impact is negligible.

This may not be the case with the encryption option, as it has a much larger footprint on the server. You should use it with some caution in production; test on smaller subsets of the data if in doubt.
Another thing to keep in mind, as always when dealing with encryption: do remember the password. There is no way of retrieving the data other than with the proper password.

In my example I allow file overwrite and apply compression.

When we click the OK button, the backup will execute, and you will be able to track the progress in the lower left corner of the dialog. The dialog closes after successful completion. You have the option to stop the backup from the progress dialog by clicking Stop action now.

After completion, the file should be sitting on disk where you designated it.

XMLA

The same options mentioned above in the GUI section are available when doing your backups via XMLA.

<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Test Database</DatabaseID>
  </Object>
  <File>Test Database.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
  <Password>9sse6</Password>
</Backup>

Connect to the server, start an XMLA query and fire away.
To leave out the password, just leave out the entire Password tag. Compression works the other way around: backups are compressed by default, so you have to explicitly specify <ApplyCompression>false</ApplyCompression> in order to not have your backup compressed. In the above example, the backup is compressed.
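If you would rather fire the XMLA from PowerShell than from SSMS, the sqlps module ships with the Invoke-ASCmd cmdlet. A sketch, assuming an instance named .\SQL2016DEV and leaving out the Password tag (so this backup is compressed but not encrypted):

Import-Module sqlps -DisableNameChecking

# The XMLA command wrapped in a here-string
$XMLA = @"
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Test Database</DatabaseID>
  </Object>
  <File>Test Database.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>
"@

Invoke-ASCmd -Server ".\SQL2016DEV" -Query $XMLA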

Often I have seen this used within a SQL Server Agent job, which supports SSAS commands just like the one above. Look for the SQL Server Analysis Services Command step type.

Obviously, setting this up in a SQL Agent job is only half the job. After the backup has been run and/or scheduled, you need to ensure that the backup files produced are stored in a secure and available location.

PowerShell

Via the PowerShell cmdlet Backup-ASDatabase you can execute the same backup as described above. In fact, this is a perfect example of why you should learn PowerShell, if you're not already on that bandwagon. Simply put, it doesn't get easier than this. In my example, I will create a backup of all the databases on a server in a couple of lines of code – behold PowerShell AwesomeSauce (PSAS):

# Load the SQL Server PowerShell module, which includes the SQLAS provider
Import-Module sqlps -DisableNameChecking

# Navigate to the Analysis Services databases folder on the instance
cd SQLSERVER:\SQLAS\<server>\<namedinstance>\databases

# Back up every database on the instance
dir | Backup-ASDatabase

Remember to substitute your own values for <server> and, if present, <namedinstance>.
This will back up all databases on the given server.

Obviously you can extend this script with all the parameters exposed by the cmdlet. For more details, see this page.
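As an illustration, a single database could be backed up with explicit parameters like this – a sketch reusing the 'Test Database' example and the .\SQL2016DEV instance from earlier:

# Compressed backup of one database, overwriting any existing file of the same name
Backup-ASDatabase -Server ".\SQL2016DEV" -Name "Test Database" `
    -BackupFile "Test Database.abf" -AllowOverwrite -ApplyCompression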

Analysis Services Management Objects

The final method I will be covering is Analysis Services Management Objects, or AMO for short. AMO exposes, like PowerShell, a great number of methods, of which backup is one.
In order to use AMO to back up an SSAS database, we need to create or load an instance of said database through the object model in AMO. This can be done fairly easily.

The AMO classes and functionality come with Microsoft.AnalysisServices.dll, which is installed with the product. In order to utilize it, we need to load it first, or reference it if you are doing a proper .Net project in SQL Server Data Tools or Visual Studio.

One thing to note before we go to the example is that the Backup method on the Database object has several overloads. Depending on your parameters, you can call different implementations of the same method. According to TechNet, these are your options:

  • Backup(String) – Backup operations for a Database with a file parameter.
  • Backup(BackupInfo) – Backup operations for a Database with a BackupInfo parameter.
  • Backup(String, Boolean) – Backup operations for a Database with file and AllowOverWrite parameters.
  • Backup(String, Boolean, Boolean) – Backup operations for a Database with file, AllowOverWrite, and BackupRemotePartitions parameters.
  • Backup(String, Boolean, Boolean, BackupLocation[]) – Backup operations for a Database with file, AllowOverWrite, BackupRemotePartitions, and locations parameters.
  • Backup(String, Boolean, Boolean, BackupLocation[], Boolean) – Backup operations for a Database with five parameters.
  • Backup(String, Boolean, Boolean, BackupLocation[], Boolean, String) – Backup operations for a Database with six parameters.

I will show how to invoke the AMO functionality from <drum roll please…> PowerShell.

# Load the AMO assembly and connect to the instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL
$Server = New-Object Microsoft.AnalysisServices.Server
$Server.Connect(".\SQL2016DEV")

# Describe the backup: overwrite allowed, compressed and password protected
$BackupInfo = New-Object Microsoft.AnalysisServices.BackupInfo
$BackupInfo.AllowOverwrite = $true
$BackupInfo.ApplyCompression = $true
$BackupInfo.Password = '9sse6'

foreach( $Database in $Server.Databases ) {
	# Name each backup file after its database, so the files don't overwrite each other
	$BackupInfo.File = $Database.Name + '.abf'
	$Database.Backup( $BackupInfo )
}

You should note that only the overload taking a BackupInfo object as parameter allows you to specify a password.
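If you don't need a password, one of the simpler overloads will do. A small sketch, reusing the $Server connection from above:

# File name plus AllowOverwrite – the Backup(String, Boolean) overload
$Database = $Server.Databases.FindByName("Test Database")
$Database.Backup("Test Database.abf", $true)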

Final thoughts

I think I have covered all possible ways of executing an Analysis Services backup – if I somehow missed an option, please let me know!

But as you can see, ways to back up your SSAS database are abundant and very flexible.

As with any backup, you should restore and test them on a regular basis in a separate environment, to ensure that you can actually recover when needed.
Depending on the data update frequency, this can however be a rather complex setup.
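The restore side can be scripted too, via the Restore-ASDatabase cmdlet. A minimal sketch, assuming a hypothetical test instance named .\SQL2016TEST:

Import-Module sqlps -DisableNameChecking

# Restore the backup file onto the test instance, overwriting any existing database
Restore-ASDatabase -Server ".\SQL2016TEST" -RestoreFile "Test Database.abf" `
    -Name "Test Database" -AllowOverwrite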


#TSQL2SDAY #77 – Favorite SQL Server Feature – RoundUp

This week I have been looking forward to reading through all the contributions to my #TSQL2SDAY invitation: Favorite SQL Server Feature. I am very happy to see this many. I have added a short description of each blog post, as well as my own personal key take-away(s).

So, in no particular order, I give you the round-up:

Rob Farley (b|l|t)

Rob writes about SQL Server being backwards compatible and why that is cool. Rob also encourages you to upgrade if you are stuck on older versions – it's what Microsoft recommends as well.
I also like the fact that with Azure SQL Databases, you don't need to make the decision – it's taken care of for you!
Key Take-Away: “…we now recommend ongoing, proactive installation of CU’s as they become available.”

Reasons To Upgrade SQL Server

Ben Miller (b)

Ben makes a great point in introducing Server Management Objects (SMO) – I like the fact that the post is kept as simple as possible, but still explains the whys and hows. If you are not aware of SMO, please do get going, and Ben's place is a good start!
Key Take-Away: I know SMO already, so no gain here…

T-SQL Tuesday #77- SMO is my favorite feature

John Morisi (b)

John must have balls the size of Texas. His favorite SQL Server feature is Replication! No, kidding aside. John offers a quick intro into what options there are with Replication and makes an effort to debunk some of the bad reputation Replication has. Do you wanna know more?
Key Take-Away: “…Replicate data to any ODBC or OLE DB accessible database including Oracle, AKA heterogeneous replication.”

Favorite SQL Server Feature – Replication

James Anderson (b|l|t)

James has written about partition switching. This feature is as cool as it sounds – and what's even more cool, it's quick! I have used this technique in my ETL at times. Normally data can be moved quickly enough, but when you need it done ultra fast, this is definitely an option worth investigating.
James also provides a little insight into new features in the upcoming release of SQL Server 2016.
Key Take-Away: “The good news is that the SWITCH command works on regular tables and in any edition.”

Partition Switching #TSQL2SDAY #77

Shane O’Neill (b|t)

It seems this is Shane's first blog post ever, so kudos for making his entrance on the #TSQL2SDAY stage! Shane's first adventure in blogging hits it off with an excellent run-down of Common Table Expressions, and a recursive one at that!
Key Take-Away: “…you are SELECTing from the CTE while still defining the CTE!!”

T-SQL Tuesday #77: Recursive Common Table Expression (CTE)

Andy Mallon (b|t)

Andy’s blog post is a challenge to the topic. A challenge which I think is utterly justified. Andy advocates to take a step back. Look at the bigger picture. Take a moment of thought, before you start hammering away with your new found hammer.
Key Take-Away: “…configuring a multi-subnet WSFC, with an AG that spans two FCIs, then take backups from the readable secondary to use for cross-domain log shipping to a third data center.” (who doesn’t eat that for Breakfast?)

T-SQL Tuesday #77 – My Favorite SQL Server Feature

Andy Yun (b|l|t)

Andy offers up Dynamic Management Views/Dynamic Management Functions (DMVs/DMFs) as his favorite SQL Server feature. The blog post is a quick introduction to DMVs/DMFs and makes a great starting point for anyone who hasn't yet been using them. Andy also gives a couple of pointers to what DMVs/DMFs have to offer. The information is practically endless…
Key Take-Away: “…DMVs/DMFs are the gateway into SQL Server!”

T-SQL Tuesday #77: My Favorite SQL Server Feature

Jason Brimhall (b|l|t)

Jason has written an awesome post about the almost unmentionable feature. In the end, however, he caves in and spills the whole deal. Perhaps unintentionally, there are several take-aways from Jason's blog post – both the feature (which I will not dare utter) and Synonyms. If nothing else, Jason demonstrates an example to follow, in the sense that the customer gets a perfectly working solution at minimum cost (even though an upgrade would have been way more cool to work with etc.).
Key Take-Away: “Using a synonym, I can extend this database beyond the 10GB limitation…”

Awesome SQL Server Feature

Kennie Nybo Pontoppidan (b|l|t)

Kennie offers another shot at CTEs; this one also recursive.
Key Take-Away: “…we continually join the data set to the result set as it exists for each 'loop'”

T-SQL Tuesday #76 – Parsing code hierarchies with CTEs

Edwin M Sarmiento (b|l|t)

Via a Dell laptop, Edwin steers us through a good number of uses of SQL Server logging. To anyone who thinks logging is just logging: have a look at the uses in Edwin's blog post.
Key Take-Away: “How To Forecast Database Disk Capacity If You Don’t Have A Monitoring Tool”

TSQL Tuesday #77 – Favorite SQL Server Feature: Logging

John Sterrett (b|l|t)

John talks about Transactional Replication for Azure SQL Databases and offers three main reasons why this is an über cool feature. I have to agree that this is a really cool feature, and John does a good job explaining why. I am looking forward to the follow-up post containing the How To.
Key Take-Away: “The reporting subscriber database can be scaled up or down as needed based on your needs…”

Replication for Azure Databases – #TSQL2SDAY #77

Derik Hammer (b|t)

With the End of Support for SQL Server 2005, Derik finds it appropriate to pay his respects by picking Database Mirroring as his favorite feature. The feature will be replaced by Basic Availability Groups in SQL Server 2016.
Key Take-Away: “…mirroring fails over even faster than Availability Groups.”

Favorite SQL Server Feature – #TSQL2SDAY #77

Kenneth Fisher (b|l|t)

In this post Kenneth walks us through the different applications of Delayed Durability in SQL Server. Advice on the Simple and Full recovery models as well as HA scenarios is covered. Delayed Durability is a new feature in SQL Server 2014 and applies to Azure SQL Database as well as the upcoming SQL Server 2016 release.
Key Take-Aways: “The fact that this should be utilized for massive Data Warehouse loads, where durability is not of concern.”

The Risk of Delayed Durability

I was really impressed by the hard work put into this round of #TSQL2SDAY, so thank you, to all the contributors. Great job all!


Invitation to #TSQL2SDAY #77 – Favorite SQL Server Feature

The Why

If you're not familiar, T-SQL Tuesday is a blogging party hosted by a different person each month. It's a creation of Adam Machanic (b|l|t), and it's been going on for ages now! Basically, the host selects a topic, defines the rules (those are almost always the same), and then everyone else blogs about it. Once the deadline is reached, the host summarizes each of the submitted posts on their site/blog.

The What

This month I get to pick the topic, and I am going to go with:

My favorite SQL Server Feature

This can be anything from Reporting Services as a report creation tool down to Columnstore Indexes. Anything goes!
I selected this topic precisely to illustrate the breadth and depth of what SQL Server has evolved into over the last decade+. It started out as an RDBMS (w/ DTS), added Analysis Services, then Reporting Services, followed by a myriad of auxiliary services and tools. So please, share your story about a specific SQL Server feature close to your heart – precisely what you think makes SQL Server such an awesome tool.

The How

There’s only a few rules for T-SQL Tuesday:

  • Your post must be published between 00:00 GMT Tuesday April 12th 2016 and 00:00 GMT Wednesday April 13th 2016.
  • Your post must contain the T-SQL Tuesday logo (see above) at the top and the image must link back to this blog post.
  • Trackbacks should work, but if they don’t, please put a link to your post in the comments section so I (and everyone else) can see your contribution!
  • Tweet about your post using the #tsql2sday hashtag

There are also a few optional things you can do that might help:

  • Include “T-SQL Tuesday #77” in your blog post’s title.
  • Contact Adam Machanic (b|l|t) and tell him you’d like to host a T-SQL Tuesday from your blog


Summarizing my #Summit15

Finally, I am able to coherently write down my take-aways from this year's PASS Summit. I came a long way, and endured all kinds of dangers on the way to Seattle. Not sure if the person before me in the flight seat had been shot, but there was a distinct hole in the window. WTF, Lufthansa!:

Lufthansa Hole in the Window
Gun crazed Americans? Or was it ze Germans?

As always, at least for me, the schedule changes as soon as the summit begins. This happened this time around as well. I had planned on seeing a bunch of sessions on various topics, but as game day came along, my choices changed. On Day 1, only Dimensional Modeling Design Patterns: Beyond the Basics by Jason Horner (b|t) came out strong. Jason is an awesome presenter, so if you ever get the chance, go for it! I knew most of the stuff in advance, but sometimes it's nice to have reassurance of what you think is the proper road ahead. Jason's session gave me just that. SQL Server on Azure Virtual Machines: Features, Best Practices & Roadmap by Luis Vargas was a bit disappointing, as there were almost no best practices tied into the presentation. Luis ran out of time, and had to race through the final 4-5 slides of the deck, leaving no time to go into detail. Finally wrapping up day 1 was Cortana Analytics Deep Dive: Big Data Stores by Rajesh Dadhia and Matt Usher, which started out as a very solid presentation. But. The so-called demonstration of the Azure Data Lake import into Azure SQL Data Warehouse through Azure Data Lake Analytics was not a demo, at least not in my opinion. Presenting a bunch of screen dumps (with errors occurring) is not a demo.
As soon as the last session ended, 40+ Danish attendees went to dine at the Pike Brewing Company. Jørgen Guldmann (b|l) captured the night out on his blog. Good times.
Day 2 ended up being all about my Lightning Talk – which I think went well. No tomatoes, no rotten eggs, so I guess it wasn’t that bad 🙂
Day 3 began with some U-SQL Query Execution and Performance Tuning by Ed Triou, which didn't end too well, as the speaker decided to go 13 minutes over time, leaving only two (2) minutes for Jason Thomas (b|l|t) to get ready for Memoirs of Building a 150 GB (and Growing) SSAS Tabular Model. This session was surprisingly more about hardware than architecture, which I somehow had my mind set upon. Nevertheless, Jason delivered, as usual, a cool, calm and collected session. Good job!


Right after, in the same room, came Bill Anton with Analysis Services: Show Me Where It Hurts. If you work with Analysis Services and do not yet have logging and performance counter collection in place, have a look at the session description, downloads and Bill's blog – here's how you get up and running quickly.
I spent the afternoon attending Mike Diehl on Agile Analytics: Making Progress Visible, which, again, was one of those sessions that provided the reassurance of being on the right track. No ground-breaking news, but that's OK on many levels. Mike did a great job presenting, and I think his message is very important to the community. Work smarter, not harder.
I don't really know how to describe the last session of the day: How to Build a Virtual Test Lab for SQL Server by Ed Leighton-Dick and David Klee. The session began really strong (read: funny) with this video (which I think is both hilarious and scary at the same time).

But after that, it became a showcase of who was the most geeky on hardware @ home. I realize that being a geek is, by now, socially accepted. But I honestly thought we (yes, I just labeled myself a geek there) had moved on from the "I'll show you how much of a geek I am by displaying the size of my soldering station" – sigh.
I was hoping to see some insight, from undisputed authorities in the field, into how to set up your _virtual_ dev/test environment. We ended up hearing too much about how much old hardware spare parts would cash in on eBay. Again, I don't know what to say…

This year in Seattle, I had time to visit the Space Needle, where I played around making this panoramic photo:
Seattle Skyline

Oh yeah, final pro tip: When dining at Local 360, ask for the price before ordering – these two babies cost me $50, and I was expecting half that…

Expensive Short Ribs


PASS Summit 2015 – Personal Schedule

Only 20 hrs until my flight takes off from CPH -> FRA -> SEA – but who's counting? 🙂
This year is no different from any year at PASS Summit (saying that as a 2nd year veteran): the time spent building your schedule is probably the most frustrating part leading up to the conference. This goes for SQL Bits as well, another well planned SQL Server event.

This year I opted out of the pre-cons for various reasons, mainly because I was very disappointed last year in terms of cost vs. output. Although I must say a couple of them looked really interesting, especially The Complete Primer to SQL Server Virtualization by David Klee, Argenis Fernandez and Jimmy May; Optimize "All Data" with a Modern Data Warehouse Solution by Bradley Ball and Josh Luedeman; and not least The Enterprise Scripting Workshop by Sean McCown and Jennifer McCown. But the fact that I have three kids aged eight, six and two makes my trip to Seattle short, this year at least. I had also planned on attending the comeback of Big Ben in Pittsburgh hosting Cincinnati, but hopefully they can manage without me (fingers crossed).

So in short, my sessions of interest will be:

Wednesday:

Thursday:

Friday:

As with any schedule, this is subject to change. I certainly always find myself making "game time" decisions influenced by whomever I am talking to in between sessions. Often there is some aspect that others point out that influences the final call. Sometimes I even "miss" sessions, just to socialize and extend my network – there is always someone interesting to talk to, no matter where you turn at PASS Summit.

Hope to see you there!
