Using PowerShell to Discover Temp/Log/Backup and Data Directories of SSAS

Objective: Locate all folders related to SQL Server Analysis Services (SSAS) on disclosed servers.

During a massive cleanup of some of our environments, I was asked to point out all the possible locations of SSAS-related files on a list of 10+ servers. The main reason this was not a trivial task to be done blindfolded is that, at some point in time, someone decided to change the default folder locations in the SSAS configuration. This change led to new cubes being deployed to a different location than the existing ones. Also contributing to the mess was a dubious naming standard, which made guessing the folder names difficult.

My immediate strategy was to check whether I could use the Analysis Services Management Objects (AMO) to get the current values from the configuration on each instance. I happen to be very fond of the AMO API, so this was my first instinct. Unfortunately there were several drawbacks and shortcomings to this approach, as only the Data directory is exposed, and I would have no way of telling the historic values of each property (TempDir, LogDir, BackupDir and DataDir).
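For reference, this is roughly the AMO approach I started out with. It is only a minimal sketch, assuming the instance name placeholder is filled in and that the ServerProperties collection on your instance exposes the DataDir property:

# Load AMO and connect to the instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("<Insert SSAS Instance Name here>")

# Read the current DataDir value; historic values and the remaining folders
# are not available this way
$DataDir = ($server.ServerProperties | Where-Object { $_.Name -eq "DataDir" }).Value
Write-Output $DataDir

$server.Disconnect()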

Asking on Twitter (using the hashtag #ssashelp) whether there was some clever way of getting this information returned only a few responses, from Bill Anton (b|l|t) and Chris Webb (b|l|t) – but nothing directly solving the issue at hand. Bill shared this link on looking up instance configuration via PowerShell and Chris shared this one on using MDX to browse the file system.

I think that the API should at least expose the current values for all possible configurations on an SSAS instance. I have filed a Microsoft Connect ticket for this. But since this is not currently available, the only option was to script it myself.
The script will produce a CSV file containing a list of servers with the corresponding paths to all folders containing files with the specified extensions. In this example I have chosen .abf (Analysis Services backup file), .dstore (compressed store, a decoding table for SSAS) and .trc (SSAS trace file). The script looks as follows:

cls;

# Setting up parameters
$Path = "<Insert Valid Path and Filename w/ .csv extension here>";
$Include = @('*.abf', '*.trc', '*.dstore');
$Servers = @( "<Insert Comma Separated List of Servers here>" );
$Output = @();

# Looping all listed servers
foreach ($Server in $Servers) {
    # Getting the local (fixed) logical disks on the server
    $Disks = Get-WmiObject -ComputerName $Server Win32_LogicalDisk | Where-Object { $_.DriveType -eq 3 }
    ForEach ($Disk in $Disks) {
        # Crawling the remote drive through its administrative share (e.g. \\Server\C$),
        # collecting all files that match the included extensions
        $DrivePath = "\\$Server\" + $Disk.Name.Replace(":", '$') + "\"
        $ChildItems = Get-ChildItem $DrivePath -Recurse -ErrorAction SilentlyContinue -Include $Include
        # For each file, create an entry
        ForEach ($Item in $ChildItems) {
            $Temp = New-Object System.Object
            $Temp | Add-Member -MemberType NoteProperty -Name "Server" -Value $Server
            $Temp | Add-Member -MemberType NoteProperty -Name "Path" -Value $Item.DirectoryName
            # Add entry to the list
            $Output += $Temp
        }
    }
}

# Remove duplicates, keeping the Server and Path members
$Result = $Output | Select-Object -Property Server, Path -Unique

# Write to CSV file
$Result | Export-Csv -Path $Path -NoTypeInformation;

 

Obviously this script comes with the normal DON’T JUST COPY/PASTE INTO PRODUCTION disclaimer. Please read through the script and adapt it for your own use. $Path and $Servers should be set accordingly.

Lastly you need to be aware of security. Your account may not be granted the proper permissions to actually remotely crawl the files and folders – please make sure this is addressed in advance.
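A quick pre-flight check can save you a long wait; a small sketch, assuming the administrative shares (C$ and friends) are enabled on the servers:

# Verify that the current account can reach each server's administrative share
foreach ($Server in $Servers) {
    if (-not (Test-Path "\\$Server\C$")) {
        Write-Warning "Cannot reach \\$Server\C$ - check permissions before scanning."
    }
}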

Note also that this script will scan all logical drives on all listed servers, which can take some time.


Extracting SSAS Database(s) XMLA w/ PowerShell

For some time, we have been trying to get Informatica’s Metadata Manager to extract information from our SSIS, SSRS and SSAS instances in production. However, we were facing issues with credentials and service account clashes that led us to just extract the metadata and deploy it on a server where we had full control.

The easiest way of doing this was to create a PowerShell script that extracts, saves and deploys the XMLA from the databases in question. We would just be looping through the database collection on a number of servers in order to get the required information.

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL

$SourceServers = @( "<SOURCE SERVER LIST HERE>" ); #Source
$TargetServerName = "<TARGET SERVER HERE>"; #Target

cls;

# Connect to the target instance (only needed if the XMLA is to be executed there)
$TargetServer = New-Object Microsoft.AnalysisServices.Server
$TargetServer.Connect($TargetServerName);

ForEach( $srv in $SourceServers ) {
    # Connect to the source instance
    $server = New-Object Microsoft.AnalysisServices.Server
    $server.Connect($srv)
    ForEach( $database in $server.Databases ) {

        # Writer setup; the XMLA is scripted into an in-memory StringBuilder
        $stringbuilder = New-Object System.Text.StringBuilder
        $stringwriter = New-Object System.IO.StringWriter($stringbuilder)
        $xmlOut = New-Object System.Xml.XmlTextWriter($stringwriter)
        $xmlOut.Formatting = [System.Xml.Formatting]::Indented

        # Script the CREATE statement for the database
        $ScriptObject = New-Object Microsoft.AnalysisServices.Scripter
        $MSASObject = [Microsoft.AnalysisServices.MajorObject[]] @($database)
        $ScriptObject.ScriptCreate($MSASObject, $xmlOut, $false)
        $xmlOut.Flush()

        # Save the XMLA to disk, one file per server/database
        $ServerPath = $srv -replace "\\", "_"
        $Path = "<INSERT DUMP LOCATION>" + $ServerPath + "_" + $database.Name + ".xmla";
        $stringbuilder.ToString() | Out-File -FilePath $Path

        # Deploy to the target instance (disabled by default)
        #$TargetServer.Execute($stringbuilder.ToString());
    }
}

Basically there are a few steps you need to take in order to fit this script into your environment. Mind the list of sources; multiple servers can be added, although this has not been tested for name clashes on the target box.

If you don’t need a physical file for the archive, this step can obviously be skipped.

Finally, you may have to clear the database collection on the target server before running the script. This was done as a one-off. The deployment to the target is likewise disabled (commented out) in the script above – less destruction on account of copy/paste.
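For completeness, the one-off cleanup could look something like this; a sketch only, reusing the $TargetServer connection from the script above, and obviously destructive, so triple check the server name first:

# Drop every existing database on the target instance (destructive!)
foreach ($db in @($TargetServer.Databases)) {   # snapshot the collection before dropping
    $db.Drop()
}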

Obviously, if the cubes need to be processed at the target location, this needs to be added as well. We were just after the metadata, hence no processing was needed.
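Should you need it, processing per deployed database could be appended after the deploy step; again a sketch, assuming a full process is acceptable for your data volumes:

# Refresh the target's metadata and fully process each deployed database
$TargetServer.Refresh()
foreach ($db in $TargetServer.Databases) {
    $db.Process([Microsoft.AnalysisServices.ProcessType]::ProcessFull)
}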


How to Change the Service Account for Analysis Services

As part of a routine check, I found that one of our servers had an instance of Analysis Services (SSAS) running under a local service account. As many of our solutions pull data from various sources on other servers, there is almost always a need to do a double hop. To enable that, the service needs to run under a domain service account.

This being a simple task and only a small part of the bigger Kerberos puzzle, I filed a ticket with our support for them to change the service account to an existing one. The reply I got is the reason for this blog post: I needed to provide the individual steps for the change. A quick googlebing turned up rather empty on specific SSAS guides, to my surprise, so I decided to create one myself (I had to anyhow).

EDITED PART:

As Patrice Truong (b|t|l) correctly called out, the recommended way of making this change is through SQL Server Configuration Manager. To do so, follow these steps:

Press [Windows] + [R], and in the prompt type ‘SQLServerManagerXX.msc’, where XX depends on the version (12 for 2014, 11 for 2012, 10 for 2008) – e.g. ‘SQLServerManager12.msc’ for SQL Server 2014.

Depending on your security settings, you may encounter this next dialog:

UAC

Clicking ‘Yes’ will bring you to the SQL Server Configuration Manager:

SQL Server Configuration Manager

Here you can select (double click) the particular instance you would like to re-configure. Doing so will open this Dialog, where you can edit the login information:

SSAS Configuration Log On Dialog

Change the login information and provide the corresponding password:

SSAS Configuration Log On Dialog Edit

Click OK/Apply; the service will need a restart for the changes to be applied.
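If you would rather script the change than click through the dialogs, WMI can do it as well. This is only a hedged sketch using the Win32_Service.Change method; the server, the service name MSOLAP$<InstanceName> (the default instance is named MSSQLServerOLAPService) and the credentials are placeholders, and a restart is still required:

# Locate the SSAS service on the target server
$svc = Get-WmiObject Win32_Service -ComputerName "<SERVER>" -Filter "Name='MSOLAP`$<InstanceName>'"

# Change only the logon account and password; $null leaves the remaining settings untouched
$svc.Change($null, $null, $null, $null, $null, $null, "<DOMAIN\ServiceAccount>", "<Password>") | Out-Null

# Restart the service for the new account to take effect
$svc.StopService() | Out-Null
$svc.StartService() | Out-Null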

ORIGINAL PART:

This guide is for Windows Server 2012 R2, but is applicable to other Windows versions as well.

Press [Windows] + [R], and in the prompt type ‘services.msc’.

Services Prompt

This will open the Services dialog, where you can scroll to the SQL Server services listed:

Services SQLServer

Double-click the Analysis Services service, and the following dialog will appear:

SSAS Service

Click the ‘Log On’ tab:

SSAS Service LogOn

Change the account to the desired domain account and type the corresponding password:

SSAS Service LogOn Edit

Click OK/Apply; the service will need a restart before the changes are applied.
Happy hopping 🙂


TSQL Tuesday #63 – How do you manage security

 

TSQL Tuesday

This month’s T-SQL Tuesday is hosted by Kenneth Fisher (blog | twitter) and the topic is security.

Security is one of those subjects that most DBAs have to deal with regardless of specialty. So as something we all have to work with at some point or another what are some tips you’d like to share? What’s the best security design? You’ve picked up a legacy system and the security is awful, how do you fix it? Any great tools out there you’d like to share? Hate it or love it I’m betting we all have something to say.

As others in the community contribute in their own way, within their fields of expertise, I’d like to chip in with some of my own observations and lessons learned through my years of working with Analysis Services (SSAS). Here goes my first #TSQL2SDAY blog post.

Having worked with the security model in SSAS and presented on the topic a couple of times, this is really something I have had my hands on. I have previously written several blog posts on the subject, and this post will introduce a Microsoft security management tool named Forefront Identity Manager (FIM), a tool that enables self-service identity management for business users. Yeah, you heard that right. So how does that fit in with cubes and SSAS?

In every SSAS database there is a collection of roles. Each role can be assigned specific permissions in terms of cube access (read/write), drill-through, dimension and cell access. The magic link between the SSAS roles and FIM is Active Directory (AD) groups. Each role can have one-to-many members, which can be specific (local) users or security groups. Through FIM, users can then be provisioned and de-provisioned access in a semi-automated way.
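To illustrate the link, here is a minimal AMO sketch that adds an AD group as a member of an existing role; the instance, database, role and group names are placeholders:

# Connect to the instance and locate the database and role
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("<SSAS INSTANCE>")
$role = $server.Databases.FindByName("<DATABASE>").Roles.FindByName("<ROLE>")

# Add the AD group as a member and persist the change on the server
$role.Members.Add((New-Object Microsoft.AnalysisServices.RoleMember("<DOMAIN\ADGroup>"))) > $NULL
$role.Update()
$server.Disconnect()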

Personally, I haven’t even scratched the surface of FIM, but for the purpose of letting the business decide who gets access to what, and who doesn’t, it was well worth raising this flag. The business is happy to feel in control, and you are happy that the business is locked down to the options you expose through the cube. In the end, the mechanism of creating and managing the roles within the cubes still resides on the developer/administrative side, i.e. IT (and not the business).

If you don’t already know about Forefront Identity Manager I urge you to take a look at the capabilities. I bet you’ll be pleasantly surprised. 🙂

 


Run your cubes in Warp Speed – Really Really Fast!

So, Microsoft finally came up with a new feature for the MOLAP part of Analysis Services. The community has been hungering for updates ever since the new kid, Tabular, came around.

Now it appears that the SQL Server team has unlocked some underlying functionality in the Windows OS to enable MOLAP cubes to operate at warp speed.

Install the latest CU (get it here) and check out the newly added check box in the properties.

WarpSpeed for MOLAP Cubes

 
