Using PowerShell to Discover Temp/Log/Backup and Data Directories of SSAS

Objective: Locate all folders related to SQL Server Analysis Services (SSAS) on a given list of servers.

During a massive cleanup of some of our environments, I was asked to point out all the possible locations of SSAS-related files on a list of 10+ servers. The main reason this is not a trivial task to do blindfolded is that, at some point in time, someone decided to change the default value in the SSAS configuration. This change led to new cubes being deployed to a different location than the existing ones. Also contributing to the mess was a dubious naming standard, which makes guessing the folder names difficult.

My immediate strategy was to check whether I could use Analysis Services Management Objects (AMO) to get the current values from the configuration on each instance. I happen to be very fond of the AMO API, so this was my first instinct. Unfortunately this approach has several drawbacks and shortcomings: only the Data directory is exposed, and I would have no way of telling the historic values of each property (TempDir, LogDir, BackupDir and DataDir).
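
For reference, a minimal sketch of what the AMO approach looks like (the instance name below is a placeholder, and the AMO assembly is assumed to be installed along with the SQL Server client tools):

# Load AMO and connect to the instance (instance name is a placeholder)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$SsasServer = New-Object Microsoft.AnalysisServices.Server
$SsasServer.Connect("<InstanceName>")
# Only the current value of the Data directory is available this way
$SsasServer.ServerProperties["DataDir"].Value
$SsasServer.Disconnect()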

Asking on Twitter (using the hashtag #ssashelp) whether there was some clever way of getting this information returned only a few responses, from Bill Anton and Chris Webb, but nothing directly solving the issue at hand. Bill shared a link on looking up instance configuration via PowerShell, and Chris shared one on using MDX to browse the file system.

I think that the API should at least expose the current values for all possible configurations on an SSAS instance, and I have filed a Microsoft Connect ticket for this. But since this is not currently available, the only option was to script it myself.
The script produces a CSV file containing a list of servers with the corresponding paths to all folders containing files with the specified extensions. In this example I have chosen to look for .abf (Analysis Services backup file), .dstore (compressed store, a decoding table for SSAS) and .trc (SSAS trace file) files. The script looks as follows:

cls;

# Setting up Parameters
$Path = "<Insert Valid Path and Filename w/ .csv extension here>";
$Include = @( '*.abf', '*.trc', '*.dstore' ); # -Include expects an array of patterns
$Servers = @( "<Insert Comma Separated List of Servers here>" );
$Output = @();

# Looping all servers listed
foreach ($Server in $Servers) {
    # Getting the local fixed disks (DriveType 3) on the server
    $Disks = Get-WmiObject -ComputerName $Server -Class Win32_LogicalDisk | Where-Object { $_.DriveType -eq 3 }
    ForEach ($Disk in $Disks) {
        # Build a UNC path to the disk via its administrative share (e.g. \\Server\C$\)
        $Root = "\\$Server\" + $Disk.Name.Replace(':', '$') + "\"
        # Getting all folders that contain files with the included extensions
        $ChildItems = Get-ChildItem $Root -Recurse -ErrorAction SilentlyContinue -Include $Include
        # For each file, create an entry
        ForEach( $Item in $ChildItems ) { 
            $Temp = New-Object System.Object
            $Temp | Add-Member -MemberType NoteProperty -Name "Server" -Value $Server
            # DirectoryName holds the containing folder as a plain string
            $Temp | Add-Member -MemberType NoteProperty -Name "Path" -Value $Item.DirectoryName
            # Add Entry to list
            $Output += $Temp
        }         
    }
}

# Remove duplicates, keeping both the Server and Path members
$Result = $Output | Select-Object -Unique -Property Server, Path

# Write to CSV file
$Result | Export-Csv -Path $Path -NoTypeInformation;
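
Once the file has been written, a quick sanity check is to read the CSV back and count the folders found per server, for example:

# Read the CSV back and show the number of folders found per server
Import-Csv -Path $Path | Group-Object -Property Server | Select-Object -Property Name, Count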


Obviously this script comes with the usual DON'T JUST COPY/PASTE INTO PRODUCTION disclaimer. Please read through the script and adapt it for your own use; $Path and $Servers, in particular, should be set accordingly.
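
For example, with placeholder values (the path and server names here are made up):

$Path = "C:\Temp\SsasFolderScan.csv";
$Servers = @( "SSASPROD01", "SSASPROD02", "SSASTEST01" );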

Lastly, you need to be aware of security. Your account may not be granted the permissions needed to remotely crawl the files and folders, so please make sure this is addressed in advance.
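
Before launching the full scan, a simple pre-flight check is to verify that each server's administrative shares are reachable at all; a minimal sketch, assuming the UNC approach used in the script above:

# Check that each server's C$ administrative share is reachable
foreach ($Server in $Servers) {
    if (-not (Test-Path "\\$Server\C$")) {
        Write-Warning "No access to \\$Server\C$ - check permissions in advance"
    }
}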

Note also that this script will scan all logical drives on all the listed servers, which can take some time.
