This is my ninth post in a series of entry-level posts, as suggested by Tim Ford (b|l|t) in this challenge.
I have not set any scope for the topics of the (at least) twelve (12) posts in this blog category, but I can assure you they will be focused on SQL Server. This time it's going to be about how to create a consistent test setup for your Analysis Services (SSAS MD) databases, across servers, using built-in features in Visual Studio (VS).
Previous Posts in Entry Level Series:
- Creating a Striped Disk on Your Azure VM
- How to Query SSASMD using #DAX
- How to Quickly Remove Snippet Line Numbers in Visual Studio
- SSAS Compare by Red-Gate
- Filtering Tables in #SQLServer Management Studio
- A Hello World Custom Assembly in SSAS
- Permission Scope in Analysis Services
This technique requires a tiny bit of C# programming (or VB if you prefer that), but do not let this scare you off. I guarantee the effort pays off well in the end. And if you want to jump right into testing, I have shared some code in this blog post that will get you going even faster.
One of the main drawbacks of this approach is that test methods in VS do not support parameters. This means we have to create a method for each MDX statement we want to execute against a cube.
We can, however, execute the same MDX against multiple cubes from the same method.
Depending on whether you test for consistency or performance, having the test methods run synchronously, one after the other, will either work for you or against you. Unfortunately, we do not have the liberty to choose the execution mode as we see fit. This blog post sums up how to enable parallel testing, but in our case we do not meet the requirements, which are (I have highlighted the ones we fail to meet):
- Ensure you have a multi-core/CPU machine
- Ensure you are running only unit tests
- Ensure your tests are thread-safe
- Ensure you do not have any data adapters on
- Ensure you are running locally
- Modify your test settings file.
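For reference, if your scenario did meet the requirements above, parallel execution is switched on in the test settings file. A minimal sketch of such a file might look like this (the name and namespace follow the VS 2010+ .testsettings schema; `parallelTestCount="0"` means "use all available cores"):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="ParallelSettings"
              xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <!-- 0 = one test per available core; any positive number caps the parallelism -->
  <Execution parallelTestCount="0">
  </Execution>
</TestSettings>
```

In our case the tests hit a shared cube and are not thread-safe, so we leave this alone and let the tests run one after the other.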
We begin by creating a new Unit Test Project in Visual Studio.
Once the project has been created, we can add classes to our project. Each of these classes can contain one (1) or more tests. In this example, I will expose two (2) tests to the Visual Studio test framework.
But before we can do any testing, some plumbing is in order. I have attached code to this blog post containing a class that wraps the ADOMD class library and implements some ease-of-use functionality. This class is called Helper. In this implementation it contains methods to clear the cache of a cube, execute XMLA or MDX against a cube, and finally a method to assert whether two (2) CellSets are identical. The Helper class relies on settings found in the App.config file. You are free to make the code your own, or you can use it as is.
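To give you an idea of what that plumbing looks like, here is a hedged sketch of such a wrapper. The attached code is the authoritative version; class shape, method names and the App.config keys below are illustrative assumptions:

```csharp
using System.Configuration;
using Microsoft.AnalysisServices.AdomdClient;

public class Helper
{
    private readonly string _connectionString;

    // configKey is an assumed <connectionStrings> entry name in App.config,
    // e.g. <add name="CubeV1" connectionString="Data Source=MyServer;Catalog=MyDb"/>
    public Helper(string configKey)
    {
        _connectionString =
            ConfigurationManager.ConnectionStrings[configKey].ConnectionString;
    }

    // Clears the cube's cache by sending an XMLA ClearCache command,
    // so consistency tests are not skewed by warm caches.
    public void ClearCache(string databaseId, string cubeId)
    {
        string xmla =
            "<Batch xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">" +
            "<ClearCache><Object>" +
            "<DatabaseID>" + databaseId + "</DatabaseID>" +
            "<CubeID>" + cubeId + "</CubeID>" +
            "</Object></ClearCache></Batch>";
        ExecuteXmla(xmla);
    }

    public void ExecuteXmla(string xmla)
    {
        using (var conn = new AdomdConnection(_connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText = xmla;
                cmd.ExecuteNonQuery();
            }
        }
    }

    // Executes an MDX query and returns the resulting CellSet.
    public CellSet ExecuteMdx(string mdx)
    {
        using (var conn = new AdomdConnection(_connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText = mdx;
                return cmd.ExecuteCellSet();
            }
        }
    }

    // Returns true when the two CellSets contain the same values, cell by cell.
    public static bool CellSetsAreEqual(CellSet a, CellSet b)
    {
        if (a.Cells.Count != b.Cells.Count) return false;
        for (int i = 0; i < a.Cells.Count; i++)
            if (!Equals(a.Cells[i].Value, b.Cells[i].Value)) return false;
        return true;
    }
}
```

The project needs a reference to the ADOMD.NET client assembly (Microsoft.AnalysisServices.AdomdClient) and to System.Configuration for the App.config lookup.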
Here is an example of one test you might run against a cube: testing whether the grand total of a specific measure, [Internet Sales], stays consistent for each deployment.
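A test along those lines could look like the sketch below. It assumes the Helper class from the attached code (exact method names may differ there), two assumed connection-string entries in App.config ("CubeV1" and "CubeV2" pointing at the two deployments), and an Adventure Works-style cube; adjust the MDX to your own model:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Microsoft.AnalysisServices.AdomdClient;

[TestClass]
public class InternetSalesTests
{
    // Grand total only: one measure on columns, no axis breakdown.
    private const string GrandTotalMdx =
        "SELECT [Measures].[Internet Sales] ON COLUMNS FROM [Adventure Works]";

    [TestMethod]
    public void InternetSalesGrandTotal_IsConsistentAcrossDeployments()
    {
        // Run the same MDX against both versions of the cube.
        CellSet before = new Helper("CubeV1").ExecuteMdx(GrandTotalMdx);
        CellSet after  = new Helper("CubeV2").ExecuteMdx(GrandTotalMdx);

        Assert.IsTrue(Helper.CellSetsAreEqual(before, after),
            "Grand total of [Internet Sales] changed between deployments.");
    }
}
```

Because parameters are not supported, a second measure or a second MDX statement means a second test method, but as shown here one method can happily query several cubes.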
When we build our Visual Studio project, the Test Explorer will automatically pick up any newly added test methods in any of the classes in the project. So you can add as many classes with as many (or few) methods as you'd like.
In this quick introduction I will only show how to execute the tests via the Test Explorer. Obviously this should be automated, which I will cover in another post.
Knowing right away that you introduced a bug into the code base can save you hours of backtracking.
In the Test Explorer you can select any single test or multiple tests, then right-click and select Run Selected Tests.
You will end up with a result in the same pane: either Success, Failure or Undecided. The last outcome is not included in this example, but it becomes a viable option if you extend your unit/integration testing further.
My result looks like this:
With this test, we established that the grand total of the measure Customer Count was consistent across both versions of the cube. If this fails, you may be in trouble with your data model, your cube script or something completely different. The test won't tell you what is wrong, only that you have an inconsistency.
Fail fast, fail often!
Unit/integration testing in Visual Studio is actually not that hard and can save you a lot of pain down the road. The testing framework in Visual Studio offers extensive ways of executing batches of tests. You can group tests by Class, Duration, Outcome, Trait or Project.
When you right-click in the Test Explorer, you get the option to select how you want the tests to be grouped.
Traits can be added as attributes to the method declaration in the class. The attribute is called TestCategory, and you can have multiple declarations per method. As you can probably tell, you can weave an intricate web of categories of test methods. In other words, one test method can be part of one or more schemes by which you test your code. You should test development deployments differently from test deployments, which in turn should be tested differently from production deployments.
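Stacking the attributes looks like this (the category names are just examples; the MSTest TestCategory attribute itself is real):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DeploymentTests
{
    // One test method can carry several traits, so it shows up under
    // every matching group when the Test Explorer is grouped by Trait.
    [TestMethod]
    [TestCategory("Consistency")]
    [TestCategory("ProductionDeployment")]
    public void GrandTotals_AreUnchanged()
    {
        // ... MDX comparison against the cube goes here ...
    }
}
```

You can then run, say, everything tagged "ProductionDeployment" as one batch after a production push, while a broader set runs against development.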
Or you can store tests in playlists, by right-clicking and adding them separately to one (1) or more playlists.
At the top of the Test Explorer, you select which playlist you want active.
Happy testing 🙂
After publishing this post, Fabrice Michellonet (b|l|t) clarified the difference between unit testing and what I am showing in this post: integration testing. Now, admittedly I hashtagged the tweet #UnitTesting, which was incorrect. However, it seems easy to fall into that trap, as Microsoft's test template for this kind of project is called Unit Test Project, even though you are not going to do any unit testing.
Code available here: ssas-unittestproject