#TSQL2SDAY #101 Round-Up : My Essential SQL Server Tools

On Tuesday the 3rd of this month I invited people in the SQL Server community to share which tools are essential to their daily work. I was really overwhelmed by the number of stories the topic triggered: 22 people in total took the time to write down and share which tools they use for their daily chores.
Going through the 22 posts and aggregating them took more time than I had hoped for, since my trusted laptop broke down – blink codes are alive and well, I tell you!

Going through the lot, I found some similarities among the posts and have categorized them accordingly. But first off, a BIG thank you to all who participated!

Without further ado, here goes.

Relational Heavy Lifting

Kamil Nowinski (b|l|t) takes us through the classic stuff, I mean, the real classic stuff – some would call it vintage – by showing how Total Commander still has a place in the tool belt, this century 😉

Matthew McGiffen (b|l) shows how to set up a Central Management Server, in order to execute queries against multiple instances in ad-hoc scenarios, seamlessly. Very nice tip. Matthew also did a second post, lining up multiple tools he’s written about in the past, nicely aggregated in this post.

Jess Pomfret (b|l|t) does a really nice post on how PowerShell and dbatools have changed her way of working. Jess even provides some useful snippets to get you going. I share the same enthusiasm for PowerShell as Jess does, and was very pleased to see homage paid to dbatools – an incredible tool. Best of luck on your speaking adventures!

Marek Masko (b|l|t) has a huge post on classic DBA tools, as well as a pleasant surprise on testing with tSQLt. There are also some good pointers to free community scripts and tools. Great read!

Tracy Boggiano (b|l|t) covers dbatools, a specific PowerShell command and a T-SQL stored procedure, but also Telegraf and VS Code.

Dan Clemmens (b|l|t) goes all in on DBA tools for statistics, execution plans and tracing, even including the legendary diagnostic scripts from Glenn Berry.

Steve Jones (b|l|t) has a huge list of free and paid tools, from SQL Server-centric tools to a good deal of process-related tools – i.e. DevOps and such.
Steve also manages to sneak in a reminder about the password thingy magicky that, according to domain expert Troy Hunt, we all should rely on – be it pwsafe or any other tool like that.

Doug Purnell (b|l|t) is short and concise in his praise of Ola Hallengren's maintenance scripts and Jared Zagelbaum's PowerShell extension of those.

Warren Estes (b|l) is praising the usual suspects in the DBA field, but adds a couple of interesting options for productivity and benchmarking/testing and also rounds up a couple of SentryOne products.

Devon Leann Ramirez (b|l|t) offers a thorough introduction to their free Plan Explorer offering. Devon also makes a good point of marking the company's presence in the community. If you want the quick tour, head over to Vimeo.

Rob Farley (b|l|t) talks about two things I really hold dear; Coffee… and I forgot the other thing. No really, Rob has an excellent blog post on Live Query Stats (LQS), and what some of the use cases are for that feature/tool. There are more ways of using LQS than I had thought about – thanks for sharing!

Riley Major (b|l|t) shares his story on how he works with Management Studio and how the tool could be improved to further support a common way of working. Besides the tips on SSMS, Riley also lists his favorite (and not so favorite) tools.

The BI Power in Power BI

James McGillivray (b|l|t) first and foremost writes about my trusted travel mate, the Kindle (app), as his favored tool of the trade. Besides the treasure trove that books can be, James also has some pointers to DAX Formatter and a theme generator, which is pretty hefty!

Community Zone

Jo Douglas (b|l|t) argues that the most important tool for any professional is networking and community, and it's hard not to agree completely. Jo also offers some great pointers on where to begin this journey.

Jason Brimhall (b|l|t) brings up the aspect of blogging itself as a great tool of the trade, and I have to agree – I couldn't have stated it more clearly than Jason:

Blogging helps you become a better technical person.

Google-fu is also covered in Jason's post for this party, and he manages to sneak in a reference to his extensive work and blogging on Extended Events, which in itself is an awesome contribution.

Reid DeWolfe (b|l|t) offers a quick write-up on classic DBA must-haves: SQL Prompt, Plan Explorer and GitHub/SourceTree. Reid also describes some of the main benefits of the tools.

Other

Garland MacNeil (b|t) brings another perspective to the party, writing from a borrowed laptop – I'm not sure it was intentional, but I guess the exercise is very rich in terms of learning. I know others have been there too:

https://twitter.com/_AlexYates_/status/986137414723866625

Chrissy LeMaire (b|l|t) has, surprisingly enough, not written about dbatools – and if you believe that, you may call me Bill 🙂
In Chrissy's blog post you'll find a great list of auxiliary tools for all the things you do around programming: screenshot/image handling, code repositories, clipboard management and video editing tools.

Josh (b) gives us the DevOps perspective of a DBA/database developer in a not so uncommon scenario – well, I think we've all been there at some point. Some prominent 3rd party tooling is getting some ❤

The Other Tools

Catherine Wilhelmsen (b|l|t) offers a completely different and refreshing view on tools that were new to me, at least – going from database modeling to data generators to time-keeping tools and beyond.

Finally, No Tools

Hugo Kornelis (b|l|t) makes a good argument for not becoming addicted to, or dependent on, the presence of certain tools in order to perform your job. I guess this applies in particular when you're a consultant and can't always BYOD. Apart from that, Hugo really likes SQL Prompt and Plan Explorer 😉


The Tools Mentioned (in no particular order)

  • dbatools
  • PowerShell
  • dbareports
  • SQL Server Management Studio
  • Redgate SQL Compare
  • Minionware
  • SentryOne Plan Explorer
  • dbachecks
  • SQL Operations Studio
  • SQL Database Modeler
  • Dynamic Restore Script
  • Scooter Software Beyond Compare
  • Redgate DLM Dashboard
  • Ola Hallengren's maintenance scripts
  • Trello
  • SQL Server Data Tools
  • Password Safe
  • Sublime Text
  • Notepad++
  • Redgate SQL Prompt
  • Mockaroo
  • Dropbox
  • Visual Studio Code
  • SQLCover
  • Sourcetree
  • SQLNexus
  • Coblis
  • tSQLt
  • Advanced Theme Generator (Power BI)
  • DAX Formatter
  • R Studio
  • ScaleSQL ClearTrace
  • PSSDiag
  • Devart's dbForge
  • Toggl
  • Grammarly
  • SCD Merge Wizard
  • Statistics Parser
  • Adam Machanic's sp_WhoIsActive
  • WinMerge
  • Mythicsoft Agent Ransack
  • Redgate SQL Search
  • Glenn Berry's Diagnostic Scripts

 


TSQL2SDAY #101 Invitation: My Essential SQL Server Tools

The Why

If you’re not familiar, T-SQL Tuesday is a blogging party hosted by a different person each month. It’s a creation of Adam Machanic (b|l|t), and it’s been going on for ages now! Basically the host selects a topic, defines the rules (those are almost always the same), and then everyone else blogs about said topic. Once the deadline is reached, the host summarizes each of the submitted posts on their site/blog.

T-SQL Tuesday #101

This is the second time I host a T-SQL Tuesday, and hopefully not the last.

The What

This month I get to pick the topic, and I am going to go with:

The Essential SQL Server Tools in my stack

Besides SQL Server Management Studio and Visual Studio Data Tools, we all have our own set of tools that we use for everyday chores and tasks. But how do we get to know which tools are out there, if not from other professionals telling us about them? Does it have to be a fully fledged product, with certification and all? Certainly not! If there's some GitHub project out there that is helping you be twice as productive, let us know about it. You can even boast about something you've built yourself – if you think others will benefit from using it.

Basically, I think that by establishing awareness of what kinds of tools are out there, new professionals will have a less steep curve getting up to pace. But I suspect that even some veterans could have an "a-ha" moment from reading the summary.

Additionally, you can (read: should) share how you came to depend on said tool – and of course you are encouraged to give credit where credit is due, in terms of who made you aware of the tool.

Another option is to approach the topic as a kind of A Day in the Life blog post, as has been done before by Erin Stellato (b|l|t) – writing with the specific angle of describing how your everyday work is made easier by the use of your tool stack.

The How

There’s only a few rules for T-SQL Tuesday:

  • Your post must be published on Tuesday April 10th 2018 between 00:00 GMT and 23:59 GMT.
  • Your post must contain the T-SQL Tuesday logo (see above) at the top and the image must link back to this blog post.
  • Trackbacks should work, but if they don’t, please put a link to your post in the comments section so I (and everyone else) can see your contribution!
  • Tweet about your post using the #tsql2sday hashtag
  • Include “T-SQL Tuesday #101” in your blog post’s title.
  • Optionally add @vestergaardj to your tweet, to make it harder for me to miss 😉

If you want to host a topic of your own:

  • Contact Adam Machanic (b|l|t) and tell him you’d like to host a T-SQL Tuesday from your blog.


TSQL Tuesday #93 – My Interviewing Experiences

Kendra Little (b|l|t) is hosting this month's TSQL2SDAY, which is on the topic of:

TSQL Tuesday #93: Interviewing Patterns & Anti-Patterns

What advice do you have for people preparing for or going through an interview?

I got off to a late start, and I apologize in advance for what may be a bit of a messy blog post.

Background

A couple of years ago, when I was working at a pretty large blueish container shipping company, we got the opportunity to interview candidates for our off-shore team. This was, as far as I know, a first in the company, and it only came to be because my manager at the time was adamant about it.

We quickly decided that I was to take on the technical part of the interview, and my manager the more person/team oriented aspects. So I devised a good number of questions, so I could accommodate any specialization the candidate might have. We had to cover a lot, because our landscape was, well, let's just put it this way: not so single-vendor-ish.
To make a long story short, the setup in the company at that point in time was that the Data Warehouse (DWH) was situated on a Teradata box. On top of that there were several options for business reporting. One such option was Microsoft Reporting Services (SSRS), which was done in the department where I was sitting. The reports were, however, not sourcing directly from Teradata, but from one of the many Analysis Services (SSAS) cubes we hosted as well. All data was sourced (into the DWH) using Informatica, in yet another department, so Integration Services (SSIS) was officially (and unofficially) not allowed as a sourcing tool – which is why no questions reflect that aspect of the Microsoft BI stack.

So, we were looking for strong profiles in SSRS and SSAS as well as having Teradata knowledge. Along with a general understanding of Data Warehouse fundamentals of course. In other words, not junior profiles.

Preparation

In order to lock in on the level of the person being interviewed, I prepared a set of questions with increasing technical difficulty. Since my knowledge of Teradata was limited, I asked around among the DBAs and, after some contemplation, decided to completely avoid checking for technical Teradata skills. We would have to train them, if need be.
The list of questions evolved during the whole process. I honed in on the general level of applicants after firing questions east and west. Me being more precise gave everyone a much better experience: conversations tend to stop when no one understands you, and you don't want to look like a donkey asking about simple stuff.
So hopefully you get the opportunity to tune in to the level of the applicant before you unload your tech quiz.

The Interviews

I wound up having three categories of questions:

Proficiency

Do they have any strong suits? Usually people tend to have a favorite tech in which they excel. So for us it was important not to hire only SSRS wizards/ninjas, or whatever it's called nowadays. We wanted a fair mix.

I would ask the interviewees to rate themselves in the different tools and languages, and use their rating as a pointer to the questions I would ask on the technical stuff. If they rated themselves > 7/10 in MDX and had trouble naming at least five (5) frequently used functions, you knew something was rotten.
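As a yardstick for that "name five functions" follow-up, here is a hypothetical Adventure Works-style query touching a handful of the MDX functions I would count as frequently used (FILTER, TOPCOUNT, NON EMPTY, and the MEMBERS and key reference syntax) – the cube and attribute names are just the usual sample-database ones, not anything from the actual interviews:

```mdx
-- Top 5 product categories by sales in 2013, ignoring empty cells
SELECT
    NON EMPTY { [Measures].[Sales Amount] } ON COLUMNS,
    TOPCOUNT(                                          -- keep the best 5...
        FILTER(                                        -- ...of the categories that sold at all
            [Product].[Category].[Category].MEMBERS,
            [Measures].[Sales Amount] > 0
        ),
        5,
        [Measures].[Sales Amount]
    ) ON ROWS
FROM [Adventure Works]
WHERE ( [Date].[Calendar Year].&[2013] );             -- slicer via member key
```

A candidate rating themselves 8/10 should be able to read something like this aloud and name a few more functions (YTD, PARALLELPERIOD, CROSSJOIN, DESCENDANTS) without breaking a sweat.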

Something I notoriously have gotten wrong would be:

  • Q: Explain what you get from having Enable Visual Totals checked.
  • A: When Enable Visual Totals is checked, the Grand Total reflects the total you are allowed to see. The Grand Total is not a grand total, but your local total. At least that's how it works in my mind!

Data Modeling

Data modeling was usually the area where the interviewees were least knowledgeable, so the questions in this category were kept at a very high level.

  • Q: What’s the difference between a data mart and a data warehouse?
  • A: A data mart is a subset of the data warehouse.
  • Q: Do you know any architectural principles of data warehousing?
  • A: Inmon & Kimball.
  • Q: Delivery Date and Shipping Date in a fact table – how would you model that?
  • A: I would use a role-playing dimension, Time.
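The role-playing answer is easy to sketch in T-SQL: one physical date dimension, joined twice under different aliases. The table and column names below are made up for illustration, not from any real schema:

```sql
-- One physical Date dimension playing two roles via aliased joins
SELECT  f.OrderNumber,
        ship.CalendarDate AS ShippingDate,   -- role 1: shipping
        del.CalendarDate  AS DeliveryDate    -- role 2: delivery
FROM    dbo.FactShipment AS f
JOIN    dbo.DimDate      AS ship ON ship.DateKey = f.ShippingDateKey
JOIN    dbo.DimDate      AS del  ON del.DateKey  = f.DeliveryDateKey;
```

In SSAS the same trick is done by adding the one date dimension to the cube twice, once per fact foreign key.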

Optimization

In this category of questions, I'd ask them to explain how they would optimize, say, a poorly performing dimension in SSAS.
What about query performance tuning – what would they look for, and how would they attack the issue?

Final Question

But, the one question I always looked forward to was:

Tell me about a Business issue you solved, which you are particularly proud of.

I grew fond of this question because it allows the interviewee to brag about how cool they were, doing this and that awesome thing to save the business a centillion dollars. It lets the person really shine, and at the same time you get a feeling for their level of engagement and technical flair, not their skill set as such. Don't you want to know how ingenious they can be at their best? This will give you a hint in that direction – or at least it did for me.

Tests

Back in the day, before the big blue container hauling company, I was also involved in the hiring process. I was in fact tested myself when I first began working there, and in my experience tests can have two immediate takeaways:

  1. It was all talk, and no walk.
  2. Does the person think and act like you expect them to?

1)

I've had a number of people skip the test after we had a good 1–1½ hour talk. Excuses flying around like butterflies in the summer:

“Oh, I forgot! It’s my Grandmother’s birthday, I have to leave urgently. Bye Bye!”

“My car is double parked… I need to move it immediately!”

“Is that the time?”

– Never to be heard from again.

For this kind of situation a test is very handy – apparently there is a fair amount of bul$hitters out there, and fortunately the test prevented us from hiring them.

2)

In my mind, a test should display how the person got to the result, rather than just showing a number. I want to see the T-SQL code, the MDX, M or whatever they used to solve the puzzle. This tells me way more than a result and time.

With the actual code in hand, you can go into a discussion (something a lot of coders dislike) about how they came up with what they submitted as the final answer. By discussing the code, you get a sense of their personality when dealing with their own code. Expect a lot of back-pedaling, as time, room temperature, humidity and other factors can have a high impact on the delicate nature of some coders 😉

Summary

There are a lot of ways to conduct an interview, and I bet my humble contribution to this tsql2sday is one of the more novice ones, but I hope I have shown some of the thought processes I've had when interviewing people for a position. I am very much looking forward to seeing the other contributions to this blog party – I can hardly wait.

Thanks Kendra for hosting!


T-SQL Tuesday #87 – Fixing Old Problems with Shiny New Toys

Matt Gordon (b|l|t) is hosting this month's TSQL2SDAY, which is on the topic of fixing an old issue/problem with shiny new toys. I am really happy about this topic, as it offers the opportunity to display not only the shortcomings of earlier versions, but also the progress made by the SQL Server team.

Story Line

My contribution to this blog party is about SQL Server 2016 SP1, which is the edition of SQL Server we implemented the solution on. Before I was part of the company, there had been numerous attempts to solve what was known as the Zero Sales Issue. Working with retailer/POS data, some customers are interested in identifying whether a product didn't sell on any given combination of date, product and store, and in such cases calculating some sort of lost potential for that store (on that specific date, for that specific product). For some of our customers this quickly becomes a super massive matrix, as we serve 3,500 stores on top of (for some) approx. 5,000 products. The calculations were to be conducted on a two-year running period (730 days). With this example we end up with 3,500 stores x 5,000 products x 730 days = 12.7B rows, just for this particular customer – and we have (at the moment of writing) 50+ customers.
Generating a table of this magnitude, along with filling in the gaps between the days that actually had sales, was previously too time-consuming a task to offer this kind of analysis in our portfolio. Enter SQL Server 2016.

With SQL Server 2016 we were able to generate the table and fill in the blanks that were needed in order to do the calculation (yay, window functions!). On top of the data we now offer not only one (1) but three (3) different calculations. Whenever a blank (a case of zero sales) is encountered, we calculate the average sales value of the same product in the same store over the last 7, 14 and 28 days. In addition to this, we add a filtering mechanism, so the client can focus on products that "normally" sell on all days in the selected period. Products that sell only on rare occasions do not exhibit the Zero Sales Issue, as the analysis is supposed to identify if and when a store fails to offer the product in question. Empty shelves for a top-selling product is, I think everyone can acknowledge, a serious issue.
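The shape of the solution can be sketched in T-SQL – not our actual production code, and every object name here is made up, but it shows the two moving parts: a cross join to materialize the full store x product x date matrix, and a window function to compute the trailing average that fills the zero-sales gaps:

```sql
-- Sketch: full matrix plus a 7-day trailing average for lost-potential gaps.
-- All table/column names are hypothetical.
WITH Matrix AS (
    SELECT s.StoreKey, p.ProductKey, d.DateKey
    FROM   dbo.DimStore   AS s
    CROSS JOIN dbo.DimProduct AS p
    CROSS JOIN dbo.DimDate    AS d          -- one row per day in the 730-day window
    WHERE  d.CalendarDate >= DATEADD(DAY, -730, CAST(GETDATE() AS date))
)
SELECT  m.StoreKey, m.ProductKey, m.DateKey,
        f.SalesValue,                        -- NULL marks a zero-sales day
        AVG(f.SalesValue) OVER (             -- average of the preceding 7 days
            PARTITION BY m.StoreKey, m.ProductKey
            ORDER BY m.DateKey
            ROWS BETWEEN 7 PRECEDING AND 1 PRECEDING
        ) AS LostPotential7Days
FROM       Matrix        AS m
LEFT JOIN  dbo.FactSales AS f
       ON  f.StoreKey   = m.StoreKey
      AND  f.ProductKey = m.ProductKey
      AND  f.DateKey    = m.DateKey;
```

The 14- and 28-day variants are just two more windowed averages with wider ROWS frames, which is exactly where columnstore indexes over the fact data pull their weight.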

Tech specs

The setup is sort of attached to our regular data import and is split out onto a separate server of its own. We are currently migrating from on-premises to a more cloud-based solution. Not sure when we will be fully in the cloud – in time, I guess.

The server itself is a pretty standard setup, currently running on a Standard DS13 v2 virtual machine in Azure (8 cores and 56 GB memory). On top of that we've added a couple of striped disks, in order to serve both data and tempdb operations better. Read more on how to stripe disks on an Azure VM here.
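The striping itself can be done with the Windows Storage Spaces cmdlets. This is only a sketch of the general approach – it assumes the data disks are already attached to the VM and still raw, and the pool/disk names are made up; follow the linked guide for the real procedure:

```powershell
# Pool all poolable (raw) data disks into one striped volume.
# "Simple" resiliency = striping, no redundancy - Azure handles durability.
$disks = Get-PhysicalDisk -CanPool $true

New-StoragePool -FriendlyName "DataPool" `
    -StorageSubSystemFriendlyName "Windows Storage*" `
    -PhysicalDisks $disks

New-VirtualDisk -StoragePoolFriendlyName "DataPool" `
    -FriendlyName "DataDisk" `
    -ResiliencySettingName Simple `
    -NumberOfColumns $disks.Count `
    -UseMaximumSize

# Initialize, partition and format (64 KB allocation unit for SQL Server data)
Get-VirtualDisk -FriendlyName "DataDisk" |
    Get-Disk |
    Initialize-Disk -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem NTFS -AllocationUnitSize 65536
```

The same recipe is repeated with a second pool for the tempdb disks.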

That about covers the "hardware" specs of this setup; the rest is comprised of some in-memory tables, columnstore indexes and a fairly straightforward SSIS package.

Conclusion

In previous attempts (years back) the time consumed by the calculations was way over the limit, hence the product/feature wasn't offered to the market. This is particularly troublesome if it's a client request. With columnstore indexes as the main contributor, we are now able to accommodate this type of calculation – although not on the fly – in a more reasonable time frame, which is all good and well when your data update is on a weekly basis.


T-SQL Tuesday #86 : SQL Server Bugs & Enhancement Requests : [SSMS] Intellisense for MDX

This month's T-SQL Tuesday is hosted by Brent Ozar (b|l|t), and the invitation is found by following this link.

T-SQL Tuesday was started by Adam Machanic (b|l|t) and this is the SQL Server community’s monthly blog party where everyone is invited to write about a common topic. Actually Adam just published a complete version of the rules of engagement here.
This time the topic is SQL Server Bugs & Enhancement Requests.

Introduction

I think this topic is a really good choice. I have previously worked on an idea to make Connect items visible at the SQLSaturdays I helped organize here in Denmark. I haven't come up with a really good solution yet, but I hope that next time we will have a way to make the attendees aware of how they can impact the evolution of SQL Server.

[SSMS] Intellisense for MDX

The Connect item I have chosen to write about is an old one, about getting IntelliSense for MDX in SQL Server Management Studio (SSMS). Despite the fact that it was created back in 2009 by Jamie Thomson (b|l|t), it is still active, and there was a public acknowledgement back then, by the Analysis Services team, that they would consider this request for an upcoming release. 2009, still active... True story.

Are You Done Considering Yet?

Meat n’ Potatoes

This Connect item is about a functionality many of you SQL Server database developers and administrators take for granted, namely code completion. Either you have had it through a third-party product (here, here or here) for years, or you have been blessed by the SQL Server team with the built-in functionality in SSMS, supported since – drum roll, please – 2008.

I can't stress enough how much productivity would go up by adding such a feature to Management Studio. If you don't believe me, try turning off your T-SQL code completion for just one (1) hour – I bet you'll be screaming by then. And with all I hear about MDX being hard to master, the lack of code completion really doesn't invite new people to join the party. So please vote here.

Yes, you may argue that MDX is slowly being replaced by DAX. But I predict (predictions seem oh so popular these days) that SSAS Multidimensional cubes and MDX will stick around for quite some time still. And you may have a point in that this might not even be up to the Analysis Services team anymore – since the break-up of SSMS and SQL Server releases, responsibility for this feature request was probably dropped between two chairs. That's why one could hope that the MSSQL Tiger Team was made aware of this request, because it really seems to have been forgotten. If not for this reply I got on Twitter:

Help make them become aware by voting here or shouting it out on Twitter or by any other means you have. I urge you to vote, as a fellow compassionate database developer, as a struggling BI Developer, as a project manager wanting to see his developers pick up pace. Even if you don’t care, please vote.

Vote

Please, if I could just get you to vote. Thank you!

Vote

Is it getting pathetic? 😉

Jamie Thomson, who initially created the request, doesn't seem to think it stands a chance. I hope we can prove him wrong – even if he seems to have moved on to other technologies and wouldn't really benefit from this massive upgrade of the development experience with SSAS and MDX.

 

 

Power BI – Ideas

On a side note, there is a similar forum for feature requests for Power BI, which I would like to advertise – in case you didn't know.

Please see this link.
