Speaking at the Power BI World Tour in Copenhagen

In just a little over a week, the Power BI World Tour will be stopping in Copenhagen, Denmark – more precisely in Lyngby, at the Microsoft HQ. Luckily, Copenhagen was selected to host the World Tour again this year, which I think will be very beneficial to local Power BI adoption and the community, to say the least.

Copenhagen

Last time around I was honoured to have one of my sessions selected. This year I get to have two sessions. I am super excited about that!

My first session will be on Tuesday the 11th, about Power BI Best Practices. From the trenches of some of our own projects, I have gathered a list of things to do in a project to make your life easier.

My other session will be an Introduction to Power BI and Power Query (M). The query language is one of the few things of late that has blown my mind in terms of capability and versatility. I will be showing you how to get started with the basics.

I am so looking forward to spending a couple of days with other Power BI Professionals, foreign and domestic!

#TSQL2SDAY #101 Round-Up : My Essential SQL Server Tools

On Tuesday the 3rd of this month I invited people in the SQL Server community to share which tools are essential to their daily work. I was really overwhelmed by the number of stories the topic triggered. 22 people in total took the time to write down and share which tools they use for their work chores.
Going through 22 posts and aggregating them has taken more time than I had hoped, since my trusty laptop broke down – blink codes are alive and well, I tell you!

Going through the lot, I found some similarities among the posts, and have categorized them accordingly. But first off, a BIG thank you to all who participated!

Without further ado, here goes.

Relational Heavy Lifting

Kamil Nowinski (b|l|t) takes us through the classic stuff, I mean, the real classic stuff – some would call it vintage – by showing how Total Commander still has a place in the tool belt, this century 😉

Matthew McGiffen (b|l) shows how to set up a Central Management Server, in order to execute queries against multiple instances in ad-hoc scenarios, seamlessly. Very nice tip. Matthew also did a second post, lining up multiple tools he’s written about in the past, nicely aggregated in this post.

Jess Pomfret (b|l|t) has written a really nice post on how PowerShell and dbatools have changed her way of working. Jess even provides some useful snippets to get you going. I share the same enthusiasm for PowerShell as Jess does, and was very pleased to see homage paid to dbatools – incredible tool. Best of luck on your speaking adventures!

Marek Masko (b|l|t) has a huge post on classic DBA tools as well as a pleasant surprise on testing using tSQLt. Also some good pointers to free community scripts and tools as well. Great read!

Tracy Boggiano (b|l|t) covers dbatools, a specific PowerShell command and a T-SQL stored procedure, as well as Telegraf and VS Code.

Dan Clemmens (b|l|t) goes all in on DBA tools for statistics, execution plans and tracing, even including the legendary diagnostic scripts from Glenn Berry.

Steve Jones (b|l|t) has a huge list of free and paid tools, from SQL Server-centric tools to a good deal of process-related tools – i.e. DevOps and such.
Steve also manages to sneak in a reminder about password managers, which, according to domain expert Troy Hunt, we should all rely on, be it pwsafe or any other tool like it.

Doug Purnell (b|l|t) is short and concise in his praise of Ola Hallengren's maintenance scripts and Jared Zagelbaum's extension of those in PowerShell.

Warren Estes (b|l) is praising the usual suspects in the DBA field, but adds a couple of interesting options for productivity and benchmarking/testing and also rounds up a couple of SentryOne products.

Devon Leann Ramirez (b|l|t) offers a thorough introduction to their free Plan Explorer offering. Devon also makes a good point in noting the company's presence in the community. If you want the quick tour, head over to Vimeo.

Rob Farley (b|l|t) talks about two things I really hold dear; Coffee… and I forgot the other thing. No really, Rob has an excellent blog post on Live Query Stats (LQS), and what some of the use cases are for that feature/tool. There are more ways of using LQS than I had thought about – thanks for sharing!

Riley Major (b|l|t) shares his story on how he works with Management Studio, and how the tool could be improved to further support a common way of working. Besides the tips on SSMS, Riley also lists his favorite (and not so favorite) tools.

The BI Power in Power BI

James McGillivray (b|l|t) writes first and foremost about my trusted travel mate, the Kindle (app), as his favored tool of the trade. Besides the treasure trove that books can be, James also has some pointers to DAX Formatter and a theme generator, which is pretty hefty!

Community Zone

Jo Douglas (b|l|t) argues that the most important tool for any professional is networking and community, and it's hard not to agree completely. Jo also offers some great pointers on where to begin this journey.

Jason Brimhall (b|l|t) brings up the aspect of blogging itself as a great tool of the trade, and I have to agree here – I couldn't have stated it more clearly than Jason:

Blogging helps you become a better technical person.

Google-fu is also described in Jason's blog post for this party, and he manages to sneak in a reference to his extensive work and blogging on Extended Events, which in itself is an awesome contribution.

Reid DeWolfe (b|l|t) offers a quick write up on classic DBA must haves; SQL Prompt, Plan Explorer and GitHub/SourceTree. Reid also describes some of the main benefits of the tools.

Other

Garland MacNeil (b|t) brings another perspective into the party, writing from a borrowed laptop – not sure it was intentional, but I guess the exercise is very rich in terms of learning. I know others have been there too:

https://twitter.com/_AlexYates_/status/986137414723866625

Chrissy LeMaire (b|l|t) has, surprisingly enough, not written about dbatools – and if you believe that, you may call me Bill 🙂
In Chrissy's blog post you'll find a great list of auxiliary tools for all the things you do around programming: screenshot/image handling, code repositories, clipboard management and video editing tools.

Josh (b) gives us the DevOps perspective of a DBA/Database Developer in a not-so-uncommon scenario – well, I think we've all been there at some point. Some prominent 3rd party tooling is getting some ❤

The Other Tools

Catherine Wilhelmsen (b|l|t) offers a completely different and refreshing view on tools that were completely new to me, at least. Going from database modeling to data generators to time keeping tools and beyond.

Finally, No Tools

Hugo Kornelis (b|l|t) makes a good argument on not becoming addicted/dependent on the presence of certain tools in order to perform your job. I guess this applies in particular, when you’re a consultant and can’t always BYOD. Apart from that Hugo really likes SQL Prompt and Plan Explorer 😉


The Tools Mentioned (in no particular order)

dbatools, PowerShell, dbareports, SQL Server Management Studio, Redgate SQL Compare, Minionware, SentryOne Plan Explorer, dbachecks, SQL Operations Studio, SQL Database Modeler, Dynamic Restore Script, Scooter Software Beyond Compare, Redgate DLM Dashboard, Ola Hallengren's maintenance scripts, Trello, SQL Server Data Tools, Password Safe, Sublime Text, Notepad++, Redgate SQL Prompt, Mockaroo, Dropbox, Visual Studio Code, SQLCover, Sourcetree, SQLNexus, Coblis, tSQLt, Advanced Theme Generator (Power BI), DAX Formatter, RStudio, ScaleSQL ClearTrace, PSSDiag, Devart's dbForge, Toggl, Grammarly, SCD Merge Wizard, Statistics Parser, Adam Machanic's sp_WhoIsActive, WinMerge, Mythicsoft Agent Ransack, Redgate SQL Search, Glenn Berry's Diagnostic Scripts


TSQL2SDAY #101 Invitation: My Essential SQL Server Tools

The Why

If you’re not familiar, T-SQL Tuesday is a blogging party hosted by a different person each month. It’s a creation of Adam Machanic (b|l|t), and it’s been going on for ages now! Basically the host selects a topic, defines the rules (those are almost always the same), and then everyone else blogs about said topic. Once the deadline is reached, the host summarizes each of the submitted posts on their site/blog.

T-SQL Tuesday #101

This is the second time I host a T-SQL Tuesday, and hopefully not the last.

The What

This month I get to pick the topic, and I am going to go with:

The Essential SQL Server Tools in my stack

Besides SQL Server Management Studio and Visual Studio Data Tools, we all have our own set of tools that we use for everyday chores and tasks. But how do we get to know which tools are out there, if not from other professionals telling us about them? Does it have to be a fully fledged product, with certification and all? Certainly not! If there's some GitHub project out there that is helping you be twice as productive, let us know about it. You can even boast about something you've built yourself – if you think others will benefit from using it.

Basically, I think that by establishing awareness of what kinds of tools are out there, new professionals will not face as steep a curve getting up to speed as they otherwise would have. But I suspect that even some veterans could have an "a-ha" moment from reading the summary.

Additionally, you can (read: should) share how you came to depend on said tool – and of course you are encouraged to give credit where credit is due, in terms of who made you aware of the tool.

Another angle on this topic is to treat it as a "A Day in the Life of" kind of blog post, as has been done before by Erin Stellato (b|l|t) – writing with the specific aim of describing how your everyday work is made easier by the use of your tool stack.

The How

There are only a few rules for T-SQL Tuesday:

  • Your post must be published on Tuesday April 10th 2018 between 00:00 GMT and 23:59 GMT.
  • Your post must contain the T-SQL Tuesday logo (see above) at the top and the image must link back to this blog post.
  • Trackbacks should work, but if they don’t, please put a link to your post in the comments section so I (and everyone else) can see your contribution!
  • Tweet about your post using the #tsql2sday hashtag
  • Include “T-SQL Tuesday #101” in your blog post’s title.
  • Optionally add @vestergaardj to your tweet, to make it harder for me to miss 😉

If you want to host a topic of your own:

  • Contact Adam Machanic (b|l|t) and tell him you’d like to host a T-SQL Tuesday from your blog.

Azure Saturday at Microsoft in Kgs. Lyngby

A Saturday in Kgs. Lyngby, in the company of 35 speakers, each bringing massive knowledge in their own field, for only a "plovmand" (a 500 DKK note). It sounds almost too good to be true – but it isn't, and in fact things aren't only happening on the Saturday.

There will be four (4) different workshops, at an additional cost, on August 30th and 31st, and the conference itself is on September 1st and 2nd. Confused? Interested?
Then see more at http://azuresaturday.dk

Here is a small selection of the sessions to choose from:


T-SQL Tuesday #87 – Fixing Old Problems with Shiny New Toys

Matt Gordon (b|l|t) is hosting this month's TSQL2SDAY, which is on the topic of fixing an old issue/problem with shiny new toys. I am really happy about this topic, as it offers the opportunity to display not only the shortcomings of earlier versions, but also the progress made by the SQL Server team.

Story Line

My contribution to this blog party is going to be about SQL Server 2016 SP1, which is the version of SQL Server we implemented the solution on. Before I was part of the company, there had been numerous attempts to solve what was known as the Zero Sales Issue. Working with retailer/POS data, some customers are interested in identifying whether a product didn't sell on any given combination of date, product and store, and in such a case calculating some sort of lost potential for that store (on that specific date, for that specific product). For some of our customers this quickly becomes a super massive matrix, as we serve 3,500 stores on top of (for some) approx. 5,000 products. The calculations were to be conducted on a two-year running period (730 days). With this example we end up with a combination of 3,500 stores × 5,000 products × 730 days ≈ 12.8B rows, just for this particular customer; we have potentially (at the moment of writing) 50+ customers.
Generating a table of this magnitude, along with filling in the gaps between the days that actually had sales, was previously too time-consuming a task to offer this kind of analysis in our portfolio. Enter SQL Server 2016.
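To give a sense of the shape of the problem, here is a minimal T-SQL sketch of how such a dense matrix can be generated. The table and column names (DimDate, DimStore, DimProduct, FactSales, SalesValue) are hypothetical, not our actual schema:

```sql
-- Build the full date × store × product matrix for the trailing 730 days,
-- then left join the actual sales so missing combinations surface as NULL.
SELECT  d.[Date],
        s.StoreId,
        p.ProductId,
        f.SalesValue                         -- NULL here marks a zero-sales day
FROM        dbo.DimDate    AS d
CROSS JOIN  dbo.DimStore   AS s
CROSS JOIN  dbo.DimProduct AS p
LEFT JOIN   dbo.FactSales  AS f
        ON  f.[Date]     = d.[Date]
        AND f.StoreId    = s.StoreId
        AND f.ProductId  = p.ProductId
WHERE   d.[Date] >= DATEADD(DAY, -730, CAST(GETDATE() AS date));
```

The CROSS JOINs are what blow the row count up to the billions – which is exactly why a columnstore-backed engine makes the difference here.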

With SQL Server 2016 we were able to generate the table and fill in the blanks that were needed in order to do the calculation (yay, window functions!). On top of the data, we now offer not only one (1) but three (3) different calculations. Whenever a blank (a case of zero sales) is encountered, we calculate the average sales value of the same product in the same store over the last 7, 14 and 28 days. In addition to this, we also add a filtering mechanism, so the client can focus on products that "normally" sell on all days in the selected period. Products that only sell on rare occasions do not display the Zero Sales Issue, as the analysis is supposed to identify if and when a store fails to offer the product in question. Empty shelves for a top-selling product is, I think everyone can acknowledge, a serious issue.
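As a rough illustration of the window-function part, the sketch below computes the 7-, 14- and 28-day averages against a hypothetical SalesMatrix table holding the dense matrix (again, names are assumptions, not our actual schema). Because the matrix has exactly one row per store/product/day, a ROWS frame over the previous N rows is equivalent to the previous N days, and AVG simply ignores the NULLs that mark zero-sales days:

```sql
SELECT  m.[Date], m.StoreId, m.ProductId, m.SalesValue,
        -- Average sales of the same product in the same store over the
        -- preceding 7, 14 and 28 days (excluding the current day).
        AVG(m.SalesValue) OVER (PARTITION BY m.StoreId, m.ProductId
                                ORDER BY m.[Date]
                                ROWS BETWEEN 7 PRECEDING AND 1 PRECEDING)  AS Avg7Days,
        AVG(m.SalesValue) OVER (PARTITION BY m.StoreId, m.ProductId
                                ORDER BY m.[Date]
                                ROWS BETWEEN 14 PRECEDING AND 1 PRECEDING) AS Avg14Days,
        AVG(m.SalesValue) OVER (PARTITION BY m.StoreId, m.ProductId
                                ORDER BY m.[Date]
                                ROWS BETWEEN 28 PRECEDING AND 1 PRECEDING) AS Avg28Days
FROM    dbo.SalesMatrix AS m;
```

The lost-potential figures can then be derived from these averages on the rows where SalesValue IS NULL.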

Tech specs

The setup is sort of attached to our regular data import and is split out on a separate server of its own. We are currently migrating from on-premises to a more cloud based solution. Not sure when we will be fully in the cloud, in time I guess.

The server itself is a pretty standard setup, currently running on a Standard DS13 v2 virtual machine in Azure (8 cores and 56 GB memory). On top of that we've added a couple of striped disks in order to serve both data and tempdb operations better. Read more on how to stripe disks on an Azure VM here.

This about covers the "hardware" specs of this setup; the rest is comprised of some in-memory tables, columnstore indexes, as well as a fairly straightforward SSIS package.

Conclusion

In previous attempts (years back), the time consumed by the calculations was way over the limit, hence the product/feature wasn't offered to the market. This is particularly troublesome if it's a client request. With columnstore indexes as the main contributor, we are now able to accommodate this type of calculation – although not on the fly – in a much more reasonable time frame, which is all well and good when your data update is on a weekly basis.