T-SQL Tuesday #148 – Advice on Running a User Group

Invitation from Rie Merritt.

For this edition of T-SQL Tuesday, I’d like to ask everyone to write about the various aspects of running a user group. I’m not asking you to write a James Michener novel that starts at the beginning of time and includes everything a user group leader needs to know. Instead, I’m asking you to pick one or two things and go deep on what works for you, what didn’t work for you, and lessons learned. Running a user group, be it in-person, virtual, or hybrid, is a large umbrella of tasks and duties. My goal this month is to break that up into small, bite-sized pieces of knowledge. There are so many directions you can go here: finding speakers, growing your membership, using technology to make things easier, finding sponsors, finding a venue, picking a pattern for when you meet, etc. The possibilities are almost endless!

The Azure Data Community is building a “How to Run a User Group” wiki, and we’d love to link to your post to offer new and old user group leaders some great resources for building and maintaining a successful, healthy community.

Special thanks to Steve Jones for running T-SQL Tuesday and finding a spot for me, to Kenneth Fisher for being flexible with when he sponsors T-SQL Tuesday, and finally to John Morehouse, Annette Allen, and Josh Smith for all of the sweat equity they’ve already put into building the wiki.

T-SQL Tuesday #147 Invitation – Upgrade Strategies

Invitation and wrap-up from Steve Jones

Planning for Upgrades

In my career, we haven’t upgraded production databases very often. In most of my jobs, we’d use the new version for new databases, but existing ones often lived on their original version. That’s how I ended up in a job where I was managing four different versions of SQL Server. These days I expect it’s common for many DBAs to deal with that many versions, or more.

I do have customers these days that try to upgrade often and limit the number of versions they work with. I have customers now that run only a mix of 2016-2019, some that work only with 2014-2016, and I’ve run into a customer that has only SQL Server 2017. Of course, they have few databases and look to upgrade about every five years, when mainstream support is running out for their version.

This month I want you to write about how you look at SQL Server upgrades. A few things you might think about:

  • Why do we wait to upgrade?
  • Strategies for testing an upgrade
  • Smoke tests or other ways to verify the upgrade worked
  • Moving to the cloud to avoid upgrades
  • Using compatibility levels to upgrade an instance but not a database (see the sketch after this list)
  • Checklists of things to use in planning
  • The time it takes to upgrade your environment
  • What you evaluate when deciding whether or not to upgrade
  • Anything else
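
On the compatibility-level point above, here is a minimal sketch, assuming a hypothetical database named AdventureWorks, of how the instance can move to a new version while a database keeps its old behavior until testing is done:

-- Check the compatibility level of every database on the instance
SELECT name, compatibility_level
FROM sys.databases;

-- Pin the database to SQL Server 2016 behavior (level 130), even
-- though the instance itself now runs a newer version
ALTER DATABASE AdventureWorks SET COMPATIBILITY_LEVEL = 130;

-- Once testing passes, raise it to the instance's native level
-- (150 = SQL Server 2019)
ALTER DATABASE AdventureWorks SET COMPATIBILITY_LEVEL = 150;

This separates the instance upgrade from the query-behavior change, which is often the riskier half.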

I don’t know when SQL Server 2022 will be released, but many of us will certainly need to consider in 2023 whether or not to upgrade our systems. Think about it and write about something that matters to you.

T-SQL Tuesday #146: Upending Preconceived Notions

Invitation and round-up from Andy Yun.

Welcome back to another edition of T-SQL Tuesday! I’m honored to be your host once again!

Theme to Kick off 2022

This month, I’d like to ask everyone to think about something you’ve learned that subsequently changed your opinion, viewpoint, or understanding of something. Maybe you’ve held a certain opinion, belief, or bias. Perhaps you’ve always thought something worked a certain way. Or you’ve always thought that a certain something (called “X”) was only good for “A”, only to later learn that it can help with “B” and “C” as well. Regardless, you learned something that totally upended and changed a preconceived notion you previously held.

When has this happened to you, Andy?

Let me share an example. In my past as a T-SQL developer, I remember when I first learned about CTEs. I thought the world of them and started using them everywhere! However, there was one slight problem. I was under the mistaken impression that they pre-materialize each subquery. And they do… in OTHER RDBMSs. Whoops! After a few years, I learned that they don’t behave that way in SQL Server. Instead, the query optimizer inlines the query in the CTE, making them functionally no different than subqueries. And well, let’s just say that made me regret some of the coding decisions I’d made during my “CTEs-are-awesome” phase.
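
To make the inlining behavior concrete, here is a minimal sketch, assuming a hypothetical dbo.Orders table. Because SQL Server expands the CTE at every reference, the query below aggregates dbo.Orders twice, exactly as if the CTE had been written out as two derived tables:

-- The CTE is referenced twice; SQL Server inlines it at each
-- reference, so dbo.Orders is scanned and aggregated twice rather
-- than the CTE result being materialized once and reused
WITH TotalsPerCustomer AS
(
    SELECT CustomerID, SUM(OrderTotal) AS Total
    FROM dbo.Orders
    GROUP BY CustomerID
)
SELECT tc.CustomerID, tc.Total
FROM TotalsPerCustomer AS tc
WHERE tc.Total > (SELECT AVG(Total) FROM TotalsPerCustomer);

In an engine that pre-materializes CTEs, the aggregation would run once and be reused; in SQL Server, that reuse requires something like a temp table.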

T-SQL Tuesday #144 – Data Governance reimagination

Invitation from Dr. Victoria Holt.

Data governance is a topic that has raised its head again in the last year, with Azure Purview reaching general availability. Data governance is not a new topic in the realms of data management. I think over the last few years data governance has focused on meeting data protection law, government legislation, and formalised controls and standards. Then, with data sovereignty issues being everyday considerations in the global marketplace, storage locations in the cloud, and the likes of the General Data Protection Regulation, data governance has been very focused on meeting legislative requirements.

There has been a substantial cost involved in setting up data governance within an organisation to avoid the heavy fines that follow a data breach. I think because of this, many organisations compartmentalise personal data systems.

A change in perspective, a reimagination, of data governance is occurring. Data governance is really about ‘data erudition’: showing an interest in learning about the data we have, improving its quality, and creating a more productive and trusted data asset. Starting small and making incremental changes that match the business need is a way to gain targeted business value. Azure Purview provides a great opportunity for us to start at the beginning and create a data catalog, data inventory, data dictionary, and much more in an automated way. Trusting the data quality, and knowing your risks and what data you have, sets your business up for success.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your experiences on data governance:

  • The current cost of data governance versus its benefits
  • The amazing things data governance has enabled you to achieve or will enable you to achieve in the future
  • The potential uses for Azure Purview within your estates and the automated deployment options for that

T-SQL Tuesday #143 – Short code examples

Invitation and wrap-up from John McCormack.

T-SQL Tuesday this month is going back to basics, and it’s all about code. I’d like to know, “What are your go-to handy short scripts?”

What are those little bits of code that you can’t live without? I’m talking about short snippets, only a few lines, that you may even have memorised. It could be T-SQL, PowerShell, Python, or anything else you use day to day.

For example, I manage a lot of SQL Agent jobs. Quite often, I need to find out which job has a certain T-SQL string in its command, so I’ll run:

-- Find SQL Agent jobs whose step commands contain a given string
SELECT *
FROM msdb..sysjobs AS sj
JOIN msdb..sysjobsteps AS sjs
    ON sj.job_id = sjs.job_id
WHERE sjs.command LIKE 'backup log%'

Of course, there are many other ways to find this out, including DBATools commands, but sometimes I just work from memory for convenience.

Another one I like gets the estimated completion percentage of a backup or restore. There are better scripts than this, but sometimes nothing beats getting a quick estimate back from a couple of lines of memorised T-SQL.

-- Show running requests with their completion estimate first;
-- percent_complete is populated for backups, restores, and similar
SELECT r.percent_complete AS pc, r.*
FROM sys.dm_exec_requests AS r
ORDER BY pc DESC

My invitation to you for this month’s #tsql2sday is…

I would like you to share with the community what your go-to script snippets are and why you find them useful. By sharing these, you will undoubtedly be helping someone who hasn’t thought of doing it that way, and hopefully you’ll pick up some handy hints as well.

  • Any language is fine, not just T-SQL
  • Please share as many as you wish
  • Perhaps you never do this and always work off saved scripts or convert your snippets to stored procedures? Tell us why this works for you.

T-SQL Tuesday #142: Using descriptive techniques to build database environments

Invitation from Frank Geisler.

In the good old days, you typically had to deal with one, two, or maybe three SQL Servers. As you all know, those times are over. With the rise of the cloud, every one of us must deal with more and more systems, not only infrastructure but also Platform as a Service (PaaS) offerings. The systems themselves are getting more complex through all the new services and technologies that are involved and somehow interconnected. New movements like Azure Arc-enabled Data Services bring a whole new aspect to the table, letting you easily choose whether to run your data workload on your on-premises Kubernetes cluster or in the cloud.

All these systems can easily be built in the Azure Portal, but that is not sustainable. Each time you use the portal, you must remember how to build a certain system and, more importantly, how to apply best practices. You can, for example, build an Azure SQL Database with an endpoint open to the internet and secure it with firewall settings, but this should be done with caution because you are exposing your database to the internet. A better approach is an Azure SQL Database with no public endpoint, only a private endpoint into an Azure VNet that hosts the systems that must access the database, or that is connected to a local network via a VPN gateway. As you can imagine, there are a lot of moving parts in getting such an environment up and running, and you must remember (or document) each of them. This is very cumbersome work. There must be a better solution, and for sure there is one: scripting.

When you write a script, you do the work once, and whenever the same or a similar situation comes up, e.g. deploying an Azure SQL Database according to best practices, you can just pull out your script and there you go. You can take this to another level by parameterizing the script: just put in the parameters and let the script do the rest. Using this as a mantra, I developed several PowerShell scripts to build different cloud environments. This has the big advantage that the environment is documented, because you have a script, and versioned as well, because all our scripts are saved in a source control system. The overall approach is called Infrastructure as Code (https://en.wikipedia.org/wiki/Infrastructure_as_code).

But imperative scripting in PowerShell also has its shortcomings. The cloud, and the internet in general, is a very uncertain environment. Many things can happen while the script that deploys your environment is running: your internet connection can break down, your deployment can fail for whatever reason, and so on. So you have to build many conditions into your script: if the resource group exists, skip it and just build the Azure SQL Database; if the VNet exists, skip it and check that all needed subnets exist; and so on. Right? Wrong! Besides imperatively telling the Azure Resource Manager what to do, you can also use a declarative approach to build resources in Azure.

This declarative approach is familiar to everyone who has ever written T-SQL code. When you write a query that selects data, you don’t instruct the database system how to retrieve the data from the underlying file structure. You only tell the system what the data you are looking for should look like: select all the rows of data where the first name is “Frank”. This is exactly the approach that ARM templates, Bicep templates, or, in the Kubernetes world, YAML manifests take. The scripts are a description of what the target environment should look like. How that target environment is reached is fully up to the underlying system, such as the Azure Resource Manager. And there is even more: if you change an existing environment, only the parts that changed in the script will be altered in the target environment. Say you have an Azure SQL Database of a certain size and you change the size in your Bicep script. The next time you deploy the script, the Azure SQL Database will be resized without being deleted and redeployed.
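
To spell out that analogy in actual T-SQL (dbo.Person and its columns are hypothetical):

-- Declarative: describe the rows you want, not how to fetch them;
-- the optimizer picks the access path, just as the Azure Resource
-- Manager works out how to reach a described target state
SELECT *
FROM dbo.Person
WHERE FirstName = 'Frank';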

The ideal process for working with Infrastructure as Code is that the code is checked in to Azure DevOps and an automated process then deploys the changes to your target environment. To change your environment or add resources, you only have to write the needed changes into your Bicep scripts, check them in, and let Azure do the magic.

My invitation to you for this month’s #tsql2sday is:

This is my invitation to you this T-SQL Tuesday: think about deploying SQL components through descriptive methods and, of course, blog about it. It does not matter whether you are using Azure and ARM templates, Bicep, or Kubernetes and YAML. Just write about it and build some cool new templates that implement some of your infrastructure and environment best practices. Or write an article about where you have already used descriptive scripts to build environments.

As always, there is a whole lot of material on the internet you can use as a starting point. I have summarized a little of it here:

T-SQL Tuesday #141: Work/Life Balance?

Invitation and wrap up from TJay Belt.

Having gone through this last year of strange upheaval around the world, I’ve been thinking about this topic quite a bit. My company has a mentor program where one can sign up and discuss this topic. I signed up. I’ve had several ‘sessions’ with my mentor. We’ve had some productive conversations about what this all means to me, my career, my skillset, and so on.

As I transitioned to working at home, some 500 days ago, it was a shift in perspective, timing, and many other things. Trying to keep those scales balanced has been a focus, and a struggle, from time to time. It’s a topic I keep coming back to, especially recently but really throughout my career: trying to find that balance.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your own personal journey with work/life balance, either through experience or hopeful future goals…

  • What are the cool things you have learned?
  • What things did you try that simply didn’t work out?
  • What rules have you implemented that made it easier to balance?
  • What rules made it harder to find that balance?
  • What tips or tricks can you share to help others?

T-SQL Tuesday #140: What have you been up to with containers?

Invitation and wrap-up from Anthony Nocentino.

In recent years, containers have come into the data platform world, exposing new technologies to data professionals. Microsoft put SQL Server on Linux, and shortly after that, SQL Server made its way into containers. SQL Server in containers has become the foundation for things like Big Data Clusters and Azure Arc-enabled Data Services.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your experiences using containers and not just SQL Server in containers…

  • What are the cool things you’ve done with containers in your environment, test lab, or even presentation demos?
  • Are you using containers in production? If so, what are the tips or tricks you can share to help others?

If you haven’t tried containers yet, here’s a video showing you how to do the following:

  • Deploy a SQL Server in just a few minutes!
  • Connect to your container-based SQL Server.
  • Upgrade a container-based SQL Server to a new version.

So, if you haven’t used containers before, go ahead and try out the demos from this video, which you can get here, and write about your experience!

T-SQL Tuesday #139: The data world is hybrid

Invitation from Ben Weissman.

Our data estates are not just on premises anymore. And they haven’t all moved to the cloud either.

New offerings for a hybrid world, like Azure Arc-enabled Data Services, pop up but are far from widely used in production environments.

It has become pretty evident to me that we’ll be living in a hybrid world for a long time, potentially indefinitely. But what does that mean for our data strategy?

My Invitation

On this month’s #tsql2sday, I would therefore like to invite you all to share your hybrid and edge experiences!

Where are you on your journey?

Which challenges did you face?

How did you address them?

T-SQL Tuesday #138: Managing Technology Changes

Invitation and wrap-up from Andy Leonard.

One point I make (repeatedly) in my latest book – titled Building Custom Tasks for SQL Server Integration Services – is “software changes.” In fact, software changed on me between the completion of editing and the release of the book! I wrote about the changes in a post titled Building Custom Tasks for SSIS Second Edition Errata, Chapters 1-9. There’s a live stream video at the bottom of that post, as well.

Changing software inspired this month’s T-SQL Tuesday topic:

“How Do You Respond When Technology Changes Under You?”