Automating SQL Service Packs and Cumulative Updates

Applying Service Packs and Cumulative Updates is dull, mundane work. It’s hard to keep on top of, and with some 24/7 databases it’s hard to find an update window. Sometimes it’s hard even to get the business to agree to patching when they won’t see any visible end result. Sorry, this blog will not provide any assistance on improving your nagging and fear-mongering skills.

For critical and security patches (as well as Windows updates), most places I work use tools like SCCM to take care of the checking and installing, and I’m more than happy with that. If I had to take care of those too, I’d never do anything apart from an endless cycle of patching.

I use Kendra Little’s approach of grouping my database servers into priority groups so that the first batch of servers to get updates are the ones I least care about (my old Dev boxes, some little-used UAT boxes), then my slightly more important UAT boxes, followed by pre-production and maybe low-priority production boxes, before I finally do my high-priority boxes.

For Service Packs and Cumulative Updates I use the same approach, and until recently these were done manually. After finally getting into PowerShell, I wanted to write my own script to check and update servers. Other, more skilled people have done a better job than me (believe me, I’ve read their blogs), but sometimes I find the best way to learn something is to build it yourself from scratch. By all means use building blocks from other sources (of course I have), but it’s always good to know something inside out. It’s one thing running a program and knowing how to use it, but another thing entirely when something goes wrong and you have to support it. If you’ve written it yourself then you’re going to know its innermost workings. For a while, anyway. The idea of starting a blog again is that it’s a great way to document all the new stuff I’m constantly learning, and to keep that information in an easy-to-reach place that isn’t just my own head.

My first port of call was to jot down what I wanted my script to achieve: check a server for SPs and CUs for all versions of SQL Server, check whether a reboot was required, have the option to install the SPs/CUs or not, and optionally reboot the server. The second port of call was a Google search to look at other people’s automation efforts and use those as a starting point. Unanimously, all roads seem to lead back to Adam Bertram and his scripts that can update multiple SQL Servers. In fact, I could have just taken his script and used it as-is, but that wouldn’t have taught me any PowerShell.
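For what it’s worth, those goals translate into a parameter surface roughly like this (a sketch only; the function and parameter names here are my own illustrations, not necessarily what ended up in the final script):

```powershell
# Hypothetical sketch of the goals above: check a server, optionally
# install the latest SP/CU, optionally reboot. All names are illustrative.
function Update-SqlServerPatchLevel {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [string]$ComputerName,     # server to check

        [string]$Source,           # root folder holding the SPs/CUs and CSV

        [switch]$InstallUpdates,   # actually run the installers

        [switch]$Reboot            # reboot afterwards if one is pending
    )
    # 1. detect the installed SQL Server version and build
    # 2. compare against the latest builds listed in the CSV
    # 3. if -InstallUpdates, copy over and run the relevant SP/CU
    # 4. if -Reboot and a reboot is pending, restart the server
}
```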

The things I definitely did use from Adam’s script and procedure were the file structure (ultimately an SQL folder, a year folder for each version under it, and then an ‘Updates’ folder containing your SPs and CUs), the naming conventions and the CSV file (please see my screenshots at the bottom). Ironically, renaming all the SPs and CUs was itself boring, mundane work, but ultimately it has been worth it. After completing this, I was able to start putting my script together.
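For illustration, the layout looks something like this (the file and CSV names here are made up; see the screenshots at the bottom for the real conventions):

```
SQL\
├── PatchVersions.csv        (latest SP/CU build per version)
├── 2012\
│   └── Updates\
│       ├── SQL2012SP4.exe
│       └── ...
└── 2016\
    └── Updates\
        ├── SQL2016SP2.exe
        └── ...
```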


Using Adam’s approach to check the current version (although my version is a lot smaller and grabs only the information my function needs to work), then checking against the latest versions listed in the CSV file, I could reliably find the correct SP and CU every time and use that to locate the files in my source folders.
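As a sketch of the kind of check I mean (assuming the SqlServer module’s Invoke-Sqlcmd is available; the CSV column names MajorVersion and LatestBuild are my own invention, not Adam’s exact format):

```powershell
# Sketch only: read the current build with SERVERPROPERTY, then look up
# the newest build for that major version in the CSV.
$current = Invoke-Sqlcmd -ServerInstance $ComputerName -Query "
    SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar(32)) AS Build,
           CAST(SERVERPROPERTY('ProductLevel')   AS varchar(32)) AS Level"

$major  = ([version]$current.Build).Major
$latest = Import-Csv (Join-Path $Source 'PatchVersions.csv') |
          Where-Object { [int]$_.MajorVersion -eq $major }

if ([version]$current.Build -lt [version]$latest.LatestBuild) {
    "Patch available for $ComputerName : $($current.Build) -> $($latest.LatestBuild)"
}
```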

After struggling to get consistent, correct responses from my own Test-RebootPending function, I decided to import Brian Wilhite’s PendingReboot module instead. For copying the files to the servers and installing the patches, I wrote my own functions using the PSSession cmdlets. My PowerShell knowledge is still in its infancy, so I couldn’t fathom Adam’s functions particularly well. They were obviously more robust than mine, but if I don’t know how something in a script works then I’m not comfortable running it, especially on highly available database servers. And as I write and learn more, I can go back and improve my own scripts.
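The rough shape of those copy-and-install steps is something like this (assuming PowerShell 5+ for Copy-Item -ToSession, WinRM enabled on the target, and $PatchFile already resolved from the source folders; the /Action=Patch switches are the documented ones for SQL Server SP/CU installers):

```powershell
# Copy the patch to the target over a PSSession, then run it silently.
$session = New-PSSession -ComputerName $ComputerName

Copy-Item -Path $PatchFile -Destination 'C:\Temp' -ToSession $session

Invoke-Command -Session $session -ScriptBlock {
    $exe = Join-Path 'C:\Temp' (Split-Path -Leaf $using:PatchFile)
    # Quiet patch of every instance on the box
    Start-Process -FilePath $exe -Wait -ArgumentList `
        '/quiet', '/IAcceptSQLServerLicenseTerms', '/Action=Patch', '/AllInstances'
}

Remove-PSSession $session
```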

Anyway, I digress slightly. I’d already written a SQL Server install script using PowerShell and config files, so I knew a little about remote connections as well as copying and running .exes, and I was able to use that experience to build my own functions.

The last thing I wanted to do was add a -test/whatif parameter, because sometimes it’s just handy to find out version and patching information in a hurry without actually running the installers. I can always use that information in a CMS so I know when patching has gone out of date.
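One way to get that behaviour almost for free is PowerShell’s SupportsShouldProcess, which adds a -WhatIf switch automatically; the reporting still runs and only the install step is skipped. Again, just a sketch with illustrative names:

```powershell
function Install-SqlPatch {
    [CmdletBinding(SupportsShouldProcess)]
    param (
        [string]$ComputerName,
        [string]$PatchFile
    )

    # The version/patching report always runs, -WhatIf or not
    Write-Output "Checking patch level on $ComputerName..."

    # ShouldProcess returns $false under -WhatIf, so nothing gets installed
    if ($PSCmdlet.ShouldProcess($ComputerName, "Install $PatchFile")) {
        # copy the file and run the installer here
    }
}
```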

There are definitely more improvements to make (unlike Adam’s script, which updates multiple servers, mine only does one at a time for now, and you can’t choose which SP or CU to install; it’s always the latest and greatest at the moment), and I’ll be making improvements as I learn more. So far this process and script work, and they have saved me loads of time and effort on the boring, mundane stuff, which I can now spend working out what further boring, mundane stuff I can remove from my job.

My script is here on GitHub (it includes a synopsis and examples), along with a copy of my CSV file (identical to Adam’s; it covers SQL Server 2005 – 2016 as of 19/10/2018) and a screenshot of my file naming conventions and folder structure.

I hope it helps someone write their own automation script. Please let me know if you can spot any improvements that can be made to my own too.

If you plan to use or modify this script then you already know not to just run it straight in production, right? Also, you’ll need to put your own folder location into the $Source parameter (I have this set as a default so I don’t have to enter it every time I run the script); it’s currently blank in the version on GitHub. The script will definitely not run if you don’t follow the folder tree, CSV format and naming conventions precisely.

Oh, and don’t forget to periodically update your CSV file (you could even put it in a table in a CMS somewhere) and to download and rename the new files. Sorry, it’s not quite that automated!