In Los Angeles? Come See Me Speak!
I’ll be presenting at the Los Angeles SQL Server Professionals Group on January 21st. For complete details, check their website.
I just put together and delivered my first consolidation and virtualization class for SQL Server based on Windows Server 2008. One of the technology labs is building a standalone SysPrep’d server that has both Windows Server 2008 R2 and SQL Server 2008 R2 (currently still in pre-release form).
A bit of history: up until now, there really was no way to have a SysPrep’d Windows system with SQL Server on it. In fact, it wasn’t supported for that combo, and there were numerous issues if you tried. All of this is pretty well documented. With the release of SQL Server 2008 R2, there is finally a supported way to SysPrep, so you can essentially partially install SQL Server up front and then finish the installation later.
This is different from the advanced cluster preparation install, which is for failover clustering and was introduced with SQL Server 2008. In fact, you cannot SysPrep a cluster at all, so if that’s what you want to do, sorry. Standalone servers only. SysPrepping a cluster would need support at both the Windows and SQL levels, and it’s there for neither. Once Windows supports SysPrepping a base cluster, SQL Server can come up with a way to support it. I don’t see either happening soon.
The concept of having a SysPrep’d server makes sense: create a base image with your configuration, and then “bring that base to life” during the loading of the image (or the ISO burned from it). This ensures consistent builds, which is a good thing. One of the problems I often help DBAs with is ensuring that the Windows builds (both standalone and clustered) they get to put SQL Server on aren’t a bed of quicksand. So many problems that happen with SQL Server stem from a bad configuration of hardware or underlying software that seemingly has nothing to do with SQL Server. Different rant for another day; just stating facts here.
In this day and age of virtualization, it makes even more sense to have a base image you can then deploy a million times. Just create your image, save it as a template VM, and away you go!
Let me say this before I go further: the way SysPrep is implemented in the SQL Server 2008 R2 installer works, and it works as advertised. But I feel that in this version they did not go all the way. What do I mean by that?
The way the feature is implemented in SQL Server 2008 R2, you can only SysPrep the Database Engine and Reporting Services – not Analysis Services, and not the tools. My problem stems from the tools (i.e., SSMS) being left out. When I build a SysPrep’d image, I want it to have everything, and then I’ll bring SQL Server to life (i.e., complete the image) to wrap things up. To me, that would mean that when the process is done, I also have the tools to do the job. You can get there, but it’s more work – in my opinion – than it should be. Here, at a high level, is what you have to do to create a “complete” image that is SysPrep’d:
1. Create the base OS image and make sure it is left open; don’t seal it yet.
2. Install the SQL Server tools.
3. Run the image preparation for SQL Server 2008 R2.
4. Seal the Windows image.
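At the command line, step 3 and the later completion are driven by the SQL Server 2008 R2 setup actions for image preparation. A sketch – all parameter values here are illustrative, so check Books Online for the full syntax and required parameters:

```
:: Step 3 - prepare the SQL Server image before the Windows image is sealed
setup.exe /q /ACTION=PrepareImage /FEATURES=SQLEngine /INSTANCEID=MSSQLSERVER

:: Later, on the deployed server - complete the installation
setup.exe /q /ACTION=CompleteImage /INSTANCENAME=MSSQLSERVER /INSTANCEID=MSSQLSERVER /SQLSYSADMINACCOUNTS="MYDOMAIN\DBAGroup" /IACCEPTSQLSERVERLICENSETERMS
```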
At this point, you’ve got a usable image. When you load it onto a new server, you go through the process of bringing the server to life, which includes finishing the SQL Server installation. That process is pretty quick. The difference is that the process I outline above gives you a fully usable server. If you just SysPrep the server without the tools, you’d have to load them after the fact, which to me defeats the whole purpose.
I still think it’s a great feature in SQL Server 2008 R2, and something that many of my customers have been asking to have for years. Remember, SQL Server isn’t always built by DBAs, so the more the experience is integrated with Windows and stable, the better the resulting SQL Server implementations. It isn’t for everyone, but it works just fine if you account for what I describe above (which is also documented – see the link below) to ensure you have a full image for SQL Server.
Tip
Use the generalize option (either GUI or command line) for the Windows Sysprep process to ensure that each finalized server will get its own SID, new Event Logs, etc. Also use the Out Of Box Experience (oobe) option to prompt for a new system name and such.
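On the Windows side, that boils down to something like this on Windows Server 2008 R2, run from an elevated prompt as the last step before capturing the image:

```
C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown
```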
Some good links:
1. The Books Online entry for the feature in SQL Server 2008 R2 can be found here.
2. A nice walkthrough of the SQL Server 2008 R2 GUI for SysPrep can be found here.
3. The Windows Sysprep documentation can be found here.
Fun fact:
Note that Windows spells Sysprep with a lowercase p, but SQL uses an uppercase P. Gotta love consistency!
I’m back from Asia, where I was teaching in Singapore. I have some time off over the next few weeks, so I’ll be doing a bit of catch-up (including finishing the scripts for the book … no, really). Part of that catch-up will be a bit of blogging. I try to post things that are not just throwaway, and after speaking in Singapore, I think it’s time I got a bit more outspoken on this topic: the shrinking DBA skillset. This is going to be a small series on what I feel many DBAs today are lacking to do their jobs better. The series has its genesis in a set of six webinars I did in the spring of 2009 for Penton Media/Windows IT Pro on SQL Server DBA basics. It had been a long time since even I had revisited what we sometimes like to call “101” here in the States. (101 is what colleges sometimes call the intro course to a subject.)
Transact-SQL – It’s What’s For Dinner
For years, I’ve felt that the “modern” DBA has become reliant on graphical tools and hasn’t really learned how to do things any other way. Scripting is like an evil curse word to some. Besides teaching my consolidation/virtualization course, I also did a presentation at the World of Windows Server Conference on failover clustering (I know, big surprise). I asked if people wrote Transact-SQL or any kind of admin scripting. No one raised their hands. I was shocked.
I’m going to sound like a crotchety old man here (I’m not; I just turned 38 last month, but I’ve been around SQL Server for nearly half of my life going back to about 1992). When I started on Sybase SQL Server, there were no GUIs. Just T-SQL and command line. Even 4.21a of Microsoft SQL Server only had a pretty rudimentary Enterprise Manager. Scripting was a way of life.
Transact-SQL dates back to the Sybase days, and it is the most fundamental SQL Server programming language. Microsoft and Sybase both still use Transact-SQL, but the dialects diverged as of Microsoft SQL Server 7.0, so they are not exactly the same. It was – and is – the common thread for SQL Server professionals, at least until the era of the GUI was ushered in.
In 2009, I have a hard time finding people who can write even basic BACKUP and RESTORE statements. Let me clarify what I’m saying before people go off on me: I’m not talking about memorizing every parameter of every Transact-SQL statement. That’s what Books Online is for; it’s a great reference. Even I don’t memorize everything; there’s only so much I can keep in my head. I’m also not talking about writing T-SQL for applications (that’s a whole different ballgame, and developers – I’ve got a bone to pick with you, just not in this blog post); I’m talking about administrative purposes. T-SQL is not a retired feature, DBAs. It’s one of the most powerful tools available to you. All of that information you see visually in SQL Server Management Studio? You can generally get at it just as easily – or more easily – via scripting. When I’m doing a consolidation gig, I use things like the SERVERPROPERTY function in the scripts I have clients run. Microsoft has even made it easier to get at some things – you don’t have to remember system tables in many cases. Just know the dynamic management views (DMVs) first introduced in SQL Server 2005. They are a great way to gather information and monitor what’s going on.
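To make that concrete, here’s the flavor of administrative T-SQL I’m talking about (the database, path, and session filter are just examples):

```sql
-- A basic full backup - no GUI required
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks_Full.bak'
WITH CHECKSUM, INIT;

-- Basic instance information via SERVERPROPERTY
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('IsClustered')    AS IsClustered;

-- What is executing right now, via a DMV (SQL Server 2005 and later)
SELECT session_id, status, command, wait_type
FROM sys.dm_exec_requests
WHERE session_id > 50; -- filter out most system sessions
```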
Scripting Sometimes Means More Than T-SQL
Over the years I’ve also written many OS-level scripts for DBA use with technologies like WMI. Now, I’m not a coder in the sense that you’d want me writing optimized application code, but admin code is a whole different thing. For example, my backup retention scripts (also included as part of the download for the Pro SQL Server 2005 High Availability book) are all WMI-based. My current consolidation scripts for gathering OS-level information are all WMI as well. I’m currently porting a lot of that to PowerShell, which I will talk about in a second. A DBA’s job description does not begin and end with just SQL Server (a continuing theme throughout this series). SQL Server is part of a larger ecosystem that ESPECIALLY includes Windows. That’s a topic for another day. I don’t even ask DBAs if they can write other code if they can’t write T-SQL.
I can’t say it enough at this point: learn PowerShell. And if you didn’t understand me, this should clear it up for you: L E A R N P O W E R S H E L L. You’d be foolish not to. Microsoft has often (and sometimes rightly) been criticized for not standardizing toolsets between products. PowerShell is becoming that unifier, and you WILL be left behind as an administrator if you don’t learn it. On the Windows side, for example, the clustering team is sunsetting the .exe command-line tools in favor of PowerShell-only versions. I predict many Microsoft products will do this over time; Windows is a bit ahead of the curve here.
I will admit I do not love SQL Server’s implementation of PowerShell at the moment, as it does not offer a lot of commandlets (cmdlets) but is basically a PowerShell interface directly on top of SMO. Windows offers a lot of cmdlets and is easy to use, while SQL Server needs a bit more “finesse” at this point. I’m hoping that changes in future releases of SQL Server.
Having said that, the beauty of PowerShell even over older programming paradigms like WMI is that you can write integrated scripts that cross products to administer them. For example, I can write cluster-based scripts with a SQL Server focus that do both OS-level and SQL-level tasks in one fell swoop; I don’t need to write different code in different scripts. I’m already experimenting with that, and it’s very powerful.
PowerShell is currently at version 2.0, which ships with Windows Server 2008 R2 and Windows 7 (and can be downloaded for other platforms like Windows Server 2003). SQL Server 2008 ships with its own provider, and SQL Server Agent even supports PowerShell job steps.
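To give you a feel for the cross-product scripting I mentioned above, here’s a rough sketch that touches both the OS and SQL Server in one script. It assumes a Windows Server 2008 R2 cluster node (for the FailoverClusters module) and the SQL Server PowerShell bits (for Invoke-Sqlcmd); the group, virtual server, and instance names are made up:

```powershell
# OS level: where is the SQL Server clustered resource group right now?
Import-Module FailoverClusters
Get-ClusterGroup -Name "SQL Server (MSSQLSERVER)" |
    Format-Table Name, OwnerNode, State

# SQL level: ask the instance the same question in the same script
Invoke-Sqlcmd -ServerInstance "MYSQLVS" `
    -Query "SELECT SERVERPROPERTY('ComputerNamePhysicalNetBIOS') AS CurrentNode"
```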
Don’t believe me? Talk to other SQL folks like my friends Buck Woody or Allen White. Read their blogs. See their webcasts and presentations at TechEd and PASS. PowerShell is here to stay.
Why Script?
At the end of the day, why should you care about scripting? Here are but a handful of reasons:
1. Better control of what you want to do.
I’m sure that as a DBA you’ve had that moment where the GUI doesn’t do exactly what you want. Scripting – whatever language you use – is the way to handle that. And it can be incredibly efficient: what you can do in a few lines of code (sometimes more than a few … I won’t lie) can be powerful. That code can then be scheduled as a SQL Server Agent job or via an OS-level scheduler.
2. Understand your environment better
I’m a big believer in knowing how things work underneath the covers so that even if you use the GUI, you have a better idea of why things are the way they are. Scripting helps you get there.
Coding also means that there’s no black box, which is really what a SQL Server Maintenance Plan is; you don’t see the T-SQL or SMO being run under the covers. Coding means there’s intent. What do I mean by that? Too many people use Maintenance Plans with no thought behind them. Having jobs you’ve coded and are executing gives you that control and an understanding of why, instead of just selecting something because it’s a checkbox on an options screen.
3. Some features are only available via script or have script-only options
Not everything has a GUI equivalent or has all options exposed via some fancy interface. I know that may be news to some of you, but it’s true. You’d be better off learning a scriptable way to do things because you may have more options at your disposal – some of which may actually be useful.
Anyone ever use BCP these days? It’s one of the more powerful SQL Server tools; it still ships with the product and has been around since the Sybase days.
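If you’ve never touched it, a taste of bcp (the table, file, and server names are examples; -n is native format, -T a trusted connection):

```
bcp AdventureWorks.Sales.Currency out Currency.dat -n -T -S MYSERVER
```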
4. Reliability
As an administrator, there’s nothing you want more than repeatability and consistency in what you do – especially in terms of how things are executed and the results. Scripting is one of the best ways to make that happen. Scripts can be tested and results easily verified.
5. Disaster recovery/problems
What happens if your SQL Server is locked up and you do not have SSMS accessible? Do you know how to access and use the Dedicated Admin Connection via sqlcmd, and then run the proper commands to see what’s going on and resolve the issues? Never assume you’ll have a GUI available to you, and quite frankly, scripting can be faster than GUIs – especially if you know what you’re doing and/or what has to be done.
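As a sketch, connecting over the DAC and checking what’s running looks something like this (the server name is an example; note the -A switch):

```
sqlcmd -S MYSERVER -A -Q "SELECT session_id, status, blocking_session_id, wait_type FROM sys.dm_exec_requests WHERE session_id > 50"
```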
What if you need to do something like restore a server from bare metal and need to restore backups in a particular order? Are you really going to do that via the GUI if you’ve got 100 backup files (most of which are t-logs)? No, you’re going to script it. Hopefully you’ve already generated that script (like the one I wrote to generate it as part of my last book, downloadable on this website as well as at Apress). Doing it via the GUI may lead to problems (human error) or just be cumbersome.
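For the curious, here’s a rough sketch of generating that restore sequence from the backup history in msdb. It assumes the history is intact on this server, the database name is an example, and a real script also needs to handle the final log restore with RECOVERY:

```sql
SELECT 'RESTORE ' +
       CASE b.type WHEN 'L' THEN 'LOG' ELSE 'DATABASE' END +
       ' [' + b.database_name + ']' +
       ' FROM DISK = N''' + mf.physical_device_name + '''' +
       ' WITH NORECOVERY;' AS RestoreStatement
FROM msdb.dbo.backupset AS b
JOIN msdb.dbo.backupmediafamily AS mf
  ON b.media_set_id = mf.media_set_id
WHERE b.database_name = N'AdventureWorks'
ORDER BY b.backup_start_date;
```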
Scripts can save your proverbial bacon, and can serve as documents of your current environment. A good example is scripting your replication configuration.
Back to Reality
I don’t expect this blog post (or any in this upcoming series) to change anyone’s mind overnight, nor is it designed to turn anyone into a coding genius; it’s just me posting observations about what I see out there in the SQL Server community. I’ve done enough research over the past few years to see this as a very disturbing trend.
Yes, I know learning how to code takes time when you’re already overloaded with other tasks. Stop thinking of code as a development task. Coding has an administrative side that has nothing to do with application functionality. Exploit it for your own good and productivity.
I don’t want to seem like I’m not a GUI user here – I absolutely am. I mix both script and GUI in how I deal with my own environments and test harnesses in addition to how things get done at customer sites. A good administrator knows when to use a tool, or what will be best in a given situation.
What are your thoughts and comments? Agree? Disagree? What are you seeing out there?
It’s 3:11 AM here, and I just got my first VMware demo environment with vSphere 4 (i.e., ESX 4.0) configured to show vMotion. Having set up both Live Migration and now the aforementioned vMotion, I can tell you that I much prefer the setup under Hyper-V. Just getting vSphere installed (pretty simple using its GUI setup) and configured (not straightforward … maybe it’s my unfamiliarity with the server-based VMware products, but it is not intuitive like VMware Workstation is) took me the better part of the day as I figured out all of the little nuances. OK, sure, I could have read some docs first (which I wound up referencing anyway), but setting up clustering and then Live Migration is just easier.
I also didn’t like the fact that vCenter needs a 32-bit ODBC connection to work (it uses a SQL Server backend), and it does not install the latest SQL Express version (if you choose not to use an existing instance with an appropriate DSN). That means those of you on later versions of Windows (read: Windows Server 2008 R2) will have to patch immediately. I wound up configuring vCenter Server under a Windows Server 2003 R2 VM.
As for the actual features – vMotion and Live Migration – they do essentially the same thing (migrate a virtual machine from one hypervisor host to another while keeping it up and running). They both work well.
The one major advantage of vSphere + vMotion at the moment is that I can demo it live on my laptop; my Live Migration material was captured from a setup I did about a month ago when I had access to hardware. If you’re wondering why, it’s because the VT-x extensions that enable this kind of virtualization are not emulated inside VMs (and if you think about it, that makes sense; virtualizing a virtualized environment isn’t a normal use of the technology). Neither Microsoft nor VMware emulates VT-x. To run vSphere inside a VM, it seems VMware does something special under the covers. I hope Microsoft does something similar, because I’d love to demo Live Migration “in the flesh”.
Having said all of that, it’ll be interesting to see how I get along with vSphere now that I have it set up. Since many of my customers are invested in vSphere/ESX for their virtual environments, it’s clearly in my best interest to have a good working knowledge of setting it up. I know DBAs probably won’t be doing most of what I do, but this work helps in conversations with the other groups (especially the folks setting up the VMs). I’m not new to VMware – I’ve been using Workstation for nearly 10 years with a lot of success, and I used VMware Workstation 7 to set up this new vSphere 4/vMotion environment.
I have yet to play with any of the other virtualization products since, quite honestly, my customers really only talk about VMware or Microsoft. The others don’t come up in conversation. If you’re using one of the other ones, I’m not denigrating your choice – do not take it that way. It’s just that I’m not seeing it out there, much like the most common storage vendor I see at customers is EMC. There is other stuff, too – IBM, some HP, a Hitachi here and there – but EMC seems to be ubiquitous. All I care about is ensuring you have the right configuration no matter what hardware/software choices you make.
OK, off to get a few hours of shuteye and back to the grind in the AM to finish up some stuff before I head to Tokyo!
Well, it’s official – the website just went up. Can’t hide it anymore … I’m delivering a 3-day masterclass on consolidation and virtualization at World of Windows Server in Singapore, as well as delivering two sessions (although the current website shows two different sessions, it is going to be a two-parter on clustering). I’m very excited and honored to have been asked to do this.
If you don’t live in Singapore or can’t get there (hey, what’s a few thousand US dollars to take a little excursion to Singapore at the last minute – just ask your boss!), I do plan on delivering the masterclass again, possibly expanding it to four or five days. Or not. I’ve spent a lot of time over this past month on content development (official v1 of my Windows Server 2008/SQL Server failover clustering class and this upcoming delivery in Singapore), and I’m going to assess it after I’m done to see if any tweaks are needed. I can promise you that whether you come to Singapore or see it in a town near you, it’ll be a lot of fun, technical content, and information.
Contact me if you want to know more!