Free Data Analysis and Reporting course from Google

Google is offering a free course titled “Making Sense of Data”.

This is very timely for me. In my role now as the Supervisor of IT Quality, I’ve been asked to organize and lead a monthly meeting with the Director of Infrastructure and Applications and his direct reports – the managers of the various IT silos reporting to him. I’ve already started with one base source of information: our Help Desk system, which tracks both Incidents and Changes. I have a number of Pivot Charts in an Excel spreadsheet that take only a few minutes to refresh from our Help Desk data source.

While this new course is probably geared specifically towards Google Docs and Fusion Tables, I’m hoping the underlying concepts of how to answer questions with data and how to recognize patterns in data will help me prepare a great monthly meeting that will be a value-add to the organization. This is something that has not been done before by the IT organization – there have been some monthly data charts prepared in the past, but nothing like this monthly overview, where I’ll walk the department heads through some of the data and drill down into information like “the number of incidents reported by region” or “incidents by incident area”. Over time this data will build up some great trends on performance: a repository not of data, but of knowledge. Maybe it’s a natural progression for me; as a DBA I have a long history of working with and supporting “data” systems, and this will be a path of turning “data” into something useful for the business.

Disable UAC without reboot on Windows 7, Server 2008, Server 2008 R2

I thought this was common knowledge, but after a discussion on Twitter and a subsequent internet search, I think it’s not well known. In Windows 7, Server 2008, and Server 2008 R2, you can disable UAC without requiring a reboot. You simply have to log in with the local Administrator account and disable it via the User Account Control Settings dialog. Any change made this way will not require a reboot. Please note, you must use the built-in Administrator account… using an account that’s merely in the Administrators group will not work.

It appears that in Windows 8, a reboot is never required, at least I couldn’t get a forced reboot in my testing. If I have time, I’ll update this with testing on a Server 2012 system in a few days.

Note, I am not endorsing that one routinely use the built-in administrator account on a system, and I’m not endorsing completely disabling UAC. Both are very real security risks… but there are times when it’s far more expedient to do this in a controlled fashion than the alternatives. I am a firm believer in the policy of Least Permissions Needed.
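For reference (this wasn’t in the original discussion), the slider in that dialog is backed by a handful of registry values, which you can at least inspect without changing anything. A read-only sketch, assuming the standard policy key location:

```powershell
# Inspect the current UAC-related registry values (read-only; changes nothing)
$uacKey = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
Get-ItemProperty -Path $uacKey |
    Select-Object EnableLUA, ConsentPromptBehaviorAdmin, PromptOnSecureDesktop
```

Checking the values first is a good habit before and after touching the dialog, so you know exactly what state a server is in.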

PowerShell script to capture OS level properties (CPU, Cores, NIC, Memory, BIOS, OS)

I have a master post which links to the full set of PowerShell scripts related to this one.

This script is an updated and streamlined version of the original, ugly script.

This version dumps out to a CSV via the “Export-CSV” cmdlet. It has some error handling: it checks the first WMI call and skips the rest if that one can’t connect. It also writes friendly output to the screen so the user knows which server is currently being queried.

$serverList = "..\SQL Servers.txt"
#$serverList = "test.txt"
[System.Collections.ArrayList]$sysCollection = New-Object System.Collections.ArrayList($null)

foreach ($server in (Get-Content $serverList))
{
    "Collecting information from $server"
    $si = New-Object System.Collections.Specialized.OrderedDictionary    # Blank dictionary for each iteration

    try
    {
        [wmi]$sysInfo = Get-WmiObject Win32_ComputerSystem -Namespace "root\CIMV2" -ComputerName $server -ErrorAction Stop
        [wmi]$bios = Get-WmiObject Win32_BIOS -Namespace "root\CIMV2" -ComputerName $server
        [wmi]$os = Get-WmiObject Win32_OperatingSystem -Namespace "root\CIMV2" -ComputerName $server
        [array]$procs = Get-WmiObject Win32_Processor -Namespace "root\CIMV2" -ComputerName $server
        [array]$mem = Get-WmiObject Win32_PhysicalMemory -Namespace "root\CIMV2" -ComputerName $server
        [array]$nic = Get-WmiObject Win32_NetworkAdapterConfiguration -Namespace "root\CIMV2" -ComputerName $server | where {$_.IPEnabled -eq "True"}

        $si.Server = [string]$sysInfo.Name    # Server name column so each CSV row is identifiable
        $si.OSName = [string]$os.Name.Substring(0, $os.Name.IndexOf("|") - 1)

        $iter = 0
        while ($iter -lt 12)   #Parse through each stick of memory (arbitrary 12 memory stick columns)
        {
            if ($mem[$iter] -ne $null) {
                $si."MemBank$iter" = [string]$mem[$iter].Capacity + "," + [string]$mem[$iter].BankLabel
            } else {
                $si."MemBank$iter" = ""
            }
            $iter++
        }

        $iter = 0
        while ($iter -lt 4)    #Parse through each NIC (arbitrary 4 network card columns)
        {
            if ($nic[$iter] -ne $null) {
                $si."NIC$iter" = [string]$nic[$iter].Description + "," + [string]$nic[$iter].IPAddress
            } else {
                $si."NIC$iter" = ""
            }
            $iter++
        }
    }
    catch [Exception]
    {
        "Could not contact $server, skipping to next"
        continue
    }
    [void]$sysCollection.Add((New-Object PSObject -Property $si))    #Add the dictionary to the collection
}

$sysCollection | Export-CSV -path ServerInventory.csv -NoTypeInformation    #Output the collection to a CSV file
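A nice side effect of going through Export-CSV is that the inventory can be re-queried later without touching WMI at all. A quick sketch, assuming the ServerInventory.csv produced above:

```powershell
# Re-load the saved inventory and run an ad-hoc filter against it
Import-Csv -Path ServerInventory.csv |
    Where-Object { $_.OSName -like "*Server 2008*" } |
    Select-Object Server, OSName
```

The same pattern works for any of the columns, so one collection run can feed many different reports.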

Another developer gift

Just found this one, looks like it was dropped in right around the time I started at the company.

ALTER procedure [dbo].[usp_Get_RowCount]
	@pStructureRowCount nvarchar(500),
	@pRowCount int output
as
begin
	set nocount on
	declare @SQLStatement varchar(500)

	set @SQLStatement = 'Select count(*) From ' + @pStructureRowCount

	create table #tempRowCount (    ----Temporary table for storing the row count
		StrucRowCount bigint
	)

	insert into #tempRowCount exec(@SQLStatement)

	select @pRowCount = StrucRowCount from #tempRowCount
end
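For what it’s worth, if all a caller needs is a row count, there’s no need for dynamic SQL at all – which, besides being ugly, leaves that procedure wide open to SQL injection through @pStructureRowCount. A sketch of an alternative using the catalog views, which return per-partition row counts without scanning the table:

```sql
-- Approximate row counts for every table, no dynamic SQL required.
-- sys.partitions tracks rows per partition; index_id 0 = heap, 1 = clustered index.
select s.name as SchemaName, t.name as TableName, sum(p.rows) as ApproxRows
from sys.tables t
join sys.schemas s on s.schema_id = t.schema_id
join sys.partitions p on p.object_id = t.object_id
where p.index_id in (0, 1)
group by s.name, t.name;
```

The counts are maintained by the engine and can lag slightly after large in-flight transactions, but for a “how big is this table” question they’re instant and safe.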

Developers and Reporting Services

Hello again. Recently I accepted a new job offer at the same company; I’m now the “Supervisor of Quality”, which means I’m going to be working on Service Management, Change Management, Problem Management, etc., etc., etc.

So I kind of figured that my SQL posting days were done. However, my company likes to put the cart way in front of the horse and I’m still the lone DBA for 50 production servers, 70 production instances, and thousands of databases, scattered around the globe from South Africa to the UK, from Australia to Amsterdam. Oh, and everywhere in-between.

Today, I got a message from a “developer” that his new report is timing out.

I ran the report manually with his parameters and did a little “sp_WhoIsActive” while the report was running. Here’s one of the queries I captured…

Surface Pro – Sleep oddities

No, this isn’t a post about me sleeping with my Surface Pro! My wife probably wouldn’t like that too much.

No, this is a post about problems I seem to be having with this thing going to sleep. As a desktop replacement, it’s been pretty good so far. I’m still looking for a good way to split the Mini DisplayPort off to two DVI monitors… I’m a dual 24″ monitor type of worker when I’m in the office, and using one monitor plus the Surface screen is not the best arrangement for me. But that aside, I’ve been very happy so far.

I have noticed, though, that at times when I go away from my desk after hitting Windows Key + L to lock my screen, if I’m gone for more than just a minute or two, the SP seems to go into a deep sleep, almost like hibernate mode – even when plugged into wall power. I’m a DBA, and I use SSMS all day long to interact with my SQL Servers… I try not to RDP to them to do work unless I have OS-level work to do. So, I might need to kick off a backup on a large DB that might take 30 minutes or longer to finish. I launch SSMS and run my T-SQL command, all good. However, if during that time I lock my SP, go to the restroom, and come back, I log back in and my wireless connection has dropped… ergo, my backup has failed and I have to kick it off again.

It seems to me that perhaps there’s a proximity sensor on this thing, kind of like the Xbox Kinect camera, that knows not only that I’ve locked my SP but also that I’ve walked away, and it decides to override my power configuration and take a nap. That’s not a good thing… if I wanted to have to use a VDI-type interface for all my work, I’d work off my Android phone with a Bluetooth keyboard, mouse, and Micro HDMI to my monitor. If I can’t keep this thing running on house power while I lock it and go to a 30-minute meeting, it’s not useful as a workstation replacement. Off house power, I get it… but on it, let me decide – I am still a user, right, Redmond?

Windows Updates affected SQL Server?

I had read about the issues with the SQL Agent “Is Alive” checks spamming event logs on clustered SQL Server 2012 and SQL Server 2008 R2 SP instances… and since I have three sets of 2-node clusters running multiple instances, including 2012, 2008 R2, and 2008, after reading about that a few months back, I checked my servers. None of them were spamming the logs. They all have (change that to HAD, now that I’ve upgraded) SQL 2012 RTM, SQL 2008 R2 SP2, and SQL 2008 SP3. Not a one of them ever had the “dreaded” LooksAlive or CheckServiceAlive entries with Event ID 53 from [sqagtres]. Here’s one Connect article that shows some more details and which versions are fixed with which CUs.

Anyway, so, none of the clusters in my environment have any history of Event ID 53 for SQL Agent. Last night, the System Administrator applied some Windows Updates (I don’t have a “test” SQL Cluster, but all these updates were applied on our Test SQL Servers prior to last night). After the Windows updates were applied, my event logs on ONE server started being spammed with the Agent Alive checks. WTF? I thought this was a SQL Agent bug?

Here’s the list of Updates applied last night. KB2799494, KB2789645, KB2251487, KB2790655, KB2538243, KB890830, KB2790113, KB2789642, KB2797052, KB2687441, KB2596672, KB2778344, KB2645410, KB2792100, KB2538242.


On Monday (since I’ve worked enough this damned weekend already), I’m going to go through the other two clusters and compare Updates that were installed, and base levels of SQL Server. Off hand, I think that both of the other Clusters were running SQL 2012 SP1 already.

Oh, another odd data point for me to consider… I was getting these LooksAlive entries in the event logs for the SQL 2008 instance!

Finally, a parting thought. Installing CU3 for SQL Server 2008 R2 at 2 AM (or was it 3 AM, damned DST!) on a Sunday morning, I really liked that I had to implement a workaround to get the POS CU3 installed. Yeah, the old “rename the C:\Windows\System32\perf-*-sqlagtctr.dll file” trick to allow the installer to upgrade your SQL Server. Thanks for that, MS.

Surface Pro initial post

I got a Surface Pro through work; some there think this might be a good laptop refresh replacement asset. All in all we got almost a dozen and spread them out pretty well among IT. We have some Client Services folks, our “Executive Liaison Group”, our AD Administrator who does way too much with GPO management, myself the DBA, and a couple of managers (I know!) each with one. I am going to focus on using this as a complete laptop replacement, so let’s break for a paragraph about that!

I’ve had my trusty old Latitude e6410 since starting at this company, so almost 3 years now. I had the last RC of Windows 8 on it in dual-boot mode, and when 8 RTM’d, I “side-graded” to that release. I say that because I have a blog post on how I did it, and it really generated a lot of traffic for me. I happened to answer another DBA’s twitter post about the possibility of “upgrading” from RC to RTM; he also blogged about it, referenced me and my post, and that got picked up in a Computerworld article. That generated around 10k hits last year, and close to that already this year. Boring, I know, but I wanted to point out that I’m already somewhat familiar with the Win 8 “Metro” interface. Even though the Dell isn’t a touch screen, and I immediately found and installed Classic Desktop, I still played around with Metro over the past few months.

Surface Pro

So, on to the goods. My plan is to write a weekly entry about how the week went. Today I bought a plain and cheap Mini DisplayPort to DVI adapter from Office Max. It is an iLuv Mini DisplayPort to DVI Adapter, with an actual note on the bottom of the box: “* MacBook Pro and DVI Cable not included”!


It was $24.99, but available today, so I grabbed it. It just worked. I was a little concerned about it, hanging there with all the other elite and ultra-cool Apple gear, but it just works. I have an old 19″ Dell LCD that does 1280 x 1024, and it works and looks pretty darned good. Next week, I should be getting a mDP to dual DVI adapter from Monoprice. It’s an ugly little thing, but at $40, I don’t care. I’ll tuck it up between my two 24″ Dell displays at work, and no one will see it.

That’s when the testing will start in earnest. I consider myself a power user; as the sole DBA at this large company, I’ve usually got dozens of windows open and activities happening in the background. Whether it’s PowerShell scripts to gather SQL Server documentation, fixing Merge Replication crashes, doing new installs, or running XPERF to trace down driver/hardware-induced DPCs… I’m doing a lot. The specs of this tiny little Surface Pro are probably better than my laptop’s. The laptop has a Core i5 (at least one generation older), 4 GB of memory, a 250 GB spinning magnetic drive, and a crappy little display. The Surface is so much better, with a 128 GB SSD, a latest-gen i5, a super-duper-spectacular display, and also 4 GB of RAM.

I’ll work on some better pictures for next week… and clean the screen!!! eewwww!


T-SQL Tuesday #39 Changing Service Accounts or Passwords with PoSH



I’ve done one other of these in the past. This is a good one, and I too love using PowerShell to automate management of multiple SQL Servers. Mine is an example from a situation I was in. We had 30 or so SQL Servers that were configured with domain accounts to run each of the major services – the Engine, SSAS, SSIS, SSRS, and the Agent. However, these accounts were shared across the computers, so an account XYZ\SQLServer-SVC was used to run the SQL Server Engine on all the servers. Due to circumstances outside my control, we had to change the passwords on all accounts.

I was dreading the fact that I would have to take a major outage and race to change the service account password in AD, and then do each server individually. Then I thought, POWERSHELL, you can save me! Using SMO.WMI.ManagedComputer was the trick for me. Here’s the code.

#Load the SqlWmiManagement assembly off of the DLL
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | Out-Null
$serviceAccountList = Get-Content "SQL Servers.txt"

foreach ($server in $serviceAccountList)
{
    $SMOWmiserver = New-Object ('Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer') $server #Suck in the server you want
    $changeService = $SMOWmiserver.Services | where {$_.ServiceAccount -eq "XYZ\SQLServer-SVC"} #| select name, type, ServiceAccount, DisplayName, Properties, StartMode, StartupParameters | Format-Table
    if ($changeService -ne $null) {
        foreach ($service in $changeService)
        {
            "Service " + $service.Name + " found on " + $server
            $service.SetServiceAccount("XYZ\SQLServer-SVC", "NewPasswordHere")    # Change the service account name or password as needed
            "Service " + $service.Name + " changed on " + $server
        }
        $service = $null
    } else {
        "Specified account was not found on " + $server
    }
    $changeService = $null
    $SMOWmiserver = $null
}

Right there, $service.setServiceAccount, that’s the magic. The nice thing about this approach is that it will take care of the dependencies. I started with another approach which would change the service account and then restart the service, but had issues with dependent services, like SQL Agent stopping me from stopping SQL Server outright. With the setServiceAccount part of ManagedComputer, it handled it for me.

Oh, and all new servers get unique domain service account credentials; we no longer share a service account across computers. With SQL 2012, I plan on testing out Managed Service Accounts.

Extended Properties as a Run Book?

Perhaps Run Book isn’t the right terminology, but it’s close. In my environment, we are a manufacturer of large equipment. We do have a centralized IT organization, but it seems to be more focused on keeping the lights on and trying to catch up with what the business purchases off the shelf, or even what some groups have developed on their own.

One thing I’ve struggled with in the past two years, since this company has no defined processes (no ITIL or anything similar) and little to no documentation, has been capturing what databases are where, who the “Business Owner” is, what the application is, and so on. I’ve also started implementing DNS aliases for connection strings so that we can be more portable when moving applications, especially COTS systems where we might not have the understanding to change a set of connection strings. As you may know, I have a number of PowerShell scripts that capture the specifics of the servers, instances, database properties, file locations, etc. One thing these cannot grab is the applications and the business representatives. Sure, with my two years of knowledge, I can look at a database server, look at an instance name, look at the name of the database, and make a good guess at what it does, where it runs, and who I should contact if there are problems or changes are planned… but when onboarding someone new, it would take an incredible amount of time for that person to build up the same intimacy with the environment.

Previously, the approach was to take the Excel document output from the Database Properties script and add a column with “Owner” which sometimes had a name I could place with it.

Today I received a request for a new blank database via email from a quasi-developer in another location. This person is not in IT but has developed a number of supporting-type applications for the local users at his manufacturing site. I thought to myself: I have all the details in this email from the requestor, maybe we should just embed the email in the Excel document. But I’m lazy, and that is a very loose requirement; I could be busy, I could be in a procrastination mood and put off the follow-up documentation for… well… ever. Next, I considered that we have a web-based, in-house-developed tool for requesting services from IT. I could make it a requirement that a ticket be created for any new database needed, and then just copy the ticket number into the spreadsheet. Blah, same thing; when I’m off creating a DB in SSMS, I don’t really want to have to open a spreadsheet on our SharePoint site to update it… still too manual and loose.

So I thought, why not make it part of the “create database” process to copy the details from the request ticket into Extended Properties inside the database? What I came up with was four key/value pairs to start with: a Description key, which contains some details of what the database is used for; an AppOwner key, with the requestor’s name or whoever is identified as the owner; a Ticket key, with the number from the request; and finally a ConnectionString key, which contains the DNS alias created for the application to use. I kind of like it. It’s tidy and neat, and it stays with the database if I restore to a new server. The only maintenance might be to change the AppOwner if they change roles or leave the company.
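To make that concrete, here’s a sketch of what the create-database step might record; the database name and the values are purely illustrative:

```sql
use [NewAppDB];    -- illustrative database name

-- Database-level extended properties holding the request details
exec sp_addextendedproperty @name = N'Description',      @value = N'Shop-floor scheduling support application';
exec sp_addextendedproperty @name = N'AppOwner',         @value = N'Requestor Name';
exec sp_addextendedproperty @name = N'Ticket',           @value = N'REQ-00000';
exec sp_addextendedproperty @name = N'ConnectionString', @value = N'appalias.corp.example.com';

-- Read them back: class = 0 is the database level
select name, value from sys.extended_properties where class = 0;
```

Since sp_addextendedproperty is called with no level specifiers, the properties attach at the database level, so they travel with any backup and restore.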

I’m going to modify my above-referenced PowerShell script to grab these keys and values from every database when I run it. I’ve also been noodling a type of “self-service portal” where power users like the requestor above could hit a web form that kicks off an approval workflow; I could then choose the server to deploy to, and the database creation and all the above documentation would happen on the fly. That would be really neat; the only flaw would be organizational changes.
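The script modification should be straightforward with SMO, since every Database object exposes an ExtendedProperties collection. A rough sketch, with an illustrative instance name:

```powershell
# List the documentation key/value pairs for every database on an instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$instance = New-Object Microsoft.SqlServer.Management.Smo.Server "MYSERVER\INST01"
foreach ($db in $instance.Databases) {
    foreach ($prop in $db.ExtendedProperties) {
        "{0} : {1} = {2}" -f $db.Name, $prop.Name, $prop.Value
    }
}
```

Folding this into the existing documentation script just means adding the property names as columns alongside the database properties already collected.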

In short, I’m going to make it a requirement that any new database request be submitted through our service request system, and require a description of the application just for paperwork reasons… it’s much easier to run a web-based report than to search a DBA’s PST file for auditing purposes. I’m going to have the DBA team start using the database Extended Properties to hold that information, and I’m going to “extend” my PowerShell script to grab those properties for our documentation purposes. I may or may not work on a provisioning portal… I might be getting too far ahead of the rest of the IT organization here.