Tuning SQL Server Management Operations

Doctor Scripto

Summary: Microsoft PFE Thomas Stringer talks about using Windows PowerShell to tune SQL Server management operations.

Microsoft Scripting Guy, Ed Wilson, is here. I would like to welcome back guest blogger, Thomas Stringer.

I spend a lot of time talking with customers and working with them regarding how they can manage their large enterprise SQL Server environments seamlessly with Windows PowerShell and automation engineering in general. Fun puzzles and obstacles always come along, and they cause me to think outside of the box, and possibly veer off “the norm” when I am reaching out to a large number of machines.

Automation and coded management is much like SQL Server development. If you have a single table, with only 10 rows, and you write a query that runs once a year, it probably doesn’t make a difference if that query completes in 2 milliseconds or 2000 milliseconds. But if the table is 10 billion rows, and users or applications are running this query hundreds of times a second, you can surely bet that you would want to make sure you have tuned and optimized that query to the nth degree.

Managing enterprise environments with Windows PowerShell is just like that. If you have a script that is run once a year against 10 servers, it probably doesn’t make much of a difference if that script takes a few extra minutes. But with the current environments that we’re working with, we’re talking about thousands and thousands of servers, and even tens and hundreds of thousands of databases. You can probably imagine that a small cumbersome operation gets quite magnified rather quickly.

The performance tuning arena is no longer restricted to SQL Server developers or application developers. Because enterprise environments are growing quickly, operations developers are now forced to benchmark and optimize their own code. That’s a good thing. There’s nothing quite like an illustrative and code-filled example to show the type of methodology I take when tuning management operations with Windows PowerShell (in my case—being a SQL Server person—the end result is almost always SQL related).

Let’s say that I’m a SQL Server DBA managing an extremely large environment and I have now been tasked to pull a handful of metrics from all of the transaction logs, from all of the databases, from all of the instances in my environment. I think to myself, “Excellent. I’m a SQL-slash-PowerShell guy. I’ll have no problem with this at all.”

So as with many of my automated SQL tasks, I jump straight into using SMO (the Microsoft.SqlServer.Management.Smo namespace). For those of you who may not be familiar with SMO, it exposes SQL Server management functionality in a very easy-to-use, object-oriented manner. In the case of my aforementioned task, this is as simple as pulling values from the Database.LogFiles property (a LogFileCollection object). Following is some sample script that would accomplish this:

  # Assumes $SqlServer is an SMO server object that is already connected to the target instance,
  # for example: $SqlServer = New-Object Microsoft.SqlServer.Management.Smo.Server("<instance name>")
  foreach ($SqlDatabase in ($SqlServer.Databases | Where-Object {$_.Status -eq [Microsoft.SqlServer.Management.Smo.DatabaseStatus]::Normal})) {

    foreach ($SqlLogFile in $SqlDatabase.LogFiles) {

      $SqlLogFile |

        Select-Object @{Name = "DatabaseName"; Expression = {$SqlDatabase.Name}},

          @{Name = "SizeKB"; Expression = {$_.Size}},

          @{Name = "UsedSpaceKB"; Expression = {$_.UsedSpace}},

          @{Name = "FreeSpaceKB"; Expression = {$_.Size – $_.UsedSpace}},

          @{Name = "Used %"; Expression = {$_.UsedSpace / $_.Size * 100}}

    }

  }

Nothing to it. Sit back and watch the log-file usage metrics get retrieved and calculated. In my test environment (as with most things we do, these numbers will vary drastically from environment to environment), I get an average of about 160 milliseconds for a run.
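To time a single run yourself before scaling anything out, the Measure-Command cmdlet is a quick way to do it. The following is a minimal sketch, not the exact harness I used: the instance name is a placeholder, and the SMO assembly path assumes SQL Server 2012.

# Minimal sketch for timing one run (instance name and assembly path are placeholders).
Add-Type -Path "C:\Program Files\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll"
$SqlServer = New-Object Microsoft.SqlServer.Management.Smo.Server("SQLSERVER01")

$Duration = Measure-Command {
  foreach ($SqlDatabase in ($SqlServer.Databases | Where-Object {$_.Status -eq [Microsoft.SqlServer.Management.Smo.DatabaseStatus]::Normal})) {
    foreach ($SqlLogFile in $SqlDatabase.LogFiles) {
      # Touch the properties we care about so that SMO actually retrieves them
      $null = $SqlLogFile.Size
      $null = $SqlLogFile.UsedSpace
    }
  }
}

"Single run: $([int]$Duration.TotalMilliseconds) ms"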

You may be thinking, “Sure that’s pretty quick, no big deal.” And you’d be absolutely right—if you’re running this against a single instance, or ten instances, or even 100 instances. But what happens when we start running this against thousands of servers, and we are pressed for time on this diagnostic retrieval? Great question, because it could become a very real issue.

Now I want to simulate what this would look like when we run this methodology against 5,000 servers. To test and time this, I have bundled the logic (along with code that times the operation by using the System.Diagnostics.Stopwatch class) in a function:

function Test-SqlBenchmarkSmo {

  param (

    [Parameter(Mandatory = $true)]

    [string]$SqlServerName

  )

  Add-Type -Path "C:\Program Files\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll"

 

  $StopWatch = New-Object System.Diagnostics.Stopwatch

  $StopWatch.Start()

  # [SMO] — retrieve log file consumption

  #

  $SqlServer = New-Object Microsoft.SqlServer.Management.Smo.Server($SqlServerName)

  $SqlServer.ConnectionContext.ApplicationName = "SqlBenchmarkSMO"

  foreach ($SqlDatabase in ($SqlServer.Databases | Where-Object {$_.Status -eq [Microsoft.SqlServer.Management.Smo.DatabaseStatus]::Normal})) {

    foreach ($SqlLogFile in $SqlDatabase.LogFiles) {

      $SqlLogFile |

        Select-Object @{Name = "DatabaseName"; Expression = {$SqlDatabase.Name}},

          @{Name = "SizeKB"; Expression = {$_.Size}},

          @{Name = "UsedSpaceKB"; Expression = {$_.UsedSpace}},

          @{Name = "FreeSpaceKB"; Expression = {$_.Size – $_.UsedSpace}},

          @{Name = "Used %"; Expression = {$_.UsedSpace / $_.Size * 100}} |

        Out-Null

    }

  }

  $StopWatch.Stop()

 

  return $StopWatch.ElapsedMilliseconds

}

All I’ve really done here is the same SMO operation of getting our log-file metrics by looping through the databases on a given instance (passed as a parameter to the function), all the while timing it. The function returns the number of milliseconds that a single run took.

To loop through 5,000 servers in my environment, I need to load a string array ($SqlServerNameArray) with a list of instance names (basically two distinct servers, each duplicated 2,500 times).
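Here is a minimal sketch of how that list could be loaded. The instance names are placeholders for illustration; in a real environment the names would more likely come from a text file or a central inventory.

# Minimal sketch of building the test list (instance names are placeholders).
# Two distinct instances, each repeated 2,500 times, gives the 5,000 entries.
$SqlServerNameArray = @("SQLSERVER01", "SQLSERVER02") * 2500

# More realistically, the list might be read from a file, for example:
# $SqlServerNameArray = Get-Content -Path "C:\Temp\SqlInstances.txt"

Then I loop through this list of names, calling my Test-SqlBenchmarkSmo function and adding up the durations in the $CumulativeDuration variable: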

$CumulativeDuration = 0

foreach ($SqlServerIndividualName in $SqlServerNameArray) {

  $CumulativeDuration += Test-SqlBenchmarkSmo -SqlServerName $SqlServerIndividualName

}

Write-Host "SMO   Benchmark — exhaustive test, $($SqlServerNameArray.Count) servers — :: Total: $($CumulativeDuration / 1000) seconds  Average: $($CumulativeDuration / $SqlServerNameArray.Count) ms" -ForegroundColor Yellow

In my case I get the following output:

Image of command output

Wow! For 5,000 servers, it takes over 13 minutes to get what I need. Now, I’m going to preface the continuation of this blog post by saying: If this meets your business requirements and works for you, leave it at that and go forward with your acceptable solution. I am not one for needless optimizations, especially when there are dozens of other priorities that need to be completed today before you close up shop.

If my 13-minute operation is simply unacceptable, I need to start exploring other ways to get this information in less time. There are dozens of ways to pull this transaction log-file information from all of these instances, and SMO provides only one of them. Granted, it is a solution that is relatively simple, quick, and perfect for 95 percent (estimated, of course) of requirements. But if we’re in that remaining 5 percent, we may need to step away from SMO to cut down the runtime.

Now, being a SQL Server guy, I know that behind the scenes, SMO is just issuing normal T-SQL queries against my instance. Nothing special, but it is very well packaged, and it handles a lot of annoying cases (such as different versions) and other instance specifics that we wouldn’t normally think to code for, or even want to concern ourselves with. The power of SMO is that it abstracts away a lot of the nitty-gritty operations development. But that abstraction can come with additional duration, as we are seeing here.

If I want to take out the extra steps that SMO is performing, all I really need to do is compose the T-SQL that gets executed on the instances myself. One way I can do this is to directly leverage the .NET Framework Data Provider for SQL Server (System.Data.SqlClient). There are definitely downsides to taking this route instead of the SMO way:

  1. I need to know T-SQL, and SQL Server in general, relatively well.
  2. I forfeit all safeguards and checks that may have been happening previously.
  3. Anything that needs to be done, now has to explicitly be done by me and my own code.

I know all this, and I still want to move forward and see what kind of gains I can get from working directly with System.Data.SqlClient. So I mimic my previous benchmark against 5,000 servers, but this time I use SqlConnection, SqlCommand, and the other ADO.NET classes:

function Test-SqlBenchmarkAdoDotNet {

  param (

    [Parameter(Mandatory = $true)]

    [string]$SqlServerName

  )

 

  $StopWatch = New-Object System.Diagnostics.Stopwatch

  $StopWatch.Start()

  # [ADO.NET] — retrieve log file consumption

  #

  $ConnectionString = "data source = $SqlServerName; initial catalog = master; trusted_connection = true; application name = SqlBenchmarkADOdotNET"

  $SqlConnection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)

  $GetLogSpaceCmd = New-Object System.Data.SqlClient.SqlCommand

  $GetLogSpaceCmd.Connection = $SqlConnection

  $GetLogSpaceCmd.CommandText = "

    declare @sql varchar(max) = '';

 

    select

      @sql +=

        'use ' + quotename(name) + ';' +

        char(13) + char(10) +

        'select database_name = ''' + name + ''',

          size_kb = size * 8,

          used_space_kb = fileproperty(name, ''spaceused'') * 8,

          free_space_kb = (size - fileproperty(name, ''spaceused'')) * 8,

          used_percentage = fileproperty(name, ''spaceused'') * 1.0 / size * 100

        from sys.database_files

        where type_desc = ''log'';'

    from sys.databases

    where state_desc = 'online';

 

    exec (@sql);"

  $SqlDataAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($GetLogSpaceCmd)

  $ResultSet = New-Object System.Data.DataSet

 

  try {

    $SqlDataAdapter.Fill($ResultSet) | Out-Null

 

    foreach ($DataTable in $ResultSet.Tables) {

      $DataTable |

        Select-Object @{Name = "DatabaseName"; Expression = {$_.database_name}},

          @{Name = "SizeKB"; Expression = {$_.size_kb}},

          @{Name = "UsedSpaceKB"; Expression = {$_.used_space_kb}},

          @{Name = "FreeSpaceKB"; Expression = {$_.free_space_kb}},

          @{Name = "Used %"; Expression = {$_.used_percentage}} |

        Out-Null

    }

  }

  catch {

    Write-Error $_.Exception

  }

  finally {

    $SqlDataAdapter.Dispose()

    $GetLogSpaceCmd.Dispose()

    $SqlConnection.Dispose()

  }

  $StopWatch.Stop()

 

  return $StopWatch.ElapsedMilliseconds

}

Note  The potential pitfalls become extremely evident when I look at my new function. That T-SQL query may seem trivial to experienced SQL Server DBAs, but the person mocking up this Windows PowerShell script may not necessarily have the expertise and knowledge of these finer details. Not to mention, what happens if we are trying to hit a system catalog view or dynamic management view that exists in SQL Server 2012, where we did all of our testing, but doesn’t exist in the same form (or at all) in SQL Server 2005? Now we have a breaking dependency that we didn’t think of.
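One way to reduce that risk is to check the server version up front and branch to a version-appropriate query. Here is a minimal sketch of that idea; the server name is a placeholder and the version cutoff is only illustrative.

# Minimal sketch of a version guard (server name and cutoff are illustrative).
$ConnectionString = "data source = SQLSERVER01; initial catalog = master; trusted_connection = true"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$SqlConnection.Open()

# ServerVersion looks like "11.00.2100"; the major version is enough for a coarse check.
$MajorVersion = [int]$SqlConnection.ServerVersion.Split('.')[0]
$SqlConnection.Close()

if ($MajorVersion -ge 11) {
  # SQL Server 2012 or later: safe to send the query we tested against
}
else {
  # Older instance: fall back to a version-appropriate query (or to SMO)
}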

Much like our initial SMO testing, let’s hammer this function 5,000 times and see what kind of duration we get:

$CumulativeDuration = 0

foreach ($SqlServerIndividualName in $SqlServerNameArray) {

  $CumulativeDuration += Test-SqlBenchmarkAdoDotNet -SqlServerName $SqlServerIndividualName

}

Write-Host "ADO.NET Benchmark — exhaustive test, $($SqlServerNameArray.Count) servers — :: Total: $($CumulativeDuration / 1000) seconds  Average: $($CumulativeDuration / $SqlServerNameArray.Count) ms" -ForegroundColor Yellow

Following is the output I get in my environment:

Image of command output

Excellent! I’m at an average time of less than 10 milliseconds for each server, and a total of about 49 seconds. So this change has brought the runtime duration from more than 13 minutes down to under a minute.

One of the reasons for this is that my ADO.NET example is lightweight and assumes nothing (which, as I said before, could actually be a very bad thing and make the development and testing process much longer). We can see this directly by looking at the actual T-SQL statements that are executed on the instance.

If you are a SQL Server professional, you will be very familiar with what I’m about to do. But if you are not, I’m basically setting up a tracing mechanism to watch the SQL statements that hit the instance.

You may have noticed that in my previous examples, I set the Application Name parameter of my connection string to “SqlBenchmarkSMO” and “SqlBenchmarkADOdotNET” respectively. The reason behind that is twofold: First, I don’t want to see the chatter of other SQL statements outside of my testing. Second, I want to be able to distinguish the commands that come from my SMO test from the commands that come from my ADO.NET test, so I can get a count of the volume of statements from each. Here is the definition (Transact-SQL) of my tracing mechanism:

if exists (select 1 from sys.server_event_sessions where name = 'SMOvsADOdotNET')

begin

            drop event session SMOvsADOdotNET

            on server;

end

create event session SMOvsADOdotNET

on server

add event sqlserver.sql_statement_completed

(

            action

            (

                        sqlserver.client_app_name

            )

            where

            (

                        sqlserver.client_app_name = 'SqlBenchmarkSMO'

                        or sqlserver.client_app_name = 'SqlBenchmarkADOdotNET'

            )

)

add target package0.event_file

(

            set filename = N'<path to my XEL file>'

);

go

 

alter event session SMOvsADOdotNET

on server

state = start;

go

Note  The code that is included in this blog post, both Windows PowerShell and T-SQL, is intended only for illustration. It is not meant to be run in production without understanding it and reworking it for your environment.

When I run my test again while monitoring (I’m only going to run it against one server, because the statement counts are relative), I see the following output:

Image of command output

For each run in my SMO testing (this will vary based on the number of databases in the instance, but you get the idea), there were roughly 122 SQL statements. Conversely, my ADO.NET testing issued only 3 SQL statements. The extra statements in the SMO example aren’t necessarily bad. In fact, they’re making wise checks. But for this particular scenario, I’m looking to shave time off my management operation, even at the expense of those checks. Mission successful.
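If you want to reproduce that count yourself, the .xel file that the session writes can be queried directly. Here is a minimal sketch that groups the captured statements by application name; it assumes the Invoke-Sqlcmd cmdlet (from the SQL Server PowerShell module) is available, and the instance name and file path are placeholders.

# Minimal sketch for counting captured statements by application name.
# Assumes Invoke-Sqlcmd is available; instance name and file path are placeholders.
$CountQuery = @'
select xe.client_app_name, statement_count = count(*)
from
(
    select client_app_name = xed.event_data.value('(event/action[@name="client_app_name"]/value)[1]', 'varchar(128)')
    from
    (
        -- the event_file target adds an identifier to the file name,
        -- so a wildcard such as 'SMOvsADOdotNET*.xel' is typically used here
        select cast(event_data as xml) as event_data
        from sys.fn_xe_file_target_read_file('<path to my XEL file>', null, null, null)
    ) as xed
) as xe
group by xe.client_app_name;
'@

Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "master" -Query $CountQuery |
  Format-Table -AutoSize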

In summary, if you are racing against the clock with your management operation, it is wise to take a few steps back and analyze where performance can be gained. It could be in a single line of code, a function, the namespaces you consume, or even in your entire approach.

I showed one example of how this can be done with SQL Server, along with a single alternative. Don’t needlessly sacrifice the benefits that great solutions like SMO provide, such as an extra layer of checks and ease of use. But sometimes we need to step in further and control more aspects of the operation when there is a warranted demand.

You can download the full script from the TechNet Gallery: Sample Benchmark Test on SMO vs. System.Data.SqlClient.

I hope you have enjoyed this blog post, and please feel free to reach out to me at sqlsalt@outlook.com if you have any questions. Likewise, feel free to post comments in the Comments box that follows. Thanks!

~Thomas
  Twitter: @SQLife

Thank you, Thomas, for sharing your time and knowledge.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 
