Virtually Unstoppable


A friend insists that we’ll only
know the recession is over when software vendors no longer start every
whitepaper with the phrase “In these tough economic times …” It may be as
reliable an indicator as any.

Meanwhile, in these tough economic
times, I often read of factories suffering so badly that they are “operating at
only 50% of capacity.” For a manufacturing plant, such low utilization is a
disaster. So, gentle reader, what do you think would be the average utilization
of your data center’s capacity? Nothing like 50%, that’s for sure. Typical
enterprise servers run at about 10% utilization according to a recent McKinsey
report. They may, just may, be able to reach as high as 35% with a concerted
effort.

There are many good excuses for
this situation, with both business and technical justifications. Enterprise
applications on the same server do not always play together nicely. One will
demand all the memory it can get, sulking unresponsively in a corner if it can’t
get it; another will push over less aggressive applications in order to grab
more CPU. In the SQL Server world, we’re working on that continuously, with
every version adding better resource governance and management. (See http://bit.ly/ss2008rg for specific information about SQL Server 2008.) Then again, these same applications are often mission-critical, and it is business requirements that force us to isolate them from the risk of downtime and other disruptions. Approaching our problems
in this way, it’s quite easy to add a new server for this app, and another
server for that one, and sure enough, the result is soon 10% utilization.
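To make the resource governance point concrete, here is a minimal sketch of SQL Server 2008's Resource Governor. The pool and workload group names, the percentage values, and the ReportingUser/OltpUser logins in the classifier function are purely illustrative assumptions, not a recommendation; the link above has the full details.

    -- Hypothetical pools: cap a greedy reporting workload, reserve headroom for OLTP.
    USE master;
    GO
    CREATE RESOURCE POOL ReportingPool
        WITH (MAX_CPU_PERCENT = 30, MAX_MEMORY_PERCENT = 30);
    CREATE RESOURCE POOL OltpPool
        WITH (MIN_CPU_PERCENT = 50, MIN_MEMORY_PERCENT = 40);
    GO
    CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;
    CREATE WORKLOAD GROUP OltpGroup USING OltpPool;
    GO
    -- The classifier function routes each new session to a group by login name.
    CREATE FUNCTION dbo.fnClassifier() RETURNS sysname
    WITH SCHEMABINDING
    AS
    BEGIN
        RETURN CASE SUSER_SNAME()
                   WHEN N'ReportingUser' THEN N'ReportingGroup'
                   WHEN N'OltpUser' THEN N'OltpGroup'
                   ELSE N'default'
               END;
    END;
    GO
    ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnClassifier);
    ALTER RESOURCE GOVERNOR RECONFIGURE;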

It won’t do. There’s a capital
cost, and fixed running costs, for every server we add, not to mention the
environmental considerations of wasted energy and resources that weigh heavily
on many of us, recession or not. I have visited datacenters in emerging
economies from Egypt to China where simply having enough power available is a
problem and resource management is imperative.

In the database world, we have
traditionally approached these problems by running multiple native instances of SQL Server on the same box. This can indeed consolidate hardware and reduce costs.
Nevertheless, IT managers and DBAs are increasingly looking to virtualization.
Why? There are numerous advantages. For example, with virtualization each application can have a dedicated, rather than shared, Windows instance, which is especially useful for mixed workloads; and virtualized instances are limited only by the capacity of the machine rather than by SQL Server's native 50-instance limit.
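Whichever route you take, a little configuration discipline goes a long way. If you do stack native instances on one box, a common practice is to cap each instance's memory so that neighbours don't starve each other; the sketch below uses a purely illustrative figure.

    -- Cap this instance's memory; repeat per instance with values that fit your box.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 8192;  -- illustrative value only
    RECONFIGURE;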

 SQL Server 2008 works
exceptionally well with Windows Server 2008 R2 and Hyper-V to deliver effective
virtualization. In SQL Server 2008 R2 (shipping in the first half of 2010) we
will support up to 256 logical processors on that
platform to scale those solutions even further. There are some great scenarios for this. Business
Intelligence applications such as Analysis Services and Reporting
Services are prime candidates, especially when mixed BI and operational
workloads peak at different times. Virtualization has other benefits for the database user:
for example, the lifecycle from development to test to production becomes
easier to manage with a consistent, virtualized environment.
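Once an instance is running inside a virtual machine, it is also worth a quick sanity check of the resources SQL Server actually sees. A simple query against the sys.dm_os_sys_info dynamic management view works for this (the exact column set varies slightly between versions):

    -- What does this (possibly virtualized) instance think it has to work with?
    SELECT cpu_count,                                          -- logical processors visible
           hyperthread_ratio,
           physical_memory_in_bytes / 1048576 AS physical_memory_mb,
           sqlserver_start_time
    FROM   sys.dm_os_sys_info;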

It’s really worth considering
virtualization, and building up your understanding of the technology and
requirements. There’s a great whitepaper at http://bit.ly/sqlcatvirtual with sound advice and background
for any SQL Server 2008 DBA considering this technology. Good material to have
to hand, in these tough economic times.

  • Donald Farmer
  • Twitter: @donalddotfarmer
Comments (3)

  1. robertcslim says:

    Nowadays we receive so much hype about virtualisation that we forget about all the good practices and discipline of resource monitoring and capacity planning. This art and skill seem to have been forgotten by the IT community and its professionals.

    The McKinsey report underlines the above.

  2. Gilbert Ayonayon says:

    Is there any free software to convert Windows 2000 to a full version of XP or greater?

    I would also like to get a membership to Microsoft Windows XP Professional, to be able to log on and off with a user name and password, with the business-security Windows logon before you can even see the desktop.

    On one of my desktops I had tried to complete an application to manage a network. I created a user name and pass code, shut down the system, and then went to reboot; my desktop only booted halfway and stopped at a Windows XP Professional window asking for a user name and pass code. I entered the name and pass code and was denied access; it said I was not a member of Windows XP, and now I need to pay to have the system restored. I'm upset that I cannot get any help to simply create a working account and log on. PLEASE HELP … my email address, if you can send me info or a site to go to that will give me free membership. I also need virus-protection software for Windows 2000, like Trend or Norton; I have my own product key to activate, but this system will not let me download the software. The error says Windows Service Pack 2 needs to be downloaded; I tried, and it won't even download. I need help to upgrade or convert Windows 2000 to Windows XP or GREATER.

                                      Thank You

                                   Gilbert Ayonayon

  3. John Langston says:

    The consolidation issue must be carefully studied. One really has to know one's application databases and the traffic to which they are subjected to make a clear case either for stand-alone database servers or for consolidation.

    Named instances and virtualization both have their places, but all too often those wishing to rush in that direction do so without understanding that either approach can demand a server with some beef to it in order to gracefully accommodate consolidation. Again, knowing how the application interacts with its database and being able to document the interaction is paramount.