Hyper-V and VHD Performance – Dynamic vs. Fixed

My name is Tim Litton; I work as a Program Manager within the Microsoft Windows Server team, and my particular area of focus is performance optimization for Hyper-V.


With the recent release of Hyper-V, customers are starting to ask us how to configure Hyper-V to get the best performance.  It’s generally recognized that there is overhead in running a virtualized environment, but the question that really needs to be answered is: how much?


With this in mind, I thought I’d share some of our recent testing of Hyper-V and how disk workloads perform when using Fixed or Dynamic VHDs.  The goal here is to provide some data that backs up the tuning guidance that can be found here: http://www.microsoft.com/whdc/system/sysperf/Perf_tun_srv.mspx.


The following graph shows the relative performance for a number of different scenarios (with Dynamic VHD being the baseline).


[Graph: Hyper-V VHD Performance – Dynamic vs. Fixed]


Fixed VHD performs better than Dynamic VHD in most scenarios by roughly 10% to 15%, with the exception of 4K writes, where Fixed VHD performs significantly better.


We ran 16 virtual machines when performing these tests (see “How We Tested” below), with the goal of evaluating how well Hyper-V performed in the server consolidation scenario.  Being able to consolidate a number of physical machines onto a single machine, with the virtual machines still able to handle the load, is a very important design goal for Hyper-V.


The exact result that a customer is going to see will depend on quite a few variables (e.g. how large and how frequent the reads and writes are, and how many outstanding I/Os there can be at one time), so performing real-world testing is the best way to assess what impact virtualization will have.


Recently, QLogic published a benchmark for I/O throughput for storage devices going through Windows Server 2008 Hyper-V (http://www.qlogic.com/promos/products/hyper-v.aspx) that closely matches the native performance, thus demonstrating Hyper-V’s ability to bring the advantages of virtualization to large-scale datacenters.


How We Tested

Hardware: HP DL580 G5, 16 x 2.4 GHz (Intel E7340), 16 GB RAM


Disk: HP P800, 25 spindles


Virtual Machine Setup: 16 Virtual machines, each running Windows Server 2008 Enterprise Edition (64 bit), 1 CPU, 796 MB RAM


Testing Software: We used IO Meter (http://www.iometer.org) to generate the workload for the I/O system, with a maximum number of 8 outstanding I/Os per virtual machine to a 100MB file.
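To make the workload shape concrete, here is a minimal sketch of the same idea in Python: random 4 KB requests against a preallocated test file, with a pool of 8 workers approximating 8 outstanding I/Os. This is an illustration only, not IOMeter itself; the file size is scaled down from the 100 MB used in the tests, and all names here are hypothetical.

```python
import os
import random
import tempfile
from concurrent.futures import ThreadPoolExecutor

FILE_SIZE = 1 * 1024 * 1024   # scaled-down stand-in for the 100 MB test file
BLOCK_SIZE = 4 * 1024         # 4 KB requests, one of the sizes in the graph
OUTSTANDING = 8               # matches the 8 outstanding I/Os per virtual machine
REQUESTS = 256                # total requests to issue for this sketch

def random_write(path):
    # Write one 4 KB block of random data at a random, block-aligned offset.
    offset = random.randrange(0, FILE_SIZE // BLOCK_SIZE) * BLOCK_SIZE
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(os.urandom(BLOCK_SIZE))
    return BLOCK_SIZE

def run_workload():
    # Preallocate the test file, then issue requests from a pool of
    # OUTSTANDING workers so up to 8 I/Os can be in flight at once.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.truncate(FILE_SIZE)
        path = tmp.name
    try:
        with ThreadPoolExecutor(max_workers=OUTSTANDING) as pool:
            written = sum(pool.map(lambda _: random_write(path), range(REQUESTS)))
        return written  # total bytes written
    finally:
        os.unlink(path)

if __name__ == "__main__":
    run_workload()
```

Timing a loop like this under different VHD types gives a rough feel for the effect being measured, though IOMeter’s access-pattern and queue-depth controls are far more precise.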

Comments (11)
  1. Anonymous says:

    If you are in the midst of testing and deploying Virtual machines using the VHD file format on Hyper-V

  2. Anonymous says:


Are these fixed vs. dynamic tests run on Hyper-V or Hyper-V R2? It is my understanding that dynamic VHDs perform somewhat better in the R2 version due to tweaking.

    The tests in this blog post reflect Windows Server 2008 Hyper-V results. The VHD Performance white paper published by the Hyper-V team in 2010 compares Windows Server 2008 and Windows Server 2008 R2 VHD performance, covers Windows Server 2008 R2 VHD types, and provides guidance on choosing a storage container format. The same white paper also points to Tony Voellm’s blog for a side-by-side performance and scale comparison of Hyper-V releases. For general Windows Server 2008 R2 Hyper-V tuning guidance, please see the Performance Tuning Guidelines for Windows Server 2008 R2 white paper.

  3. Anonymous says:

    Tim Litton posted in his blog about performance optimization for Hyper-V, looking at fixed versus dynamic

  4. Anonymous says:

    Both at the Hyper-V day and at the DMCUG day there were performance questions about fixed disks versus dynamic

  5. Anonymous says:

    Tim Litton posted a blog about performance optimization for Hyper-V, looking at fixed versus dynamic VHDs. Here’s an excerpt: The following graph shows the relative performances for a number of different scenarios (with Dynamic VHD being the baseline).

  6. Anonymous says:

    We ran a Hyper-V day at the end of October, and collated a series of useful links and best practice guidance.

  7. Anonymous says:

    Audience : SharePoint Solution Architect, SharePoint Infrastructure Architect, Consultant, IT Administrator

  8. Anonymous says:

    Virtualization is great, whether for legacy systems, test environments, or hardware consolidation.

  9. Anonymous says:

    Catch the blog by the performance team around using Hyper-V and VHD files. Read it here.

  10. Anonymous says:

    Video #9 in the Windows Server 2008 Hyper-V series; this video is focused on the use of Pass Through

  11. John Keating says:

    Would you see the same performance degradation if you pre-grew the VHD (let’s say a 36 GB partition) with a filler file (let’s say 10 GB) and then deleted it? That way, you would get around the possible file fragmentation (which is probably causing the performance issue with the dynamic disks)?

    With Dynamic Disks there is an extra translation done to map the logical disk location within the Dynamic VHD to the physical backing file location, so Fixed Disks will always be faster than Dynamic Disks.

    Details on the VHD format can be found here:

    When writing to a new location on a Dynamic VHD, it may be necessary to grow the physical backing file, which means more I/Os are required to satisfy the request.
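    That extra translation can be sketched in Python. This is a simplified model, not the actual VHD on-disk format: real dynamic VHDs use a Block Allocation Table (BAT) of 2 MB blocks plus per-block sector bitmaps, which are omitted here, and the class and method names are hypothetical.

    ```python
    BLOCK_SIZE = 2 * 1024 * 1024  # dynamic VHDs allocate space in 2 MB blocks
    UNALLOCATED = None

    class DynamicVHD:
        def __init__(self, virtual_size):
            # One BAT entry per 2 MB virtual block; None means "not yet allocated".
            self.bat = [UNALLOCATED] * (virtual_size // BLOCK_SIZE)
            self.file_end = 0  # next free offset in the physical backing file

        def translate(self, virtual_offset):
            """Map a virtual offset to a backing-file offset, growing on demand.

            Returns (physical_offset, grew), where grew is True when the
            backing file had to be extended to satisfy this request.
            """
            index = virtual_offset // BLOCK_SIZE
            grew = False
            if self.bat[index] is UNALLOCATED:
                # First write to this block: append a new block to the
                # backing file, which costs extra I/Os on a real disk.
                self.bat[index] = self.file_end
                self.file_end += BLOCK_SIZE
                grew = True
            return self.bat[index] + virtual_offset % BLOCK_SIZE, grew
    ```

    A Fixed VHD skips the table lookup entirely (the physical offset equals the virtual offset), which is why it avoids both the translation cost on every access and the grow cost on first writes.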

    Tim Litton, Program Manager, Windows Server Performance.
