Benchmarking Windows 7

A very common request from clients starting a Windows deployment project is that we reliably and accurately demonstrate that Windows Vista or Windows 7 is worth upgrading to from Windows XP, especially if the purchase of new hardware is not part of the project; often, and logically, clients will try to re-use their existing hardware platform whenever possible in order to save money.  Consequently, I am regularly asked to prove that the newer version of Windows will run just as well as Windows XP on the same hardware.  Their reasoning is that they won’t upgrade the operating system on the existing hardware if the end-user experience is worse in terms of performance, regardless of how good the new features of the operating system are.  The problem with this is two-fold: one part is a technical issue and the other is pure fantasy...


Attempting to benchmark and compare two different operating systems (yes, I know that each version of Windows is an evolution of the previous one, but it is still complicated) and to provide accurate results is an extremely difficult task.  Unless it is done right, the results can often be speculative and can be turned upside down purely depending on your point of view.  It is also dodgy territory to get into because of the sheer number of factors that need to be taken into account for the test to be of any value, and then trying to explain the results to a layperson can be trickier still.  This is because benchmarking a computer is not simply about measuring the raw speed of the CPU, in the same way that horsepower alone cannot be used to gauge whether one car is quicker than another; you also need to consider factors such as the weight, the gearbox ratios and the torque.

There are plenty of free tools on the Internet that will measure your computer’s speed, but more than a few are of little use because they simply measure individual parameters such as CPU speed and RAM read/write rates.  To benchmark a computer properly you need something that measures hundreds of different aspects of the hardware and of the operating system installed on it in order to produce a reliable result; even things such as the BIOS software can have a profound effect on the results of the benchmarking tests.
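To illustrate what I mean by a single-parameter measurement, here is a minimal Python sketch (my own illustration, not any of the tools or guides mentioned in this article) that times a tight CPU loop and a memory pass.  Numbers like these are easy to produce, but they tell you nothing about disk, graphics, drivers or how the operating system behaves under a real mixed workload:

```python
import time

def time_cpu(iterations=5_000_000):
    """Time a tight integer loop: a crude proxy for raw CPU speed."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return time.perf_counter() - start

def time_memory(size_mb=128):
    """Time writing and then scanning a large buffer: a crude proxy for RAM throughput."""
    start = time.perf_counter()
    data = bytearray(size_mb * 1024 * 1024)   # write: allocate and zero-fill the buffer
    _ = sum(data[::4096])                     # read: touch one byte per 4 KB page
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"CPU loop:    {time_cpu():.2f} s")
    print(f"Memory pass: {time_memory():.2f} s")
    # Two numbers like these say nothing about disk, graphics, drivers,
    # or how the operating system schedules a real mixed workload.
```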


Because this is such a hot topic, and given that Windows 7 is just around the corner, Microsoft has published some benchmarking guides so that you can start measuring the performance of your systems with Windows 7 right now.  I have included the link to the Windows 7 guide below, along with the (now defunct) Windows XP guide.  The Windows XP guide is there because, after all, the only real way to gauge the performance of your system is to compare it to a different operating system running on the exact same hardware.


Benchmarking on Windows XP: www.microsoft.com/whdc/archive/benchmark.mspx

Performance Testing Guide for Windows: www.microsoft.com/whdc/system/sysperf/Win7Perf.mspx
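To show that principle in practice (this is just my own hypothetical illustration, not something taken from either guide), a fixed, repeatable workload can be scripted and run under each operating system installed on the same machine, with only the wall-clock times compared.  A rough Python sketch of the idea:

```python
import os
import tempfile
import time
import zlib

def timed(label, func):
    """Run func once and print the elapsed wall-clock time."""
    start = time.perf_counter()
    func()
    print(f"{label:<16} {time.perf_counter() - start:.2f} s")

def disk_workload(size_mb=64):
    """Write a scratch file, flush it to disk, read it back, then delete it."""
    chunk = os.urandom(1024 * 1024)
    path = os.path.join(tempfile.gettempdir(), "bench_scratch.bin")
    with open(path, "wb") as fh:
        for _ in range(size_mb):
            fh.write(chunk)
        fh.flush()
        os.fsync(fh.fileno())
    with open(path, "rb") as fh:
        while fh.read(1024 * 1024):
            pass
    os.remove(path)

def cpu_workload(size_mb=32):
    """Compress a buffer of random data: a CPU- and memory-bound task."""
    zlib.compress(os.urandom(size_mb * 1024 * 1024), 6)

if __name__ == "__main__":
    timed("Disk write/read", disk_workload)
    timed("Compression", cpu_workload)
```

The point is not the absolute numbers, but that the identical workload, data sizes and hardware are used for both runs, so any difference can reasonably be attributed to the operating system and its drivers.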


Oh, and the “pure fantasy” I mentioned earlier...  Well, Windows Vista has had a somewhat “turbulent” life in terms of public perception.  I have always honestly told clients that my personal experience with Windows Vista has been excellent, and that it has always been fast for me.  Likewise, when I have worked on Windows Vista deployment projects this has also always been the case: the deployed computers always performed better than the clients expected.  However, when clients talked about benchmarking Windows Vista they were often very dubious of the results and tended to disregard them, purely because they had it in their head that Windows Vista was (and is) a terribly slow beast.  Overcoming those personal views was actually extremely hard to do, regardless of how compelling the benchmarking facts and statistics actually were.