I'm going to approach this topic over a number of posts, as it's something I've been thinking about rather a lot lately.
Basically, the challenge is to find out what impact a change to the business environment will have - positive or negative - and then to use that information either to justify making the change in the first place (so it's not really measuring business impact, but estimating the future business impact of an impending change), or retrospectively, to decide whether some earlier change was a good thing (and maybe whether it should continue).
Most of the time you read about managing business impact, reducing cost, improving flexibility and so on, it will be coming from someone trying to sell you something - an IT supplier claiming that the latest version of this or that will solve all sorts of problems (some of which you don't even know exist yet), or an IT or business analyst selling you their insight and knowledge, without which you're bound to fail and wind up on the scrapheap, counting all those missed opportunities you just couldn't see at the time.
Numerous terms have arisen to describe this impact, or to frame a way of measuring its scale. Just a few examples:
TCO - Gartner Group coined the "Total Cost of Ownership" term in the late 1980s, to describe the cost of running a whole IT system, not just the cost of buying it or implementing it in the first place. It's one of the most-used terms when talking about the benefits of some new IT system, partly because most businesses would see a benefit in reducing operational costs... and so assume that TCO reduction is inevitably a good thing. The irony is that, in my experience at least, many businesses don't really know what their costs are (other than at a high level), so measuring a change in TCO is going to be difficult to do at any specific level.
Take support costs as an example - if a new project aims to reduce the cost of keeping everything running, the only way you'll know whether it was effective is to know what the true cost was in the first place. I've seen businesses that can tell you exactly how much it costs to provide really specific services to their users - like $10 a month to put a fixed phone on a user's desk - and so can estimate far more accurately how much of a saving will be generated by rationalising or improving the current systems.
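To make that concrete, here's a minimal sketch of the point: a TCO saving can only be measured against an itemised baseline. All the service names and figures below are invented for illustration (only the $10/month desk phone comes from the example above).

```python
# A TCO saving is only measurable if you know the per-service cost first.
# All figures here are hypothetical.

def monthly_tco(services):
    """Sum per-user monthly costs across itemised services."""
    return sum(cost * users for cost, users in services.values())

# Baseline: (monthly cost per user, number of users) - made-up numbers
before = {
    "desk_phone": (10.00, 500),   # e.g. $10/month to put a phone on a desk
    "mailbox":    ( 4.50, 500),
    "helpdesk":   ( 6.00, 500),
}

# After rationalising: softphones replace desk phones, cheaper support contract
after = {
    "softphone":  ( 3.00, 500),
    "mailbox":    ( 4.50, 500),
    "helpdesk":   ( 5.00, 500),
}

saving = monthly_tco(before) - monthly_tco(after)
print(f"Monthly saving: ${saving:,.2f}")  # prints "Monthly saving: $4,000.00"
```

Without the `before` breakdown, the $4,000 figure simply couldn't be claimed with any confidence - which is the trap many businesses fall into.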
ROI - a catch-all term for the return on any investment and (in measuring terms at least) the time frame for that return. Just as one way of making more money is to reduce the cost of operations, investing in something new that returns more money to the business is a clear way of growing. The downside of looking for an ROI in every investment, however, is that the knock-on return may show up in some associated project which you might not be expecting, or measuring, right now. What I mean is that a change to the business might not bring about any ROI in itself (e.g. increasing capacity on the network), but it will allow other projects (like deploying a new application) to be more effective.
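The network-upgrade point can be sketched numerically. The ROI and payback formulas below are the standard textbook ones, but every figure is invented for illustration:

```python
# A hedged sketch of ROI and payback, with made-up numbers.
# The point: an enabling project can look like a dead loss on its own,
# yet pay back handsomely via the projects it enables.

def simple_roi(gain, cost):
    """Return on investment as a fraction: (gain - cost) / cost."""
    return (gain - cost) / cost

def payback_months(cost, monthly_return):
    """Months until cumulative returns cover the initial cost."""
    months, recovered = 0, 0.0
    while recovered < cost:
        recovered += monthly_return
        months += 1
    return months

network_upgrade_cost = 50_000    # enabling project: no direct return of its own
new_app_gain_per_month = 5_000   # return realised only via the new application

# Measured in isolation, the upgrade's ROI looks terrible...
print(simple_roi(0, network_upgrade_cost))           # -1.0, i.e. -100%

# ...but counted together with the project it enables, over two years:
total_gain = new_app_gain_per_month * 24
print(simple_roi(total_gain, network_upgrade_cost))  # 1.4, i.e. 140%
print(payback_months(network_upgrade_cost, new_app_gain_per_month))  # 10
```

Which of those two ROI figures you report depends entirely on where you draw the boundary around "the investment" - exactly the measurement problem described above.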
TEI - Forrester Research came up with this one, possibly in answer to the noise Gartner was making about their TCO model... though it does go further than just looking at cost. "Total Economic Impact" tries to correlate cost, benefit and (most importantly, perhaps) the future flexibility that might come about from making some change, with the risk inherent in doing so.
Even when thinking about the financial models for justifying expenditure (let's assume it's a new software deployment, which will have direct costs - licenses - and indirect costs - the time of people to carry out testing of the software, training time for end users etc), it's very easy to get caught up in thinking too narrowly about the project in question.
One concept that stands out to me when talking about IT investment is that of opportunity cost - an economics term which isn't really measured as a value of cost at all: it's the missed opportunity itself. In basic terms, the opportunity cost of going to the cinema on a Saturday afternoon is missing the football. In that example, it's a straight choice - you can only do one of those things at that point in time, and the cost will be the missed opportunity to do the other. The element of choice is deciding which is going to be better - which will cost less, or perhaps which will return a higher degree of satisfaction.
Thinking about opportunity cost in business terms is a bit harder, since we often don't know what the missed opportunity is until we look back some time later and realise it then. To flip that idea on its head, let's say you want to measure how effective someone is at doing their job.
Just about every employer has measures in place to try to figure out how well the employees are doing - in relative terms, measuring their performance in comparison with their peers, or in financial terms, to decide if the resources being used to employ that person could be better used in a different area, or if more resources should be deployed to have more people doing that type of job.
Let's take the example of a restaurant. Making a successful business will depend on a whole load of relatively fixed factors - the location of the building, the decor and ambience of the place, for example - as well as lots of flexible things, like the quality and price of the food or the effectiveness of the service. There will even be external factors that the restaurant could do nothing about, except possibly anticipate - such as a change in fashion, or a competitor opening up across the street.
If the quality of the food is poor when compared to the price, the standard of service and the overall ambience of the place, then customers will be unhappy and will likely vote with their feet. If the food is consistently average but cheap, then people will come for that reason (just look at most fast food outlets). Each of these factors could be varied - raising the price of the not-so-nice food, or paying more for ingredients to get higher quality, or firing the chef and replacing him with someone who's more skilled - and they should make a difference, but the problem is in knowing (or guesstimating) what that difference will be before deciding on which factor to vary, and by how much.
When businesses look at how they invest in people, it's easy to question the amount spent on any specific role. In the restaurant case, would adding another chef make the quality better? Would it improve the time it takes to get meals out to customers (something that's maybe really important, if it's a lunchtime restaurant but maybe less so in some cosy neighbourhood trattoria)? Would the knock-on effect be worth the extra cost of employing the chef? And would the extra chef just get in the way of the existing one, reducing their individual effectiveness?
I've said to people at my own employers in the past that the only way they will really be able to measure how good a job my team does is to stop us doing it, then observe what happens two years later. So what if the restaurant reduced the number of waiting staff, and switched from using expensive, fresh ingredients to cheaper frozen stuff, in an effort to reduce cost? On one hand, the figures might look good, because the cost of supply has just dropped and the operational costs have been reduced too.
But the long-term impact might be that loyal customers drift away because the food's not the value it was before, or that a bad review arrives from an unexpected restaurant critic. At that point, it could take a huge effort to turn things around and rebuild the tarnished name of the place.
So what's the point of all this rambling? Well, in the next instalment I'll look at some of the TCO/ROI arguments around Exchange Server...