Examples of how customer feedback shapes Exchange: Max Hop Count

I occasionally get requests for examples of how customer feedback changed something in Exchange. We live and breathe customer feedback, from individual devs, testers, and PMs up to our VP, but that's probably not obvious unless you're a member of the group and can see how the feedback actually gets integrated into the product.

As someone who's spent a significant part of my work life over the last several years working directly with our customers, I thought that discussing those kinds of changes might make an interesting series of blog posts. So this is the first in that line (and here's to hoping I don't pull a "History of the World Part I" on this! :-)

The one I'll start with is a change that happened near the end of the Exchange 2003 ship cycle, about two to three months before RTM. (In a server product, that time period has a very high bar for fixes: all changes are under war-team control, and we're focused on running the product at increasing scale, getting great uptime, and so on.) At the time, the maximum hop count in Exchange 2003 was 15. We'd thought about changing it before, but 15 was a standard on the net and we didn't want to go against standards without good reason.

In April, we started to get complaints from JDP customers about the hop count (JDP was the precursor to TAP, the program I now own). Interestingly enough, the complaints came up because the path we use to communicate with the JDPs, a distribution list, involves an unusually high number of hops.

We have one distribution list on a box on the internet, and each customer has one entry on that DL (such as "e2k3jdp@mycompany.com"). Each customer entry points to either a distribution list or public folder in their organization, and they own the membership or view rights to the DL/PF so that they can control who works on the TAP program within their organization without requiring us to make a change. 

So for a message to get from us to a customer (or from one customer to another through the DL), it would have to go through a few hops at Microsoft, get to our box on the net, then go through a couple of hops to the customer, and then either a few hops to a PF/DL and then some more hops to expand the DL to the individuals. Depending on the complexity of the routing environments on our side or on the customers' side, this could add up to more than 15 (and we often configure unnecessarily complex environments in our Exchange configuration at MS for dogfooding purposes).
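To make the hop accumulation concrete, here's a minimal sketch of the general mechanism at play: each SMTP relay stamps a "Received:" header onto the message, and a server can reject a message once that count exceeds its configured limit. This is a simplified illustration (the function name, sample message, and addresses are made up for the example), not Exchange's actual implementation.

```python
# Sketch: each SMTP hop adds a "Received:" header; a server can NDR the
# message once the count exceeds its configured maximum hop count.
from email import message_from_string

MAX_HOP_COUNT = 15  # the pre-change Exchange 2003 default

# A made-up message that has passed through three relays so far.
raw_message = """\
Received: from relay3.example.com by mail.contoso.com; Mon, 7 Apr 2003 10:03:00 -0800
Received: from relay2.example.com by relay3.example.com; Mon, 7 Apr 2003 10:02:00 -0800
Received: from relay1.example.com by relay2.example.com; Mon, 7 Apr 2003 10:01:00 -0800
Subject: Test
From: sender@example.com
To: recipient@contoso.com

Hello.
"""

def exceeds_hop_limit(raw: str, limit: int = MAX_HOP_COUNT) -> bool:
    """Return True if the message has already passed through more hops than allowed."""
    msg = message_from_string(raw)
    return len(msg.get_all("Received", [])) > limit

print(exceeds_hop_limit(raw_message))           # 3 hops, well under 15
print(exceeds_hop_limit(raw_message, limit=2))  # over a limit of 2
```

With a chain like the one described above (a few internal hops, the internet relay, the customer's gateway hops, then DL/PF expansion hops), it's easy to see how the total can creep past 15.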

In April 2003, a couple of customers in the JDP made configuration changes on their side that caused many messages sent to the JDP DL to NDR, because the hop count to the individual recipients exceeded 15. So even though we'd originally decided not to change the hop count, we re-evaluated and re-activated the bug we'd had tracking the issue. We conferred with the JDPs on what the right number was and settled on 30 to give us some headroom.

Even a change as seemingly simple as this can't be made without considering the various install scenarios, however: what if a customer had already changed the default to something other than 15? As a result, we implemented the following logic, which runs during Exchange 2003 setup (whether a new install, an upgrade, or a reinstall):

  • If new install, hopcount = 30
  • If upgrade/reinstall and if hopcount != 15, leave it as is (customer has changed the default)
  • If upgrade/reinstall and if hopcount = 15, set it to 30 (we want all servers in the org to be configured the same unless the customer has made a change)
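The three rules above can be sketched as a small function. This is just an illustration of the decision logic (the function and constant names are mine); the real work happens inside Exchange 2003 setup, not in standalone code like this.

```python
OLD_DEFAULT = 15  # the Exchange 2003 default before the change
NEW_DEFAULT = 30  # the value settled on with the JDP customers

def hop_count_after_setup(is_new_install: bool, existing_hop_count: int = OLD_DEFAULT) -> int:
    """Return the hop count a server should end up with after setup runs."""
    if is_new_install:
        return NEW_DEFAULT            # new install: use the new default
    if existing_hop_count != OLD_DEFAULT:
        return existing_hop_count     # customer changed the default: leave it as is
    return NEW_DEFAULT                # still at the old default: bump it to 30

print(hop_count_after_setup(True))       # → 30
print(hop_count_after_setup(False, 15))  # → 30
print(hop_count_after_setup(False, 22))  # → 22
```

The nice property of this logic is that an org that never touched the setting ends up uniformly at 30 after upgrades, while any deliberate customization survives untouched.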

This is a pretty small change technically, but it has a big impact, and I think it's a good illustration of how customer feedback is a critical part of our design process throughout even the late stages of the product cycle.