I am sure many of you have at one time or another wanted to do a deep traversal (subfolder) search on the MAPI Public Folder Tree. However, as many of you probably know, Exchange does not allow deep traversal searches on the MAPI Public Folder Tree: http://support.microsoft.com/default.aspx?scid=kb;en-us;254911. The reason is that when the Public Folder system was designed, load balancing was to be accomplished by placing only some of the content on some of the servers. For example, suppose there are three folders and three servers: Folder1 might be replicated between servers 1 and 2, Folder2 between servers 2 and 3, and Folder3 between servers 1 and 3. If a client created a search folder that searched all three folders simultaneously, it would get different results depending on which Public Folder server it happened to be connected to. To avoid this, search folders cannot be created that search the content of multiple folders; a deep traversal (subfolder) search is simply a special case of “multiple folders”.
One way around this “limitation” is to set up a non-MAPI Public Folder Tree, because you can run a deep traversal (subfolder) search against that: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/e2k3/e2k3/_exch2k_specifying_a_deep_traversal.asp?frame=true. However, you cannot view this Public Folder Tree via MAPI (i.e., Outlook). So what is the difference, you ask? The thinking behind Application Public Folder Stores was that, for the most part, folder content would not be replicated to any other server. There is no actual architectural limitation enforcing this; Exchange simply foresaw that organizations would create only one Application Public Folder Store, so replication would not be an issue. Users can therefore create search folders that search the content of multiple folders. This, however, steps you right back into the original problem: the search folder itself is replicated among servers, but its content is always generated locally. Also, clients are never referred to a different server; to the client, a search folder always appears to have its exclusive replica on the server presently being queried. Users should therefore be aware that they will get different results depending on which server they contact for the content, and replication latency adds yet another dimension to the differing results.
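To make the deep traversal concrete, here is a minimal sketch of building the WebDAV SEARCH request body that the linked MSDN article describes, using the documented SCOPE('DEEP TRAVERSAL OF "..."') SQL clause. The server name and folder path used below are hypothetical examples, and actually sending the request (e.g. via http.client with the SEARCH verb) is only indicated in comments:

```python
# Sketch: build the XML body for an Exchange WebDAV SEARCH that performs
# a deep traversal of an application (non-MAPI) public folder tree.
# Server/folder names here are hypothetical placeholders.

from xml.dom.minidom import parseString


def build_deep_traversal_search(folder_url: str) -> str:
    """Return a WebDAV SEARCH body using a DEEP TRAVERSAL scope."""
    # Exchange's SQL dialect: select a couple of DAV: properties from
    # the folder and all of its subfolders, items only (not folders).
    sql = (
        'SELECT "DAV:displayname", "DAV:href" '
        f'FROM SCOPE(\'DEEP TRAVERSAL OF "{folder_url}"\') '
        'WHERE "DAV:iscollection" = false'
    )
    return (
        '<?xml version="1.0"?>'
        '<D:searchrequest xmlns:D="DAV:">'
        f'<D:sql>{sql}</D:sql>'
        '</D:searchrequest>'
    )


body = build_deep_traversal_search(
    "http://exchsvr/public/AppTree/ProjectDocs/")  # hypothetical URL
parseString(body)  # sanity check: the request body is well-formed XML

# The body would then be sent as a SEARCH request, e.g.:
#   conn = http.client.HTTPConnection("exchsvr")
#   conn.request("SEARCH", "/public/AppTree/ProjectDocs/", body,
#                {"Content-Type": "text/xml"})
```

Note that issuing the same query with a SHALLOW TRAVERSAL scope (or against a MAPI public folder tree) would restrict or refuse the subfolder search, which is exactly the distinction discussed above.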