Troubleshooting EWS requests using IIS logs with Logparser

The problem:

The IIS logs contain very valuable data that can greatly assist in troubleshooting Exchange Web Services (EWS) clients. The problem is that all of that data is packed into a single field (cs-uri-query) in the IIS logs.

An example cs-uri-query field for an EWS call:

;RC:111111ac-1111-1111-1111-111112c1111;Init>>Conn:0,HangingConn:0,AD:30000/30000/0%,CAS:54000/54000/0%,AB:30000/30000/0%,RPC:36000/36000/0%,FC:1000/0,Policy:DefaultThrottlingPolicy_3333333-3333-3333-3333-333333333333,Norm,Sub:5000/0;SoapAction=GetUserAvailability;AddressCount=1;MessageId=urn:uuid:db01111-da7c-4f46-b89d-1111be7aaaaa;Requester=S-3-3-33-3333223333-3333333333-3333673333-3323333;local=1;Threads.Worker.Available=1198;Threads.Worker.InUse=0;Threads.IO.Available=1194;Threads.IO.InUse=1;Failures=0;MailboxRPC.TimeTaken=0;MailboxRPC.RequestCount=0;AD.TimeTaken=0;AD.RequestCount=2;Request.CPU.Main=0;Local.FirstThreadExecute=0;Local.LongPole.Elapsed.TimeTaken=53;Local.LongPole.Elapsed.Destination=server.company.com;LocalLongPole.TimeTaken=3;LocalLongPole.RequestCount=28;LocalLongPole.Destination=server2.company.com;TotalLocal.TimeTaken=53;TotalLocal.RequestCount=1;Thread.CPU.LongPole.TimeTaken=15;Thread.CPU.LongPole.Destination=LocalRequest.Execute;Request.Phase.PreQuery=5;Request.Phase.RequestDispatcher.BeginInvoke=0;Request.Phase.RequestDispatcher.Complete=54;Request.Phase.PostQuery=0;Request.CPU.Total=15;TimeInAS=59;[C]Queues:0msec/Execute:59msec;End(61ms)>>Conn:1,HangingConn:0,AD:30000/29996/1%,CAS:54000/53940/1%,AB:30000/30000/0%,RPC:36000/35972/1%,FC:1000/0,Policy:DefaultThrottlingPolicy_333333-3333-3333-3333-aaaaaaaaaaaa,Norm[Resources:(DC)dc.company.comHealth:-1%,HistLoad:0),(DC)dc2.company.com(Health:-1%,HistLoad:0),(Mdb)DatabaseName(Health:-1%,HistLoad:0),],Sub:5000/0;
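Everything in that field is a semicolon-separated list of key=value pairs, so any one of them can be carved out with Log Parser's SUBSTR and INDEX_OF functions: jump to the key, then take everything up to the next ';'. As a minimal warm-up (the c:\logs path is just a placeholder), this pulls out nothing but the SoapAction value:

Logparser.exe -i:iisw3c "SELECT substr(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')),0,index_of(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')),';')) AS [SoapAction] FROM c:\logs\*.log WHERE cs-uri-query LIKE '%SoapAction%'"

The full query below applies this same pattern to several keys at once.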

 

The solution:

We will use Log Parser, purely for speed.

Download Log Parser

 

The LogParser Query:

It took a bit of time to work out the logic, but this query works well.

Logparser.exe -i:iisw3c "SELECT time, date, cs-username, substr(substr(cs-uri-query, index_of(cs-uri-query, 'Impersonate')),0,index_of(substr(cs-uri-query, index_of(cs-uri-query, 'Impersonate')),';')) AS [Impersonate], substr(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')),0,index_of(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')),';')) AS [SoapAction], substr(substr(cs-uri-query,index_of(cs-uri-query, 'RC')),0,index_of(substr(cs-uri-query, index_of(cs-uri-query, 'RC')),';')) AS [RC Data], cs-uri-stem, substr(substr(cs-uri-query, index_of(cs-uri-query, 'Execute')),0,index_of(substr(cs-uri-query, index_of(cs-uri-query, 'Execute')),';')) AS [Time to Execute] INTO c:\logs\out.csv FROM c:\logs\*.log Where cs-username = 'Domain\ServAcct' AND cs-uri-query LIKE '%SoapAction%'"

Note: output goes into the file specified after the INTO clause above, and input is taken from the FROM clause; notice we are grabbing all .log files in a directory. We are also filtering for requests coming from one user, since applications typically authenticate as one service account and impersonate the intended target.
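The WHERE clause is easy to adjust to whatever you are chasing. As a rough sketch (the operation, date, account name, and paths here are placeholders), this counts only FindItem calls from that account on a single day:

Logparser.exe -i:iisw3c "SELECT COUNT(*) AS Requests FROM c:\logs\*.log WHERE cs-username = 'Domain\ServAcct' AND cs-uri-query LIKE '%SoapAction=m:FindItem%' AND TO_STRING(date, 'yyyy-MM-dd') = '2012-01-01'"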

 

What this query does:

  • Filters to requests sent by a specific user (Domain\ServAcct above)
  • Splits cs-uri-query to pull out:
    • who that user is impersonating
    • what the SoapAction is
    • the RC field
    • the execution time

It took me a while to understand the built-in string functions needed to carve out this data, but the pattern is the same for every field.
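Nesting SUBSTR and INDEX_OF like this makes the one-liner hard to read. One way to tame it is to save the query to a file and point Log Parser at it with the file: syntax; ewsquery.sql is just a name I picked:

Logparser.exe -i:iisw3c file:ewsquery.sql

where ewsquery.sql contains the same query as above, simply broken across lines:

SELECT time, date, cs-username,
       substr(substr(cs-uri-query, index_of(cs-uri-query, 'Impersonate')), 0,
              index_of(substr(cs-uri-query, index_of(cs-uri-query, 'Impersonate')), ';')) AS [Impersonate],
       substr(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')), 0,
              index_of(substr(cs-uri-query, index_of(cs-uri-query, 'SoapAction')), ';')) AS [SoapAction],
       substr(substr(cs-uri-query, index_of(cs-uri-query, 'RC')), 0,
              index_of(substr(cs-uri-query, index_of(cs-uri-query, 'RC')), ';')) AS [RC Data],
       cs-uri-stem,
       substr(substr(cs-uri-query, index_of(cs-uri-query, 'Execute')), 0,
              index_of(substr(cs-uri-query, index_of(cs-uri-query, 'Execute')), ';')) AS [Time to Execute]
INTO c:\logs\out.csv
FROM c:\logs\*.log
WHERE cs-username = 'Domain\ServAcct' AND cs-uri-query LIKE '%SoapAction%'

Each SELECT column follows the same pattern: jump to the key with INDEX_OF, then SUBSTR everything up to the next semicolon.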

The output:

 

 

How fast is this?

On my (rather speedy) laptop with about 650 megs of logs:

Statistics:

-----------

Elements processed: 827827
Elements output:    325218
Execution time:     11.02 seconds

 

Great, I see the data in Excel... how can I make it useful?

 

Sorting and charting in Excel is extremely helpful. I'll go through a few scenarios that shed some light on our issue.

Counts for each user:

 

Row Labels                       Count of SoapAction
Impersonate:Domain\User1         1102
    SoapAction=m:ConvertId       293
    SoapAction=m:CreateItem      1
    SoapAction=m:FindFolder      22
    SoapAction=m:FindItem        310
    SoapAction=m:GetFolder       11
    SoapAction=m:GetItem         377
    SoapAction=m:UpdateItem      88
Impersonate:Domain\User2         1013
    SoapAction=m:ConvertId       285
    SoapAction=m:CreateItem      1
    SoapAction=m:FindFolder      22
    SoapAction=m:FindItem        352
    SoapAction=m:GetFolder       11
    SoapAction=m:GetItem         286
    SoapAction=m:UpdateItem      56

 

Counts for each Action:

 

Row Labels                        Count of SoapAction
SoapAction=GetUserAvailability    31
SoapAction=m:ConvertId            65685
SoapAction=m:CreateItem           1560
SoapAction=m:DeleteItem           1013
SoapAction=m:ExpandDL             12
SoapAction=m:FindFolder           49536
SoapAction=m:FindItem             102752
SoapAction=m:GetFolder            24804
SoapAction=m:GetItem              70584
SoapAction=m:ResolveNames         76
SoapAction=m:UpdateItem           9165
Grand Total                       325218

 

 

That looks great, how do I get that level of detail out of the logs?

Note: I am using Excel 2010

 

To Calculate Counts for each user:

 

  • Open the .csv output in Excel
  • Without selecting any cells, click Insert, then PivotTable
  • At "Select a table or range", the default should be all cells; choose New Worksheet and click OK
  • On the new worksheet, check Impersonate and SoapAction under the PivotTable Field List (upper right)
  • Drag SoapAction into the Values box at the bottom right so it shows as "Count of SoapAction"
  • Fix the sort so the user with the highest number of requests is listed first:
    • Highlight all of the values in the "Count of SoapAction" column (leave out the header and the grand total at the bottom)
    • Click the Data tab
    • Click Z to A (next to Sort, upper right) to sort largest to smallest
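If you would rather skip the pivot table for this rollup, roughly the same per-user counts can come straight from Log Parser by pointing it at the CSV it just produced (out.csv and the Impersonate/SoapAction column names come from the query earlier in this post):

Logparser.exe -i:CSV "SELECT Impersonate, SoapAction, COUNT(*) AS Requests FROM c:\logs\out.csv GROUP BY Impersonate, SoapAction ORDER BY Requests DESC"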

To Calculate Counts for each Action:

 

  • Open the .csv output in Excel
  • Without selecting any cells, click Insert, then PivotTable
  • At "Select a table or range", the default should be all cells; choose New Worksheet and click OK
  • On the new worksheet, check SoapAction under the PivotTable Field List (upper right)
  • Drag SoapAction into the Values box at the bottom right so it shows as "Count of SoapAction"
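The per-action totals can be reproduced from the same CSV without Excel, too; this sketch assumes the out.csv produced earlier:

Logparser.exe -i:CSV "SELECT SoapAction, COUNT(*) AS Requests FROM c:\logs\out.csv GROUP BY SoapAction ORDER BY Requests DESC"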

 

Now look at the worksheet with the PivotTable; you should see the data sorted correctly. Using the same technique, you can sort or chart the data by time to get an idea of when the bulk of the requests come in.
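Log Parser can also bucket the requests by time for you before they ever hit Excel. This sketch counts requests per hour straight from the IIS logs, reusing the same filter as the main query (QUANTIZE groups the timestamps into 3600-second buckets; paths and the account name are placeholders):

Logparser.exe -i:iisw3c "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(*) AS Requests FROM c:\logs\*.log WHERE cs-username = 'Domain\ServAcct' AND cs-uri-query LIKE '%SoapAction%' GROUP BY Hour ORDER BY Hour"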

 

-CaseyS