Retrieving Packets Received Discarded Perfmon Counter From Multiple Servers–Updated February 2015


On the Retrieving Packets Received Discarded Perfmon Counter From Multiple Servers post, Tim commented that he had found the script very useful when troubleshooting discarded packets on Exchange 2013 DAG servers.  Using PowerShell, it is very easy to extend the search to discover which other servers are impacted by the packets received discarded issue. 

The initial version of the script echoed all results to the console screen, which was not ideal for analysing 200 servers.  When I wrote the script it was to look at a specific DAG, which is only a handful of servers. 

Getting the ServerName, NIC, OS uptime and Perfmon data exported to a single CSV was the challenge.  The solution is not specific to this issue; it is a technique that comes in handy whenever you want to pull in bits of data from multiple sources and then easily write them out to a CSV or otherwise manipulate the data.

For example, in Exchange, Get-Mailbox and Get-MailboxStatistics are two separate cmdlets, but we often want to combine aspects of their output for reporting purposes.  To make it more interesting, you may also need to pull in data from Get-ADUser and Get-MailboxFolderStatistics into the same report.

How do we get all of the information from multiple sources into a single place?

If only there was a way to customise PowerShell so that we could have custom objects of our own.  Well there is…


Custom PowerShell Objects

We can indeed create custom PowerShell objects and construct them with whatever we want to add to it.  You could maybe say it is a little Frankenstein **, but the marketing people would much prefer to use the word hybrid.

There is an example of this in the below PowerShell template script:

PowerShell Template

We create a new template object which can be copied and re-used multiple times.  This is a custom PSObject with the property names we choose. In the example below they are Element1, Element2 and Element3.  The names can be changed to suit your preference. 

$TemplateObject = New-Object PSObject | Select-Object Element1, Element2, Element3

And this template can be copied to make new objects as needed:

# Make a copy of the TemplateObject.  Then work with the copy…

$WorkingObject = $TemplateObject | Select-Object *

# Populate the WorkingObject with the necessary details.
$WorkingObject.Element1 = $Server


The above is just one example of creating custom objects – there are many more.
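To show how the template pieces above fit together end to end, here is a hypothetical sketch: copy the template once per server, populate the copy, and export everything to CSV.  The server names, property names and file path are made up for illustration and are not from the actual script.

```powershell
# Template object with illustrative property names (placeholders, not the script's own).
$TemplateObject = New-Object PSObject | Select-Object ServerName, DaysUptime, InterfaceName

$Results = foreach ($Server in "EXCH-01", "EXCH-02")
{
    # Select-Object * returns a fresh copy, so the template itself stays untouched.
    $WorkingObject = $TemplateObject | Select-Object *

    # Placeholder values; the real script would gather these from WMI and Perfmon.
    $WorkingObject.ServerName    = $Server
    $WorkingObject.DaysUptime    = 12
    $WorkingObject.InterfaceName = "NIC1"

    # Emitting the object adds it to $Results.
    $WorkingObject
}

$Results | Export-Csv -Path .\Discards.csv -NoTypeInformation
```

Because each copy is an independent object, populating one does not affect the others – that is what makes the template approach safe inside a loop.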


Updated Script For Packets Received Discarded

In version 4, the script now outputs to a CSV file.  The script will add one line to the CSV per selected NIC, so if a server has two NICs there will be two lines for that server.  This allows filtering on discarded packets and server name in Excel.  A server may have one good NIC and one bad one, for example. 

The below is an excerpt showing the use of the custom object. 

Using Custom PowerShell Object To Store Data From Multiple Sources
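As the excerpt itself is shown as an image, here is a minimal sketch of the same idea: one custom object per NIC, combining WMI uptime with the Perfmon counter before exporting to CSV.  The property names and the exact counter path are assumptions based on the post, not the script's own code.

```powershell
$TemplateObject = New-Object PSObject | Select-Object ServerName, DaysUptime, InterfaceName, InterfaceDrops

$Server = $env:COMPUTERNAME

# OS uptime from WMI.
$OS     = Get-WmiObject Win32_OperatingSystem -ComputerName $Server
$Uptime = (Get-Date) - $OS.ConvertToDateTime($OS.LastBootUpTime)

# Perfmon data for every NIC on the server.
$Counters = Get-Counter "\Network Interface(*)\Packets Received Discarded" -ComputerName $Server

$Report = foreach ($Interface in $Counters.CounterSamples)
{
    # Copy the template, then populate the copy from the different sources.
    $WorkingObject = $TemplateObject | Select-Object *

    $WorkingObject.ServerName     = $Server
    $WorkingObject.DaysUptime     = [Math]::Round($Uptime.TotalDays, 1)
    $WorkingObject.InterfaceName  = $Interface.InstanceName
    $WorkingObject.InterfaceDrops = $Interface.CookedValue

    $WorkingObject   # one line per NIC, as described above
}

$Report | Export-Csv -Path .\Packets-Received-Discarded.csv -NoTypeInformation
```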


Is this the technique you typically use, or what other interesting approaches do you have to this? 


Cheers,

Rhoderick

** – Though remember that Frankenstein was the creator and not the monster…

Comments (3)

  1. anonymouscommenter says:

    thanks

  2. DJ Grijalva says:

    Your custom object creation might run slower than needed, especially when processing 200 servers. I usually do something like this:

    #Create the array
    $Data = New-Object System.Collections.ArrayList

    #Create Custom Object
    $Object = New-Object PSCustomObject -Property @{
        ServerName     = $Server
        DaysUptime     = $UptimeDays
        OSInstallDate  = $OSInstallDate.ToShortDateString()
        InterfaceName  = $Interface.InstanceName
        InterfaceDrops = $Interface.CookedValue
    }

    #Add Object to Array
    [void]$Data.Add($Object)

    Also, avoid using "+=", you can read why here
    http://powershell.org/wp/2013/09/16/powershell-performance-the-operator-and-when-to-avoid-it/
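    A quick way to see the difference DJ describes is to time array "+=" against ArrayList.Add for the same number of elements.  This is a generic timing sketch, not part of the script; the element count of 10,000 matches the linked article.

    ```powershell
    $Count = 10000

    # "+=" recreates the array on every iteration, so the cost grows with size.
    $ArrayTime = Measure-Command {
        $Data = @()
        for ($i = 0; $i -lt $Count; $i++) { $Data += $i }
    }

    # ArrayList.Add appends in place.
    $ListTime = Measure-Command {
        $Data = New-Object System.Collections.ArrayList
        for ($i = 0; $i -lt $Count; $i++) { [void]$Data.Add($i) }
    }

    "Array +=  : $($ArrayTime.TotalMilliseconds) ms"
    "ArrayList : $($ListTime.TotalMilliseconds) ms"
    ```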

  3. Nice feedback – thanks DJ!

    I’m personally not too concerned with the execution latency, but totally agree that others may be. The link mentions processing 10,000 and 15,000 elements. Most customers don’t have that many Exchange servers :0)

    Will see if I get time to incorporate that for the next version and check what the processing time difference is.

    For a lot of my customers, they are global and it is the network latency that is the killer. PowerShell is like lightning compared to pulling data from Singapore to Canada.

    Cheers,
    Rhoderick
