Use the Pipeline to Create Robust PowerShell Functions

Summary: Microsoft Windows PowerShell MVP, Don Jones, shows how to use the pipeline to create robust reusable functions.

Microsoft Scripting Guy, Ed Wilson, is here. Our guest blogger today is Don Jones.

Don Jones is a Windows PowerShell MVP Award recipient, and he is one of the world’s most well-known Windows PowerShell authorities. He blogs about PowerShell, writes about PowerShell, and offers private on-site Windows PowerShell classes. He recently released a book for Windows PowerShell beginners, helping them Learn Windows PowerShell in a Month of Lunches.

“There’s a reason,” I am constantly telling my Windows PowerShell students, “that it’s called PowerShell and not PowerScript.” Although Windows PowerShell obviously has an embedded scripting language, it is first and foremost a command-line shell built around the concept of a pipeline—not unlike ages-old UNIX shells. Given that focus on the pipeline, it only makes sense for the tools you write to take advantage of it. Unfortunately, if your past experience is mainly in scripting languages like VBScript, or even programming languages like VB or C#, the pipeline can be a bit elusive. Here are two rules to always try to follow to keep the pipeline firmly in focus as you write functions in Windows PowerShell:

  1. Never worry about where input is coming from.
  2. Never worry about where the output is going.

If your functions accept input from the pipeline, and output to the pipeline, then they’ll achieve maximum flexibility—not to mention all the PowerShell feng shui you could possibly desire.
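To see what that flexibility looks like with commands you already use, consider a quick illustration built entirely from built-in cmdlets (the output path C:\running-services.csv is an arbitrary choice for this sketch). Each command reads objects from the pipeline and writes objects back to it, and none of them cares where its input came from or where its output is headed:

# Each command reads from and writes to the pipeline; none of them
# knows (or cares) what comes before or after it.
Get-Service |
    Where-Object { $_.Status -eq 'Running' } |
    Sort-Object DisplayName |
    Export-Csv C:\running-services.csv -NoTypeInformation

Your own functions should slot into pipelines just as easily as Get-Service and Where-Object do here.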

Input from the Pipeline: Go Advanced

Windows PowerShell offers many different forms of a function, the basic unit of modularization within the shell. There are basic functions, filtering functions, and so on. Forget all of them except the pinnacle of function functionality: The advanced function—fondly called a “script cmdlet” by its biggest fans. Honestly, advanced functions aren’t all that different from their non-advanced peers—they just have a little bit more structure, which allows PowerShell to do some interesting work on your behalf.
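If you want to see just how small the structural difference is, here is a minimal sketch; the Get-Greeting and Get-GreetingAdvanced names are made up purely for illustration. The only additions in the advanced version are the [CmdletBinding()] attribute and a Param() block, and in return the shell wires up the common parameters (-Verbose, -Debug, -ErrorAction, and so on) for free:

# A basic function: it works, but gets no common parameters from the shell.
Function Get-Greeting {
    Param([string]$Name)
    "Hello, $Name"
}

# An advanced function: [CmdletBinding()] tells the shell to treat it like
# a cmdlet, so -Verbose, -Debug, -ErrorAction, and the other common
# parameters are added automatically.
Function Get-GreetingAdvanced {
    [CmdletBinding()]
    Param([string]$Name)
    Write-Verbose "Building a greeting for $Name"
    "Hello, $Name"
}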

Let’s say your goal is to write a function that retrieves management information from one or more computers. You don’t want your function worrying about where those computer names are coming from—Rule #1 says that the function shouldn’t care about its input. Instead, your function should simply implement a ComputerName parameter and accept one or more computer names. Where those names come from doesn’t really matter. For example, any of the following should work (assuming that you named your function Get-Info):

Get-ADComputer -filter * | Select @{label='computername';expression={$_.name}} | Get-Info

Get-Info -computername SERVER2,SERVER3

"localhost" | Get-Info

Get-Info -computername (Get-Content names.txt)

Get-Content names.txt | Get-Info

Why so many variations? Well, because that’s how most native PowerShell cmdlets work. You can never tell which of these patterns someone else might be used to, so you want to support all of them. Fortunately, an advanced function makes it easy to do so. Here’s an example where I’ve also included a second parameter, LogFile, which will be used to specify the name of an error log file (for computers that can’t be reached). That parameter defaults to C:\retries.txt, so you don’t even have to specify it, meaning that all of the previous examples will still work.

Function Get-Info {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [string[]]$computername,

        [string]$logfile = 'c:\retries.txt'
    )
    PROCESS {
        Foreach ($computer in $computername) {
            # do stuff here with $computer
        }
    }
}

Simply providing all of that extra metadata around the parameter, along with the [CmdletBinding()] attribute, makes PowerShell do all the heavy lifting. I’ve indicated where you’ll use the $computer variable to do whatever it is you need to do with a single computer name at a time. Every one of the five original usage examples will work fine. In fact, let’s go ahead and document those by adding them as comment-based Help. These specially formatted comments, which must appear immediately before the function keyword or just inside the function body, are interpreted by the shell. If someone runs Help Get-Info, they’ll get standard-formatted Help.

<#
.SYNOPSIS
Retrieves info from one or more computers.
.PARAMETER computername
The computer name(s) to retrieve the info from.
.PARAMETER logfile
The path and filename of a text file where failed computers will be logged. Defaults to c:\retries.txt.
.EXAMPLE
Get-ADComputer -filter * | Select @{label='computername';expression={$_.name}} | Get-Info
.EXAMPLE
Get-Info -computername SERVER2,SERVER3
.EXAMPLE
"localhost" | Get-Info
.EXAMPLE
Get-Info -computername (Get-Content names.txt)
.EXAMPLE
Get-Content names.txt | Get-Info
#>
Function Get-Info {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [string[]]$computername,

        [string]$logfile = 'c:\retries.txt'
    )
    PROCESS {
        Foreach ($computer in $computername) {
            # do stuff here with $computer
        }
    }
}

(June Blender, a member of the Windows PowerShell team and one of the folks chiefly responsible for the quality of PowerShell’s Help, buys me a drink every time I show folks how to comment their own functions like this. It documents your function and makes it more accessible to Windows PowerShell users who know how to use the built-in help system. That means comment-based Help is a good idea. Run help about_comment_based_help within the shell for more details.)
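Once the comment block is in place, users can ask the shell for that Help the same way they would for any cmdlet. For example:

Help Get-Info                 # synopsis, syntax, and parameter summary
Help Get-Info -Examples       # just the .EXAMPLE sections
Help Get-Info -Full           # everything, including parameter details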

Now all we have to do is make the function…well…do something.

Output: Pipeline

By spewing (that’s a technical term for “outputting”) data to the pipeline, you enable your function to work in an almost infinite variety of scenarios. Want CSV output? XML output? HTML? Databases? Need to filter or sort the output? No problem in the pipeline. The trick is to make sure your function is spewing objects, rather than some kind of preformatted text. In other words, Write-Host is almost always the wrong answer. If you’re using Write-Host, you’re probably doing it wrong. Instead:

  • Use Write-Debug to output debugging information, if needed. Running your function with the -Debug switch will enable this output.
  • Use Write-Verbose if you want to output step-by-step status information. Adding the -Verbose switch when running your function will enable this information (for example, Get-Info -computername localhost -Verbose).
  • Use Write-Output to write objects to the pipeline (a short sketch of all three follows this list).
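Here is a minimal sketch of those three streams side by side; Get-StatusDemo is a made-up name, and the single ComputerName property is just a placeholder. Only the object handed to Write-Output lands in the pipeline, while the other two streams appear only when the corresponding switch is used:

Function Get-StatusDemo {
    [CmdletBinding()]
    Param([string]$computername)
    Write-Verbose "Connecting to $computername"       # visible only with -Verbose
    Write-Debug "About to query $computername"        # visible only with -Debug
    $obj = New-Object -TypeName PSObject
    $obj | Add-Member -MemberType NoteProperty -Name ComputerName -Value $computername
    Write-Output $obj                                  # the only thing that hits the pipeline
}

Get-StatusDemo -computername localhost -Verbose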

There are a LOT of techniques for constructing an object to output. Here’s the technique I prefer, mainly because I find it easiest to read:

<#
.SYNOPSIS
Retrieves info from one or more computers.
.PARAMETER computername
The computer name(s) to retrieve the info from.
.PARAMETER logfile
The path and filename of a text file where failed computers will be logged. Defaults to c:\retries.txt.
.EXAMPLE
Get-ADComputer -filter * | Select @{label='computername';expression={$_.name}} | Get-Info
.EXAMPLE
Get-Info -computername SERVER2,SERVER3
.EXAMPLE
"localhost" | Get-Info
.EXAMPLE
Get-Info -computername (Get-Content names.txt)
.EXAMPLE
Get-Content names.txt | Get-Info
#>

Function Get-Info {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [string[]]$computername,

        [string]$logfile = 'c:\retries.txt'
    )
    BEGIN {
        # start each run with a fresh log file
        Remove-Item $logfile -ErrorAction SilentlyContinue
    }
    PROCESS {
        Foreach ($computer in $computername) {
            $continue = $true
            try {
                $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $computer -ErrorAction Stop
            } catch {
                $continue = $false
                # log the unreachable computer; -Append keeps earlier failures in the file
                $computer | Out-File $logfile -Append
            }
            if ($continue) {
                $bios = Get-WmiObject -Class Win32_BIOS -ComputerName $computer
                $obj = New-Object -TypeName PSObject
                $obj | Add-Member -MemberType NoteProperty -Name ComputerName -Value ($computer) -PassThru |
                       Add-Member -MemberType NoteProperty -Name OSVersion -Value ($os.caption) -PassThru |
                       Add-Member -MemberType NoteProperty -Name OSBuild -Value ($os.buildnumber) -PassThru |
                       Add-Member -MemberType NoteProperty -Name BIOSSerial -Value ($bios.serialnumber) -PassThru |
                       Add-Member -MemberType NoteProperty -Name SPVersion -Value ($os.servicepackmajorversion)
                Write-Output $obj
            }
        }
    }
}
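The Add-Member chain above is only one of the many object-construction techniques I mentioned. As a quick sketch of one alternative, using the same $computer, $os, and $bios variables, you can hand New-Object a hash table of properties instead:

# Alternative construction: pass the properties as a hash table.
# Note: in Windows PowerShell 2.0 the property order is not guaranteed
# to match the order written here.
$props = @{
    ComputerName = $computer
    OSVersion    = $os.caption
    OSBuild      = $os.buildnumber
    BIOSSerial   = $bios.serialnumber
    SPVersion    = $os.servicepackmajorversion
}
$obj = New-Object -TypeName PSObject -Property $props
Write-Output $obj

Either way, what goes down the pipeline is an object, not preformatted text.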

Technically, in the Add-Member version I could have just added -PassThru to the final Add-Member and skipped Write-Output, but I wanted to reinforce the fact that Write-Output is your friend, because it writes to the pipeline. With this technique, almost anything can follow Get-Info in the pipeline. For example, check out this awesomeness:

Get-ADComputer -filter * | Select-Object @{label='computername';expression={$_.Name}} | Get-Info | Where-Object -filterscript { $_.OSBuild -eq 7600 -and $_.SPVersion -ne 2 } | Export-CSV c:\needs-patched.csv

In other words, get me all of the computers running Windows build 7600 that don’t have Service Pack 2, and export their information to a .csv file that I can give to someone else who will be spending the weekend installing service packs. Ahem. Come Monday, I could run the same thing again and instead of outputting to a .csv file, do this:

Get-ADComputer -filter * | Select-Object @{label='computername';expression={$_.Name}} | Get-Info | Where-Object -filterscript { $_.OSBuild -eq 7600 -and $_.SPVersion -ne 2 } | ConvertTo-HTML | Out-File \\webserver\webroot\fail.html

Now I have a simple HTML page sitting on my intranet server, listing the computers that still do not have the right service pack. The trick is all in the pipeline: by not worrying about input and output inside my function, and by instead letting the pipeline handle it, I have opened up a huge amount of flexibility. How my function is used can differ from day to day, without me doing any additional work.
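Reports aren't the only consumers, either; the same function drops into quick interactive checks at the console without modification. For instance, assuming names.txt is the same list of computer names used earlier:

# Quick interactive check: sort and display at the console...
Get-Content names.txt | Get-Info | Sort-Object SPVersion, ComputerName | Format-Table -AutoSize

# ...or just count how many machines still need attention.
Get-Content names.txt | Get-Info | Where-Object { $_.SPVersion -ne 2 } | Measure-Object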

Thank you, Don, for this awesome article.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy