Build Your Own PowerShell Cmdlet: Part 7 of 9

Summary: Microsoft Windows PowerShell MVP, Sean Kearney, continues a series of guest blogs detailing how to build your own cmdlet.

Microsoft Scripting Guy, Ed Wilson, is here. Guest blogger and Windows PowerShell MVP, Sean Kearney, has written a series about building cmdlets. For more about Sean, see his previous guest blog posts.

Note This is Part 7 of a nine-part series about building your own Windows PowerShell cmdlet. Read the entire series as it unfolds.

Here’s Sean…

Begin, Process, and End blocks

So far, we haven’t discussed three script blocks available in your cmdlet. They are Begin, Process, and End.

  • The Begin block runs only once for each instance of the cmdlet. An example of script that you could place here might be the definition of static variables that will never change.

  • The Process block is more dynamic in nature: it runs once for each record in the pipeline. If you imported some data with Import-CSV and piped it to your cmdlet, this block of script would execute once for every row in the CSV.

  • The End block also runs only once for each instance of the cmdlet. It is meant for any clean-up that may need to be performed, for example, closing open files that were created by the cmdlet.
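To see when each block fires, here is a minimal throwaway function (SHOW-BLOCKS is just an illustration for this post, not part of our cmdlet):

```powershell
function global:SHOW-BLOCKS {
param(
    [Parameter(ValueFromPipeline=$true)]$Item
)
Begin { WRITE-OUTPUT "Begin runs once" }
Process { WRITE-OUTPUT "Processing record: $Item" }
End { WRITE-OUTPUT "End runs once" }
}

1,2,3 | SHOW-BLOCKS
```

Piping three values through SHOW-BLOCKS produces one Begin line, three Process lines (one per record), and one End line.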

We’ll modify our current advanced function to break its processing into these script blocks.

Because we are going to be piping in data, we are no longer dealing with a single element by default, so we must adjust how the parameters are used to reflect this. Whenever data passes through the Process block, each parameter is received as a single-element object array. To your script, this is actually a very minor change: simply index each parameter used in the Process block with [0] to get at its value.

In our current cmdlet, we would look for references to the $Folder, $Preface, and $Extension variables. Most of the actual processing of our cmdlet happens in the Process block, and this is where you need to ensure that you effect those changes on the variables.
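To make the single-element array behavior concrete, here is a small hypothetical demo (SHOW-INDEX is not part of our cmdlet) that prints what one of the parameters looks like inside the Process block:

```powershell
function global:SHOW-INDEX {
param(
    [Parameter(Mandatory=$true,
        ValueFromPipelineByPropertyName=$true)]
    [string[]]$Folder
)
Process {
    # For each pipeline record, $Folder is a one-element array;
    # [0] gets the value itself
    WRITE-OUTPUT "Count: $($Folder.Count)  Value: $($Folder[0])"
}
}

New-Object PSObject -Property @{Folder='C:\Logfiles'} | SHOW-INDEX
```

The demo reports a Count of 1 for the record, which is why indexing with [0] recovers the value.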

Our current cmdlet, modified so that its structure is broken into the appropriate script blocks and its variables reflect input from the pipeline, will look like this.

function global:ADD-LOGFILE {

[CmdletBinding(DefaultParameterSetName="Folder")]

param(
    [Parameter(Position=0,
        Mandatory=$true,
        ValueFromPipeline=$true,
        ValueFromPipelineByPropertyName=$true,
        HelpMessage='Folder to Store Logfiles in')]
    [string[]]$Folder,

    [Parameter(Position=1,
        Mandatory=$true,
        ValueFromPipelineByPropertyName=$true,
        HelpMessage='TEXT to prepend all logfiles with')]
    [string[]]$Preface,

    [Parameter(Position=2,
        Mandatory=$true,
        ValueFromPipelineByPropertyName=$true,
        HelpMessage='File Extension for Logfiles')]
    [string[]]$Extension
)

Begin {}

Process {

WRITE-DEBUG "`$Folder: $($Folder[0])"
WRITE-DEBUG "`$Preface: $($Preface[0])"
WRITE-DEBUG "`$Extension: $($Extension[0])"

# GET the Current Date for our Logfile
$Today=GET-DATE

WRITE-DEBUG "`$Today: $Today"

# Extract the Date removing the "/"
$Date=$Today.ToShortDateString().Replace("/","")

WRITE-DEBUG "`$Date: $Date"

# Extract the Time removing the ":"
$Time=$Today.ToLongTimeString().Replace(":","")

WRITE-DEBUG "`$Time: $Time"

# Build our Filename
$Logfilename=$Folder[0]+'\'+$Preface[0]+'-'+$Date+'-'+$Time+'.'+$Extension[0]

WRITE-DEBUG "`$Logfilename: $Logfilename"

# Test and ensure file does not already exist
IF (TEST-PATH -path $Logfilename)
    { WRITE-ERROR -message "Error: $Logfilename exists." -category 'WriteError'

    # If file exists, return a status of Boolean $False for Unsuccessful
    RETURN $Logfilename,$FALSE }

# Create logfile
NEW-ITEM -type File -path $Logfilename -Force | OUT-NULL

WRITE-DEBUG "$Logfilename successfully created"

# Return the Full path and filename if successful
RETURN $Logfilename,$TRUE

}

End {}

}
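Once the function is loaded, it can be called directly or fed from the pipeline. The CSV path and contents in this sketch are hypothetical; the file would need Folder, Preface, and Extension columns so the values bind by property name:

```powershell
# Direct call with a single set of values
ADD-LOGFILE -Folder 'C:\Logfiles' -Preface 'SERVER01' -Extension 'log'

# Pipeline call: the Process block runs once per row in the CSV
IMPORT-CSV C:\Scripts\Logjobs.csv | ADD-LOGFILE
```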




Thank you, Sean. You are a marathoner, and you are really going the distance on this series. Guest Blogger Week will continue tomorrow when Sean will bring us Part 8.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me or post them on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 

Comments (2)

  1. Anonymous says:

    > simply alter each passed parameter in the Process block with a [0] to reflect this.

    This means if the parameters are specified on the command line (not using the pipeline) the second etc. elements of those arrays will be ignored silently. Not good.

    The mistake is making the three parameters ($Folder, $Preface, and $Extension) arrays in the first place. For pipeline input, they don't need to be (compare Where-Object: the parameter that receives Where-Object's pipeline input, -InputObject, is /not/ an array). Because they all need to match up (what would happen if there were two prefaces but three folders and four extensions?), it is easier to avoid the arrays.

    (If using without the pipeline then the caller would need to do the looping over multiple sets of values, but that isn't much of a loss.)

  2. A Concerned Citizen says:

    Although this will likely never be edited, I have to point out that DefaultParameterSetName="Folder" does absolutely nothing, because Sean forgot to define a parameter set. DefaultParameterSetName has nothing to do with parameter variable names.

    Also, is it best practice to set the Position parameter variable to 0? Doesn’t Sean point out an example of another cmdlet that starts at one? Microsoft does point out that the position numbers don’t need to be contiguous and can also start at 0; this appears to be confirmed by the fact that the shell output translates Position 0 into Position 1. It is technically correct, so we can let that slide.

    On top of that, your code changes with no explanation (assuming it to be an honest mistake), because the Position parameter variables change from 0,1,2 the first go-around in Part 6 to 0,1,1 for the rest of the articles. Again, seems like an honest mistake.

    Other than that, I still learned a lot. I tend to learn a lot from the mistakes that others and I make.
