Easily multi-thread PowerShell commands and scripts

Update: I have recreated the same functionality in C# using Tasks. The PowerShell binary module has some added features compared with the PS version below.

Download the PowerShell binary module from the GitHub repo.

If you require the PS1 version of the script, download it here and rename the file to .ps1 to use it in the PowerShell window.

I want to share this script, which I wrote for a customer a few months back. It is special to me because it is my first script without any Exchange commands in it :-).

There are many everyday tasks in support and administration that take a long time to complete. Some common examples:

  • Collecting asset information and configuration from all the workstations
  • Collecting mailbox statistics or other details for thousands of objects, etc.

I have seen commands and scripts that were executed against a large number of objects and took more than 24 hours to complete. Scripting is all about reducing the time spent on mundane tasks, but sometimes that alone is not enough: a command can take hours simply because of the number of objects it has to process sequentially. This is where multi-threading in PowerShell comes in, and background runspaces are a better fit for the job than PowerShell jobs: runspaces are faster and more lightweight, so they can process more tasks in parallel.
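To illustrate the underlying technique (this is a minimal sketch of background runspaces, not the Invoke-All implementation itself): a runspace pool lets many script blocks run concurrently on a capped number of threads, with far less overhead than Start-Job.

```powershell
# Minimal runspace-pool sketch: run a script block against many inputs in parallel.
$pool = [runspacefactory]::CreateRunspacePool(1, 4)  # min 1, max 4 concurrent runspaces
$pool.Open()

# Queue one PowerShell instance per input item on the shared pool
$jobs = foreach ($i in 1..10) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($n) $n * $n }).AddArgument($i)
    [pscustomobject]@{ Shell = $ps; Handle = $ps.BeginInvoke() }
}

# Collect the output of each runspace as it completes, then clean up
$results = foreach ($job in $jobs) {
    $job.Shell.EndInvoke($job.Handle)
    $job.Shell.Dispose()
}
$pool.Close()
$results   # the squares of 1..10
```

Invoke-All wraps this pattern (plus batching, logging, and module loading) so you do not have to write the plumbing yourself.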

I started integrating runspaces to achieve multi-threading in my scripts, but the requirement is different for every script, and coding it each time is time-consuming. To overcome this, I decided to create a generic function that can be used with any command or script with very little modification to the actual command. Not all administrators who use PowerShell for daily administrative tasks and reporting are well versed in scripting. With all that in mind, the Invoke-All function was created; I have made it as generic, easy to use, and lightweight as possible.

The script can be downloaded from https://aka.ms/invokeall

Usage example:

#Actual command:

Get-Mailbox -database 'db1' | foreach{ Get-MailboxFolderStatistics -Identity $_.name -FolderScope "inbox" -IncludeOldestAndNewestItems } | Where-Object{$_.ItemsInFolder -gt 0} | Select Identity, Itemsinfolder, Foldersize, NewestItemReceivedDate

Same command with Invoke-all

Get-Mailbox -database 'db1' | Invoke-All{ Get-MailboxFolderStatistics -Identity $_.name -FolderScope "inbox" -IncludeOldestAndNewestItems } | Where-Object{$_.ItemsInFolder -gt 0} | Select Identity, Itemsinfolder, Foldersize, NewestItemReceivedDate

As you can see, it is that easy to use. Imagine hundreds of databases: using the Invoke-All function can be around 10x faster than the conventional sequential approach.

Note: The script block is the first parameter to the Invoke-All function, and it should contain only the first command and the parameters that need to be executed. Any filtering done with another command (such as Where-Object) must be placed outside the script block, as shown in the example above.


Key features of the script:

  1. Easy to use.
  2. When used with cmdlets or remote PowerShell, it automatically detects and loads only the required modules into the sessions.
  3. Checks if it can complete the first job without errors before scheduling all the jobs to run in parallel.
  4. By default, the script processes the jobs in batches. With batching, resource consumption stays low and proportional to the batch size, which can be adjusted as required.
  5. Ability to pause the Job scheduling in-between batches. This is very useful if too many requests are overloading the destination.
  6. Creates a runtime log by default.
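The batching behavior can be seen in a hypothetical run like the one below. The -Quiet and -BatchSize parameter names are taken from the examples in the comment thread later in this post; Test-Connection and servers.txt are illustrative stand-ins for your own command and input list.

```powershell
# Hypothetical: ping a list of servers 20 at a time with Invoke-All.
# -BatchSize controls how many jobs are scheduled per batch.
Get-Content .\servers.txt | Invoke-All -Quiet -BatchSize 20 {
    Test-Connection -ComputerName $_ -Count 1 -Quiet
}
```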

Invoke-All is an exported wrapper-like function that takes input from the pipeline (or via an explicit parameter) and processes the command concurrently using runspaces. I mainly focused on Exchange on-premises (and some Exchange Online Remote PowerShell (RPS)) commands while developing the script. Though I have tested it only with Exchange on-premises remote sessions and snap-ins, it can be used with any PowerShell cmdlet or function.

I haven’t tested it with any Set-* commands, but it is possible to use this script to modify objects in parallel. Choose the command carefully so that no contention is created when using multiple threads.

Exchange remote PowerShell uses implicit remoting, and the proxy functions created by the remote session do not accept values from the pipeline. Therefore, the script requires you to declare all of the parameters you want to use when running such commands. See the examples for how to do it.
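For example, with an implicit-remoting proxy function such as Get-MailboxStatistics, the identity has to be passed as a named parameter inside the script block (a sketch for illustration; adapt the cmdlet and property names to your session):

```powershell
# Works: the -Identity parameter is named explicitly inside the script block.
Get-Mailbox -Database 'db1' | Invoke-All { Get-MailboxStatistics -Identity $_.Name }

# May fail: relying on pipeline binding inside the script block,
# because the remoting proxy function does not accept pipeline input.
# Get-Mailbox -Database 'db1' | Invoke-All { $_ | Get-MailboxStatistics }
```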

It can handle cmdlets and functions from any module, and I have updated the script to handle external scripts as well. I plan to include support for external applications in the next version, but it can be achieved in this version with a simple wrapper script, as shown in the examples, that executes your EXE or application.
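A wrapper along these lines lets Invoke-All drive an executable today (the file name, parameter, and EXE here are hypothetical examples):

```powershell
# RunPing.ps1 -- hypothetical one-task wrapper around an EXE
param([string]$ComputerName)
ping.exe -n 1 $ComputerName
```

which could then be fanned out with something like `Get-Content .\servers.txt | Invoke-All { .\RunPing.ps1 -ComputerName $_ }`.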

Please refer to the comments in the script to learn more about the techniques it uses, or leave a comment here. I hope this helps you save some time on your long-running PowerShell commands.

Comments (17)
  1. turbomcp says:

    Thanks alot
    im a fan from ver 1.0

    1. Thank you Turbo, glad it is of use :-)

  2. jdp2010 says:

    Can you post your code again? it is missing from technet gallery….

    1. Please find the link to the script at the top of this post. Thanks.

  3. Gert Nelen says:

    I love the idea, but I can’t get it to work. I’m trying to pass to it a block of lines from a set of files, on which my own script needs to do some regex matching and writes results to a MongoDB. I’m getting an index out of bounds error. Is there a limit to the amount of objects that can be passed to the invokeAll command?

    1. It uses an array for the input objects; I have tested with more than 300k input objects and it works fine. Can you provide more details and share the log file the script creates? Thanks.

      1. Gert Nelen says:

        Sure. The output in my Powershell ISE was:

        Invoke-All : Caught Error : Index was outside the bounds of the array.
        At D:\col\psScripts\processcolToDBv3.ps1:221 char:29
        + Get-Content $FileName | Invoke-All -Quiet -BatchSize 20 {
        + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
        + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Invoke-All

        Please verify if mandatory parameters are mentioned, if Command accepts ValueFromPipeline, do not explicity mention it as a Parameter.

        And the output in the log file:

        [2019-02-20 16:48:56] Starting to Execute – Invoke-all 2.5
        [2019-02-20 16:48:56] Get-Content $FileName | Invoke-All -Quiet -BatchSize 20 {

        [2019-02-20 16:48:56] Got command preProcessLine to process
        [2019-02-20 16:48:56] Created the Proxy command preProcessLineProxy
        [2019-02-20 16:48:56] The command preProcessLine is a custom function from file D:\col\psScripts\processcolToDBv3.ps1
        [2019-02-20 16:48:56] Caught Error : Index was outside the bounds of the array.
        [2019-02-20 16:48:56] Cleaning up; Error in Process block
        [2019-02-20 16:48:56] Deleted the Proxy function preProcessLineProxy
        [2019-02-20 16:48:56] Disposed the Powershell runspace pool objects

        The command is in a foreach file loop and calls InvokeAll for each line in each file.

        Get-Content $FileName | Invoke-All -Quiet -BatchSize 20 {
            $LineNr++
            if ( $_.Length -gt $len ){ $line = preProcessLine $_ }
            else { $line = $_ }
            lineProcessor $line
        }

        Thanks for the help!
        Best Regards,

        1. Gert Nelen says:

          I’m sorry but the layout above is removed by the blog page, I’m afraid… I also tried feeding this script a directory with smaller files, but it was the same error…
          Powershell really needs this kind of functionality and not just in workflows imho. I’ve looked into converting my script(s) into workflows, but there’s quite some differences and therefore quite some work. Great job in coding and many thanks for sharing!

        2. Create a PS1 script to handle it, something like..

          param($inputline)
          $len = 20
          if ( $inputline.Length -gt $len ){ $line = preProcessLine $inputline }
          else { $line = $inputline }
          lineProcessor $line
          And then run
          Get-Content $FileName | Invoke-All -Quiet -BatchSize 20 { .\FilecreatedfromAbovelines.ps1 -inputline $_} -Modulestoload D:\col\psScripts\processcolToDBv3.ps1

          Each line from $FileName is sent to Invoke-All, and the preProcessLine and lineProcessor functions are executed by the script created above; the functions are made available to the runspace because processcolToDBv3.ps1 is imported as a module.

          1. Gert Nelen says:

            Hi, Again thanks for your help. Must I create something actually named ScriptBlock? I tried as you suggested and this is the output.
            Invoke-All : The property ‘ScriptBlock’ cannot be found on this object. Verify that the property exists.
            At D:\col\psScripts\processcolToDBv3.ps1:221 char:29
            + … $FileName | Invoke-All -BatchSize 100 {.\runProcessLine.ps1 -inputlin …
            + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            + CategoryInfo : NotSpecified: (:) [Invoke-All], PropertyNotFoundException
            + FullyQualifiedErrorId : PropertyNotFoundStrict,Invoke-All
            When I tested with ‘regular’ powershell commands, it does work… When I test the line processor in a separate file with a test line (and first importing the functions from my root file, it also works…

          2. The error is from the script that is invoked: D:\col\psScripts\processcolToDBv3.ps1, line 221.
            If your script calls other modules or functions, make sure to include them in the -Modulestoload parameter.

          3. Gert Nelen says:

            I’ll rewrite my script (centralizing all functions) a bit. When I call Invoke-All directly from the command line (with only one file), it works as expected. When it is within the original script (in a foreach $file in gci loop), it throws the error. Thanks for the help. I hope I’ll get it to work as expected…
            Is there a way to let it create the jobs asynchronously? For example create first 100 jobs, start processing them and add new jobs as they complete (now it takes minutes if I start it on a large file (because it creates 11 000 000 jobs before starting the batch processing)).

          4. As you mentioned, going through each file is the best way; please see if you can get it to work (good luck!). I am now rewriting the script in C#, which will have the functionality you mentioned: process x number of jobs and queue the remaining ones efficiently as others complete. I will post here when it is done.

          5. Gert Nelen says:

            Thanks again for the assistance and the (sharing of your) code. I ended up rewriting everything, splitting files in arrays of 500 each and got it to work with Invoke-All, but was still bothered by the overhead of creating all the jobs beforehand. Finally I ended up creating my own runspacefactory with runspaces, tweaked to run this specific code, ending up in a performance increase of 800%! Should your generic C# script be ready, I’m interested in having a look. Th@nks!

  4. Patrick_92 says:

    Hello Santhosh Sethumadhavan,

    Great script! It really enables me to do great things.

    However, please change the PowerShell version check in the script in the next update:
    Current: $host.Version.Major
    Improvement: $PSVersionTable.PSVersion.Major

    The issue with $host is that it will return version “1” if you’re connected to a remote host with PowerShell Remoting. The $PSVersionTable will display the actual PowerShell version of the remote host.
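    The difference is easy to see side by side (a sketch; run both inside a PowerShell Remoting session to reproduce the mismatch):

    ```powershell
    # $host.Version reflects the host application, which reports 1.0 inside
    # a PowerShell Remoting session; $PSVersionTable.PSVersion always
    # reflects the actual engine version.
    $host.Version.Major              # can be 1 in a remote session
    $PSVersionTable.PSVersion.Major  # real engine major version
    ```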

    1. Thank you for the suggestion Patrick. I will add it on the next version.

Comments are closed.
