I/O performance impact of running Start-DedupJob with -Priority High


My name is Steven Andress and I am a Support Escalation Engineer with Microsoft’s Platforms Support Team.  This is a short blog post to alert you to a condition you might encounter when running a deduplication job using the Start-DedupJob PowerShell cmdlet. 

Start-DedupJob
http://technet.microsoft.com/en-us/library/hh848442.aspx

Start-DedupJob [-Type] <Type> [[-Volume] <String[]> ] [-AsJob] [-CimSession <CimSession[]> ] [-Full] [-InputOutputThrottleLevel <InputOutputThrottleLevel> ] [-Memory <UInt32> ] [-Preempt] [-Priority <Priority> ] [-ReadOnly] [-StopWhenSystemBusy] [-ThrottleLimit <Int32> ] [-Timestamp <DateTime> ] [-Wait] [ <CommonParameters>]

The Start-DedupJob cmdlet starts a new data deduplication job for one or more volumes.  The Priority parameter sets the CPU and I/O priority for the job that you start by using this cmdlet.  The only way to run a deduplication job at High priority is to use this cmdlet.  When Priority is set to High, I/O for other processes using the volume may be slowed or even blocked.  If the volume is a Cluster Shared Volume (CSV), I/O to the volume from other nodes can be similarly impacted.
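As an illustration, here is how a high-priority optimization job is started (the volume letter below is an example, not from any particular system):

```powershell
# Start an optimization job at High priority on volume E:
# (illustrative volume letter).  While this job runs, other I/O
# to E: may be slowed or blocked, so reserve High priority for
# maintenance windows.
Start-DedupJob -Type Optimization -Volume "E:" -Priority High

# Get-DedupJob lists queued and running deduplication jobs,
# which lets you monitor the job's progress.
Get-DedupJob -Volume "E:"
```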

Workaround:
Do not use "-Priority High" when starting dedup jobs during production hours.  If you wish to use this switch, run the job after hours so that productivity is not affected.
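If a job must run while the server is in use, a gentler approach (sketched below with an illustrative volume letter) is to leave Priority at Normal and use the cmdlet's throttling switches so the job yields to other workloads:

```powershell
# Normal priority plus -StopWhenSystemBusy tells the job to back
# off when other workloads need the volume, and
# -InputOutputThrottleLevel Low further limits the job's I/O impact.
Start-DedupJob -Type Optimization -Volume "E:" -Priority Normal `
    -StopWhenSystemBusy -InputOutputThrottleLevel Low
```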

Regards,
Steven Andress
Senior Support Escalation Engineer
Microsoft Corporation
