Resetting and running the Service Manager Data Warehouse Jobs separately

We see quite a few issues with the SM Data Warehouse Jobs not running, or with the Cube Processing Jobs not finishing. Most of these are performance problems caused by under-sized hardware on the SM Data Warehouse server, the SQL Server hosting the DW databases, and/or the SQL Server hosting SQL Analysis Services. So most of the time we find ourselves in situations where one DW Job or another is stuck: it just remains in "Running" status, but there is no Job Module error.

There are different DW Jobs which need to run, and there is an order in which they need to run: if a dependent job did not run successfully, then the next job will either fail or have nothing to do. To illustrate with an example: if the DWMaintenance Job did not finish successfully, it may block the MPSyncJob. If the MPSyncJob did not run successfully, it would block the ETL Jobs. Within the ETL Jobs, if the Extract Jobs fail for some reason, then the Transform Jobs would have no data to transform, and in turn the Load Jobs would have no data to load.

These Jobs are executed on a schedule. To get the schedule of the Jobs, open an SM PowerShell session and check the schedules using Get-SCDWJobSchedule. By the way, all of the cmdlets we will be using need to connect to the DAS service of the SM Data Warehouse server, so don't forget to add the -ComputerName parameter and pass in the name of the SM DW server. From the output of that cmdlet you will be able to see whether the schedules are enabled (they need to be, or the jobs won't run automatically) and the time and execution frequency of each job.
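For example, a quick check of the schedules could look like this (the server name "DWServer01" is just a placeholder for your own SM Data Warehouse server):

```powershell
# Placeholder server name - replace with your own SM Data Warehouse server.
# Lists each DW job schedule, whether it is enabled, and when it runs.
Get-SCDWJobSchedule -ComputerName "DWServer01"
```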

So, this is the *success* order in which the jobs need to run:

  1. DWMaintenance job
  2. MPSyncJob job
  3. Extract jobs
  4. Transform jobs
  5. Load jobs
  6. Cube Processing jobs


To see whether these are enabled and what their status is, you can use Get-SCDWJob. If any of them seems to be stuck or has the status "Failed", it would be a good idea to go ahead and check which module of the job has the issue.

Using the cmdlet Get-SCDWJobModule and passing in the job name (as the -JobName parameter) which we have taken from the job results, we can see the status of each module, or at least which module the job is stuck in, in case no module is in a failed state but the job is not progressing to the next module after a long time.
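Inspecting the modules of a suspicious job might look like this. The server and job names are only examples; the ModuleName and ModuleErrorSummary properties are the ones the reset script below relies on, and the Status column is assumed from the cmdlet's output:

```powershell
# "Transform.Common" is only an example job name taken from a Get-SCDWJob listing
Get-SCDWJobModule -ComputerName "DWServer01" -JobName "Transform.Common" |
   Select-Object ModuleName, Status, ModuleErrorSummary
```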

If there are clearly severe failures in the modules, then simply re-running the jobs won't be a solution.


However, in many situations, re-running all the jobs separately in the right order really does help. This is the most common solution when the jobs are stuck in "Running" state but none of the Job Modules have errors. The PowerShell script below resets and restarts every job separately, in the correct order. If the script encounters a failure in one of the Modules while running one of the Jobs, it stops and shows the error message together with the name of the module in which the error occurred. So it is a good starting point: it usually solves the problem by running the jobs separately in the correct order, and if there is an error that a simple reset/re-run does not solve, it shows us that error. Such an error needs more detailed analysis and usually involves checking around in the database. Please do not change anything in the database without having a support case open with Microsoft, because it can lead to an unsupported environment.


You can run this script either on the SM Data Warehouse server or on a SM Management Server; either way it will automatically load the SM PowerShell modules. If you are running the script from a SM Management Server, then you need to pass the -DWServer parameter with the name of your SM Data Warehouse server. Additionally, in some environments SQL Analysis Services runs on a different SQL Server (a very good idea, by the way!), and because the script will also force a re-process of the OLAP cube dimensions, you also need to pass the -ASServer parameter with the name of the SQL Analysis Services server. In case the default Analysis Services database name (DWASDatabase) has been changed, you should also pass the actual name to the -ASDBName parameter.
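As an illustration, assuming you saved the script as Reset-DWJobs.ps1 (the file name and all server names here are hypothetical), a full invocation from a SM Management Server could look like:

```powershell
# All names below are placeholders - substitute your own servers and AS database name
.\Reset-DWJobs.ps1 -DWServer "DWServer01" -ASServer "ASServer01" -ASDBName "DWASDatabase"
```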

Please also keep in mind that I have not included any timeout in the script. Essentially, if one of the DW Jobs gets stuck while the script is running it, the script would never finish, because it checks once every 10 seconds whether the current Job is still in "Running" state and only continues once that status changes. There are environments where the DW Jobs do take a lot of time to finish. However, if you see that the script has been executing one job for more than 3-4 hours, it may be a good idea to check what is going on, because this is an indication that the job might be stuck. You could also just let the script run longer if you think the job legitimately needs more time; there is no harm in doing so. Also keep the OLAP cube re-processing in mind: it can take a LOT of time, because, just to be safe, we are doing full processing.

NOTE: It is very important to know that if the script throws an error for any of the DW Jobs, or if it seems to be running forever (several hours) and you stop it manually, it will leave the DW Jobs and DW Job Schedules disabled. Make sure you enable all the Jobs and Job Schedules afterwards using Enable-SCDWJob and Enable-SCDWJobSchedule. The script purposely leaves these disabled because it is up to you (depending on the error/situation) to decide how you want to proceed next and whether you need the Jobs and/or Job Schedules disabled or not.
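Re-enabling everything can be sketched like this (again, "DWServer01" is a placeholder for your SM Data Warehouse server):

```powershell
# Re-enable every job schedule and every job after a manual stop or an error
$dw = "DWServer01"  # placeholder - your SM Data Warehouse server
Get-SCDWJobSchedule -ComputerName $dw | ForEach-Object {
   Enable-SCDWJobSchedule -ComputerName $dw -JobName $_.Name
}
Get-SCDWJob -ComputerName $dw | ForEach-Object {
   Enable-SCDWJob -ComputerName $dw -JobName $_.Name
}
```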


Param(
   [Parameter(Mandatory = $False)]
   [String] $DWServer = "localhost",

   [Parameter(Mandatory = $False)]
   [String] $ASServer = "localhost",

   [Parameter(Mandatory = $False)]
   [String] $ASDBName = "DWASDatabase",

   [Parameter(Mandatory = $False)]
   [int] $Wait = 10
)

# Load the SM Data Warehouse cmdlets from the Service Manager install directory
$props = Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\System Center\2010\Common\Setup"
$instdir = $props.InstallDirectory
$dwpsd = $instdir + "Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1"
Import-Module -Name $dwpsd

$JSList = Get-SCDWJobSchedule -ComputerName $DWServer;
$JList = Get-SCDWJob -ComputerName $DWServer;

# Enables and starts a single DW job, then waits until it is no longer "Running".
# If any module reports an error, print the error and stop the whole script.
function Run-DWJob([String]$DWServer, [String]$JobName) {
   Write-Host "Enabling and running Job:" $JobName;
   Enable-SCDWJob -ComputerName $DWServer -JobName $JobName;
   Start-SCDWJob -ComputerName $DWServer -JobName $JobName;
   $currentJobStatus = Get-SCDWJob -JobName $JobName -ComputerName $DWServer | Select Status
   while($currentJobStatus.Status -eq "Running") {
      Start-Sleep -s $Wait
      $currentJobStatus = Get-SCDWJob -JobName $JobName -ComputerName $DWServer | Select Status
      $moduleList = Get-SCDWJobModule -JobName $JobName -ComputerName $DWServer
      foreach($obj in $moduleList) {
         if([String]::IsNullOrEmpty($obj.ModuleErrorSummary) -ne $true) {
            Write-Host "There is no need to wait anymore for Job" $JobName "because there is an error in module" $obj.ModuleName "and the error is:" $obj.ModuleErrorSummary;
            exit;
         }
      }
   }
   # A job that finished successfully goes back to "Not Started";
   # anything else at this point means the job did not complete cleanly
   if($currentJobStatus.Status -ne "Not Started") {
      Write-Host "There is an error with" $JobName "and we will exit this - please inspect the status";
      exit;
   }
}

# Disable all job schedules so nothing kicks off automatically while we reset
foreach($obj in $JSList) {
   Write-Host "Disabling Schedule for Job: " $obj.Name;
   Disable-SCDWJobSchedule -ComputerName $DWServer -JobName $obj.Name;
}

# Stop and disable every job so we can re-run them one by one
foreach($obj in $JList) {
   Write-Host "Stopping and disabling Job: " $obj.Name;
   Stop-SCDWJob -ComputerName $DWServer -JobName $obj.Name;
   Start-Sleep -s $Wait
   Disable-SCDWJob -ComputerName $DWServer -JobName $obj.Name;
}

# Sort the jobs into buckets matching the required execution order
$maintenanceList = New-Object System.Collections.ArrayList;
$MPSyncList = New-Object System.Collections.ArrayList;
$extractList = New-Object System.Collections.ArrayList;
$transformList = New-Object System.Collections.ArrayList;
$loadList = New-Object System.Collections.ArrayList;
$cubeList = New-Object System.Collections.ArrayList;
foreach($obj in $JList) {
   if($obj.Name -match "Extract") {
      $extractList.Add($obj.Name) | Out-Null;
   } elseif($obj.Name -match "Transform") {
      $transformList.Add($obj.Name) | Out-Null;
   } elseif($obj.Name -match "Load") {
      $loadList.Add($obj.Name) | Out-Null;
   } elseif($obj.Name -match "Maintenance") {
      $maintenanceList.Add($obj.Name) | Out-Null;
   } elseif($obj.Name -match "MPSync") {
      $MPSyncList.Add($obj.Name) | Out-Null;
   } else {
      $cubeList.Add($obj.Name) | Out-Null;
   }
}

# Run the jobs separately in the correct order:
# DWMaintenance -> MPSyncJob -> Extract -> Transform -> Load -> Cube Processing
foreach($obj in $maintenanceList) {
   Run-DWJob $DWServer $obj;
}
foreach($obj in $MPSyncList) {
   Run-DWJob $DWServer $obj;
}
foreach($obj in $extractList) {
   Run-DWJob $DWServer $obj;
}
foreach($obj in $transformList) {
   Run-DWJob $DWServer $obj;
}
foreach($obj in $loadList) {
   Run-DWJob $DWServer $obj;
}
foreach($obj in $cubeList) {
   Run-DWJob $DWServer $obj;
}

# Connect to SQL Analysis Services via AMO and fully re-process each dimension
# of the DW Analysis Services database
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null;
$Server = New-Object Microsoft.AnalysisServices.Server;
$Server.Connect($ASServer);
$Databases = $Server.Databases;
$DWASDB = $Databases[$ASDBName];
$Dimensions = $DWASDB.Dimensions;
foreach($dim in $Dimensions) {
   Write-Host "Processing Dimension" $dim.Name;
   $dim.Process("ProcessFull");
   Start-Sleep -s $Wait;
}

# Re-enable the job schedules (the jobs themselves were re-enabled by Run-DWJob)
foreach($obj in $JSList) {
   Write-Host "Enabling Schedule for Job: " $obj.Name;
   Enable-SCDWJobSchedule -ComputerName $DWServer -JobName $obj.Name;
}

Write-Host "";
Write-Host "FINISHED!";


