Monitor an agent – but run response on a Management Server

Comments (9)

  1. Kevin Holman says:

    Adding back previous comments:

    Raphael Burri
    November 3, 2015 at 9:08 am
    Hi Kevin

    Thank you for writing this up as a reference. I have been using this now and again for the last years and it is a great enabler for complex workflow scenarios.

    However, I came to understand that it is often better to use the more specialized target class Microsoft.SystemCenter.CollectionManagementServer. That way the response will run on a “true” Management Server with database and SDK capabilities, which allows using PowerShell scripts (or modules) that require the SCOM SDK.

    Using the Microsoft.SystemCenter.ManagementServer target class will include Gateways. Those do not allow access to the SDK and/or DB, hence advanced rule actions may fail when running for agents connected through gateways.

    When using Microsoft.SystemCenter.CollectionManagementServer instead, the rule action will be executed on the MS that is currently serving the GW to which an agent is connected. More versatile in my opinion.

    The other remark when using this great rule re-targeting: one has to be careful with variables. The usual $Target$ replacement will show unexpected results when used on the rule action, because it is not the triggering agent’s target object properties that are evaluated, but the properties of the MS the action has been redirected to. If you need to know e.g. which agent the rule was triggered on, one possible workaround is:

    Include the agent’s name (or other properties) as parameters in the event you’re triggering the rule on. Then use $Data$ replacement when calling the action script, e.g. $Data/Context/DataItem/Params/Param[3]$ (getting the 3rd parameter from the collected event coming from a consolidated rule); a sketch follows below.
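
    For illustration, a minimal sketch of that pattern (not part of the original comment): a write action retargeted to the collection Management Server that receives the agent name via $Data$ replacement. The module ID, script name and parameter index are examples; Windows and SC are assumed aliases for Microsoft.Windows.Library and Microsoft.SystemCenter.Library.

    ```xml
    <!-- Hypothetical fragment: the rule's response runs on the collection MS. -->
    <WriteAction ID="Demo.RunOnMS.WA" TypeID="Windows!Microsoft.Windows.PowerShellWriteAction"
                 Target="SC!Microsoft.SystemCenter.CollectionManagementServer">
      <ScriptName>Demo-Response.ps1</ScriptName>
      <ScriptBody>
    param($AgentName)
    # $AgentName carries the value the agent placed in Param 3 of the collected event
    # ... SDK work against the local Management Server goes here ...
      </ScriptBody>
      <Parameters>
        <Parameter>
          <Name>AgentName</Name>
          <Value>$Data/Context/DataItem/Params/Param[3]$</Value>
        </Parameter>
      </Parameters>
      <TimeoutSeconds>300</TimeoutSeconds>
    </WriteAction>
    ```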

    Raphael

    1. Kevin Holman says:

      Raphael Burri
      November 3, 2015 at 8:29 pm
      Hi Kevin
      Indeed; when I need to pass anything re-usable from the agent to the MS-side action, only $Data… can be used. Everything that is contained in the built-up DataItem of a datasource is fine.

      What I wanted to emphasize is that initially I often failed because I went the seemingly more obvious way of using $Target – only to discover that those workflows would most often fail silently if a different base class was used on the datasource and action sides. When using the same base class (Microsoft.SystemCenter.HealthService and Microsoft.SystemCenter.CollectionManagementServer, for example), they ran but contained the wrong information – much easier to troubleshoot.

      Let me brew up a quick scenario:
      – Agent rule targeted at Microsoft.Windows.Computer (my DataSource)
      – I need to re-use the value of the OrganizationalUnit property for SDK-scripts run as responses (my WriteAction)

      When using $Target/Property[Type="Windows!Microsoft.Windows.Computer"]/OrganizationalUnit$ as a WriteAction parameter, the server-side response will fail quietly. The action attempts to evaluate the $Target expression on the MS, whose base class is Microsoft.SystemCenter.HealthService. That is not resolvable against my $Target expression – hence the action will not run. This is awfully difficult to troubleshoot, as it needs a deep understanding of the workflows and the DataItem passed around.

      Workaround: build a DS that puts the $Target/Property[Type="Windows!Microsoft.Windows.Computer"]/OrganizationalUnit$ value into the DataItem (e.g. System.Event.GenericDataMapper to add the value to the Params, or a PropertyBag creation script), then extract it in the action using the corresponding $Data expression; see the sketch below.
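
      For illustration, a rough sketch of that mapper workaround (not part of the original comment), assuming the rule is targeted at Microsoft.Windows.Computer and that System and Windows are aliases for System.Library and Microsoft.Windows.Library; the IDs and event number are made up.

      ```xml
      <!-- Hypothetical condition detection: copy the $Target property into the DataItem
           so the MS-side action can read it back with $Data/Params/Param[1]$. -->
      <ConditionDetection ID="Demo.MapOUToEvent.CD" TypeID="System!System.Event.GenericDataMapper">
        <EventOriginId>$MPElement$</EventOriginId>
        <PublisherId>$MPElement$</PublisherId>
        <PublisherName>DemoMP</PublisherName>
        <Channel>0</Channel>
        <LoggingComputer>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</LoggingComputer>
        <EventNumber>9999</EventNumber>
        <EventCategory>0</EventCategory>
        <EventLevel>0</EventLevel>
        <UserName/>
        <Params>
          <!-- Param 1: the agent-side value the MS action needs -->
          <Param>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/OrganizationalUnit$</Param>
        </Params>
      </ConditionDetection>
      ```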

      Bottom line: may I suggest adding a few extra words to the post that inform about the limitation around replacement and give a few hints on how to successfully use $Data instead? Plus some hints on where to investigate when things do not work out.

  2. mkmaster78 says:

    Hey Kevin, I know this is unrelated to this post, but I couldn’t find a better way to ask you. I’m semi-new to SCOM (though a long-time SCCM user) and I want to give users access to the console for their machines. I used your AD group MP fragment and created groups with their machines (which was a great help, thank you), but when I set that as their scope and give them access to some of the views (the SQL team doesn’t need the AD MP views cluttering up their screen), many of the objects are missing (the SQL team can see their computers, but the DB instances and Availability Groups don’t show, for instance). How do I create a group that contains all objects contained by their SQL systems without explicitly listing the 75+ SQL object types in the dynamic query builder?

  3. Edwio says:

    Hello Kevin. First, thanks for another great article!

    second, I have two questions regarding this post:

    1. Can I use this technique for disabling a monitor from a SCOM agent that’s in an untrusted domain (that connects to SCOM via a gateway server)? Currently I’m doing it with a recovery task that uses a remote PowerShell script to disable the required monitors.

    2. I can see that lately you have shifted from using the Windows Computer target class to the Windows Server Operating System class instead. Can you please explain why?

    1. Kevin Holman says:

      1. You should be able to… I don’t see why not.

      2. I *NEVER* target “Windows Computer” with very few exceptions. Windows Computer is a special class and there can be many unhosted instances of this class.

      I almost always target “Microsoft.Windows.Server.OperatingSystem” for generic workflows and discoveries, again, with only a few exceptions.

      I will target “Windows Server” class sometimes, IF I need a Windows Failover Cluster aware workflow, to be able to use the “IsVirtual” property in the discovery. This is incredibly rare.

      I generally like to target specific application classes for workflows, unless I need to run the workflow on a wide audience. In that case I prefer Windows Server Operating System.
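
      For illustration, a skeletal rule targeted the way described above (not part of the original reply), assuming WindowsServer aliases Microsoft.Windows.Server.Library and System aliases System.Library; all IDs and the placeholder modules are made up.

      ```xml
      <!-- Hypothetical skeleton: a generic workflow targeted at the Server OS class. -->
      <Rule ID="Demo.Generic.Rule" Enabled="true"
            Target="WindowsServer!Microsoft.Windows.Server.OperatingSystem"
            ConfirmDelivery="false" Remotable="true" Priority="Normal" DiscardLevel="100">
        <Category>Custom</Category>
        <DataSources>
          <DataSource ID="Scheduler" TypeID="System!System.SimpleScheduler">
            <IntervalSeconds>3600</IntervalSeconds>
            <SyncTime />
          </DataSource>
        </DataSources>
        <WriteActions>
          <!-- agent-side action, or one retargeted to the MS as this post describes -->
          <WriteAction ID="WA" TypeID="Windows!Microsoft.Windows.PowerShellWriteAction">
            <ScriptName>Demo.ps1</ScriptName>
            <ScriptBody>Write-Output 'placeholder'</ScriptBody>
            <TimeoutSeconds>120</TimeoutSeconds>
          </WriteAction>
        </WriteActions>
      </Rule>
      ```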

      1. Anonymous says:
        (The content was deleted per user request)
  4. OdgeUK says:

    What kind of permissions does the script need? My script doesn’t seem to be able to run the cmdlets I’ve given it to put a server into MM (the script works against a MS when run directly on the MS), but it does complete the last part of my script where it outputs to a file. I’m wondering if the script needs a RunAs account that is an admin on the MS? Or has access to the SDK?

    1. Kevin Holman says:

      This is why ALL scripts should log a “script starting” event, which outputs who the script is running under.

      https://blogs.technet.microsoft.com/kevinholman/2017/09/10/demo-scom-script-template/

      Then you know who the script is running as. By default, the scripts should be executing as the Management Server Action account when running on a MS, which should have local admin permissions.
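
      For illustration (not Kevin’s linked template, just a minimal sketch of the idea): a PowerShell write action whose first lines log the executing account to the event log via MOM.ScriptAPI. The script name and event ID are made up.

      ```xml
      <!-- Hypothetical write action: record who the script runs as before doing real work. -->
      <WriteAction ID="Demo.MM.WA" TypeID="Windows!Microsoft.Windows.PowerShellWriteAction"
                   Target="SC!Microsoft.SystemCenter.CollectionManagementServer">
        <ScriptName>Demo-MM.ps1</ScriptName>
        <ScriptBody>
      # Log a "script starting" event that records the executing account
      $api = New-Object -ComObject 'MOM.ScriptAPI'
      $user = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
      $api.LogScriptEvent('Demo-MM.ps1', 7777, 0, "Script starting. Running as: $user")
      # ... maintenance mode cmdlets would follow here ...
        </ScriptBody>
        <TimeoutSeconds>120</TimeoutSeconds>
      </WriteAction>
      ```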

  5. Hello,

    I am looking to do something similar to this: I want to have a monitor on a server that runs a script from the management server, with the result of that script reflected in the health of the monitor. Do you have any suggestions on how to proceed?

    Thank You
