Time and again when working with customers I find that audit settings are either poorly configured or not configured at all. The funny thing is that this isn't industry dependent: some of the customers you would expect to have the best audit configurations, given the regulations and specific auditing guidelines they operate under, often have some of the worst. You tend to find workstations that either have no auditing configured at all, or have legacy auditing and Advanced Audit Policies half applied, which leaves them in a broken state: a setting they wanted is enabled in the old policy, but its equivalent is not set in the Advanced Audit Policies.

I also tend to see a lot of environments where GPO management is delegated to sub-OU managers, whether by org hierarchy or geography. This typically results in new GPOs for each sub-OU, with each manager doing their own thing for auditing, and the results vary widely with knowledge and experience. In my perfect world there would be three GPOs linked at the root of each domain controlling all auditing for all systems: one for servers, one for DCs (OK, link that one at the Domain Controllers OU), and one for workstations. Since this perfect world rarely happens, I put together a script that queries every system in a forest and dumps the current effective audit settings via auditpol, along with a number of other audit settings that I consider key but that aren't part of the normal Advanced Audit Policies.
I’ve called the set of scripts Get-AuditPol and they can be downloaded at https://github.com/kurtfalde/Get-Auditpol/
You will also need the PoshRSJob module, which you can get at https://github.com/proxb/PoshRSJob
The scripts are set up as follows:
Modify $RunspaceThreads to control how many simultaneous threads run per runspace pool. Six runspace pools are kicked off, so the effective maximum is 6 x whatever this value is.
This is hardcoded to run from C:\get-auditpol by default; change it if you are using a different directory structure. The output data will be dropped into a .\data folder.
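The tunables above look roughly like this near the top of the collector scripts (a minimal sketch; `$RunspaceThreads` is the variable the post names, while `$BasePath` and `$OutputPath` are illustrative names I'm using here, not necessarily what the repo uses):

```powershell
# Threads per runspace pool; six pools are kicked off, so up to 6 x this many jobs
$RunspaceThreads = 20

# Working directory -- change if you run from somewhere other than C:\get-auditpol
$BasePath   = 'C:\get-auditpol'
$OutputPath = Join-Path $BasePath 'data'   # aggregate .csv output lands here

# Make sure the output folder exists before the runspaces start writing
if (-not (Test-Path $OutputPath)) {
    New-Item -ItemType Directory -Path $OutputPath | Out-Null
}
```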
These 3 scripts are the script blocks for the 'AuditPol' runspaces. The runspaces use WMI (we can't rely on PSRemoting being enabled on all systems) to remotely kick off auditpol /backup on each remote system and dump the results to C:\Windows\SystemName-Auditpol.csv on that system. As I didn't know a good way to check whether this process had completed, I cheated and just threw in a sleep command to wait a few seconds after running the command. Once the wait completes, we Import-Csv the remote file's content into an object and append it to a local .csv holding the aggregate auditpol results for all systems in the runspace. I am using a mutex per runspace to marshal writes to the aggregate .csv file, which minimizes data loss by preventing "file locked for writing" errors.
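The per-machine flow described above can be sketched roughly like this (a hedged illustration, not the repo's actual script block; the function name, mutex name, and the admin-share copy-back are my assumptions):

```powershell
function Get-RemoteAuditPol {
    param(
        [string]$Computer,      # target machine
        [string]$AggregateCsv   # local aggregate output file for this runspace
    )

    # Kick off auditpol /backup remotely via WMI -- no PSRemoting required
    $remoteCsv = "C:\Windows\$Computer-Auditpol.csv"
    Invoke-WmiMethod -ComputerName $Computer -Class Win32_Process -Name Create `
        -ArgumentList "cmd.exe /c auditpol /backup /file:$remoteCsv" | Out-Null

    # No clean way to know when the remote process finishes, so just wait a bit
    Start-Sleep -Seconds 5

    # Pull the results back (here: over the admin share) and append to the
    # aggregate file, guarded by a named mutex so parallel runspaces don't
    # collide on the same .csv
    $rows  = Import-Csv "\\$Computer\c$\Windows\$Computer-Auditpol.csv"
    $mutex = New-Object System.Threading.Mutex($false, 'Get-AuditPol-Csv')
    [void]$mutex.WaitOne()
    try     { $rows | Export-Csv -Path $AggregateCsv -Append -NoTypeInformation }
    finally { $mutex.ReleaseMutex() }
}
```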
These 3 scripts are the script blocks for the various other audit settings of interest on the remote systems. This script uses remote registry calls via .NET to check the existence and values of various keys, appending the results to an aggregate .csv locally on the collector system for the runspace group. The data that we are gathering includes the following:
- Is the "force Advanced Audit Policies" registry key set
- Is command-line auditing enabled for Process Creation events – see https://technet.microsoft.com/en-us/library/dn535776.aspx for more info
- Is AppLocker logging enabled (either Enforced or in Audit mode – this is not about how AppLocker is configured, but whether it is providing forensically valuable artifacts on the system)
- Is PowerShell Script Block Logging enabled
- Is PowerShell Transcription Logging enabled
- Is the Special Groups Auditing registry key configured with anything – see https://blogs.technet.microsoft.com/askds/2008/03/11/special-groups-auditing-via-group-policy-preferences/ (oh, you didn't know you actually had to configure something extra for that audit setting?)
- Machine Name, OS version, OU location/DN
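The remote registry checks above can be sketched like this (a minimal illustration using `OpenRemoteBaseKey`; the registry paths shown are the usual policy locations for these settings as I understand them – verify them against your own baseline docs before relying on this, and note the function name is mine):

```powershell
function Get-AuditRegSettings {
    param([string]$Computer)

    # Open HKLM on the remote machine via the .NET remote registry API
    $hklm = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $Computer)

    # Helper: return a value if the subkey exists, otherwise $null
    $get = {
        param($SubKey, $Name)
        $k = $hklm.OpenSubKey($SubKey)
        if ($k) { try { $k.GetValue($Name) } finally { $k.Close() } }
    }

    try {
        [pscustomobject]@{
            ComputerName       = $Computer
            ForceAdvancedAudit = & $get 'SYSTEM\CurrentControlSet\Control\Lsa' 'SCENoApplyLegacyAuditPolicy'
            CmdLineAuditing    = & $get 'SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\Audit' 'ProcessCreationIncludeCmdLine_Enabled'
            AppLockerExeMode   = & $get 'SOFTWARE\Policies\Microsoft\Windows\SrpV2\Exe' 'EnforcementMode'
            ScriptBlockLogging = & $get 'SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging' 'EnableScriptBlockLogging'
            Transcription      = & $get 'SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription' 'EnableTranscripting'
            SpecialGroups      = & $get 'SYSTEM\CurrentControlSet\Control\Lsa\Audit' 'SpecialGroups'
        }
    }
    finally { $hklm.Close() }
}
```

Each runspace would append these objects to its aggregate .csv the same way as the auditpol results.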
For now this is all I have; however, once I can get a good data set that's not just the 4 or so machines in my lab, I plan on throwing this data into a Power BI Desktop file and creating some visualizations around it to show where variances are and how well systems conform to the recommended Microsoft Security Baselines. Let me know if this helps you at all, and if you have a dataset you can share that I could use to create visualizations, please let me know.