Logparser play of a forensicator


My guru (I won't name him, but he knows who he is) once told me that what we do is not exactly forensics; it is actually Root Cause Analysis, finding out how a security incident happened. Once we know the root cause, we can do several things, from prevention to mitigation to recovery.

A few words about forensics. Forensics, in the true sense, is the domain in which evidence collected from a crime scene must be presentable to a court of law, and its rules of evidence and chain of custody are very strict. In IT crimes, evidence is fragile and delicate; it includes both volatile and non-volatile data, and collection is done in order of volatility, most volatile first. So when a security incident happens and the organization wants the option of going to court to punish the attackers, especially internal ones, these processes need to be followed to protect the evidence and maintain the chain of custody, preferably by a certified and experienced forensic expert. Even a minor modification of the evidence, during the investigation or otherwise, can render it useless in court and let the attackers go free. That is why the rules of forensics are so rigid.

Since we don't deal with the evidence-presentation part, we live in the domain of Root Cause Analysis, not exactly forensics. Evidence protection, with all its rigid, strict rules, is the main difference between RCA and forensics. These days the two terms are used interchangeably, but we should still know the difference.

This was later confirmed when I was studying for the CISSP. One of the things my guru encouraged me to play with was a tool he has mastered: Logparser.

Being an inspired student, I started playing with it, and like any artist who wants to showcase his little piece of art, I am sharing my little compilation of queries, the ones that help me track things down when I am digging into files, events, and logs.

I have organized the queries into groups based on focus: sections on logon events, service installation, and process creation, followed by a section on playing with the file system using NTFSInfo and USNInfo. To keep this article short I am not adding details about NTFSInfo and USNInfo here, even though they are a top focus during forensic or RCA investigations; I may cover them in a follow-up post. One more point: I give both CSV (comma-separated values) and DataGrid (grid) output variants. A CSV file can be opened in Excel for analysis, while DataGrid displays the results in Excel-style rows and columns as soon as Logparser finishes the query, without any external text-analysis tool.

Event Log Analysis

You will notice heavy use of extract_token(strings, n, '|') AS fieldname in the Logparser queries below. This is one of the most elegant, effective, and powerful techniques for extracting items from the Strings field of an event log record: n is the position of the item you want (counted from zero), and '|' is the delimiter, much as you would use cut with a delimiter on Linux. Change the value of n to pull a different item out of the Strings field; the examples below will give you the idea.
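To make the token positions concrete, here is a minimal Python sketch of what extract_token does. The sample Strings field is invented and shortened for illustration (real 4624 records carry many more tokens), but positions 5 (account) and 8 (logon type) mirror the queries used throughout this article.

```python
# Minimal sketch of Logparser's extract_token(strings, n, '|'):
# split the Strings field on the delimiter and return the n-th token
# (0-based), or '' when n is out of range.

def extract_token(strings, n, delim="|"):
    tokens = strings.split(delim)
    return tokens[n] if 0 <= n < len(tokens) else ""

# Invented, shortened Strings field; token 5 is the account and
# token 8 is the logon type, matching the queries below.
sample = "S-1-5-18|SYSTEM|NT AUTHORITY|0x3e7|S-1-5-21-111|jdoe|CORP|0x12345|3"

print(extract_token(sample, 5))  # jdoe
print(extract_token(sample, 8))  # 3  (network logon)
```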

                                                    To track Logon success and failure attempts

 

Successful logons: list all accounts that logged on successfully


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624" -o:csv >C:\data\exampleDIR\accounts_logon_success.csv


 

Failed logon attempts


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4625" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4625" -o:csv >C:\data\exampleDIR\accounts_logon_fail.csv


 

Logon type 3: Network logon / network access


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='3'" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='3'" -o:csv >C:\data\exampleDIR\accounts_logon_logontype3.csv


 

Logon type 10: RDP access


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='10'" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='10'" -o:csv >C:\data\exampleDIR\accounts_logon_logontype10.csv


Logon type 2: Interactive, i.e. logon at the console


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='2'" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='2'" -o:csv >C:\data\exampleDIR\accounts_logon_logontype2.csv


 

Logon type 5: Service logon


Output in grid

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='5'" -o:DataGrid

Output in CSV

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='5'" -o:csv >C:\data\exampleDIR\accounts_logon_logontype5.csv


 

Logon type 3: Network logon (output in grid)


With authentication method used and source computer

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype,extract_token(strings,10, '|') as auth_method,extract_token(strings, 11, '|') as source_computer FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='3'" -o:DataGrid

With source IP and source computer

logparser -i:evt "SELECT distinct extract_token(strings, 5, '|') AS account,extract_token(strings, 8, '|') AS logontype,extract_token(strings,18, '|') as source_ip,extract_token(strings, 11, '|') as computer FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 and extract_token(strings, 8, '|')='3'" -o:DataGrid


 

Number of logon attempts per account


LogParser.exe -i:evt -o:datagrid "SELECT distinct extract_token(strings, 5, '|') AS account, COUNT(*) AS hits FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4624 GROUP BY account ORDER BY hits"

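When Logparser is not at hand, the same per-account tally can be sketched in plain Python with a Counter. The account is assumed to sit at token 5 of the Strings field, as in the queries above, and the sample rows are invented for illustration.

```python
# Sketch of the GROUP BY account / COUNT(*) query in plain Python:
# tally 4624 Strings fields by the account token and sort by hits.
from collections import Counter

def logons_per_account(strings_rows):
    counts = Counter()
    for row in strings_rows:
        tokens = row.split("|")
        if len(tokens) > 5:          # account assumed at token 5, as above
            counts[tokens[5]] += 1
    return counts.most_common()      # [(account, hits), ...] descending

# Invented sample rows for illustration.
rows = [
    "x|x|x|x|x|jdoe|CORP|0x1|3",
    "x|x|x|x|x|jdoe|CORP|0x2|10",
    "x|x|x|x|x|admin|CORP|0x3|2",
]
print(logons_per_account(rows))  # [('jdoe', 2), ('admin', 1)]
```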


 

                                                                   Service installation


LogParser.exe -i:evt -o:datagrid "SELECT Timegenerated,extract_token(strings, 0, '|') AS service, extract_token(strings, 1, '|') AS exe FROM 'C:\data\exampleDIR\system.evt' WHERE eventid=7045"

   Within a time range, when you are targeting a specific timeline.

LogParser.exe -i:evt -o:datagrid "SELECT Timegenerated,extract_token(strings, 0, '|') AS service, extract_token(strings, 1, '|') AS exe FROM 'C:\data\exampleDIR\system.evt' WHERE eventid=7045 and Timegenerated >'2016-06-09 09:30:00' and Timegenerated <'2016-06-09 20:30:00'"


 

                                                                    Process Creation


LogParser.exe -i:evt -o:datagrid "SELECT distinct extract_token(strings, 5, '|') AS exe, COUNT(*) AS hits FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4688 GROUP BY exe ORDER BY hits"

Within a time range

LogParser.exe -i:evt -o:datagrid "SELECT distinct extract_token(strings, 5, '|') AS exe, COUNT(*) AS hits FROM 'C:\data\exampleDIR\security.evt' WHERE EventID = 4688 and Timegenerated >'2016-06-09 09:30:00' and Timegenerated <'2016-06-09 20:30:00' GROUP BY exe ORDER BY hits"


 

File Activity analysis


NTFSInfo and USNInfo are used to dig into file activity: when a file landed on the machine, when changes were made to certain files, and when certain files were deleted. There is a fine SANS paper on file access, creation, and modification timestamps and on modification of MFT data (the MACE or MACB reference): https://www.sans.org/reading-room/whitepapers/forensics/filesystem-timestamps-tick-36842
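As a quick aside, the classic modification/access/creation trio can be eyeballed on a live system with Python's standard library, without parsing the MFT. This is only a sketch of the standard file timestamps, not the $FILE_NAME timestamps that full MFT tools expose; note the st_ctime caveat in the comments.

```python
# Quick look at a file's MAC times with os.stat -- no MFT parsing.
# Caveat: on Windows st_ctime is the creation time, while on Linux it
# is the inode-change time, so the third value is labeled accordingly.
import datetime
import os

def mac_times(path):
    """Return modification/access/change-or-creation times as datetimes."""
    st = os.stat(path)
    to_dt = datetime.datetime.fromtimestamp
    return {
        "modified": to_dt(st.st_mtime),
        "accessed": to_dt(st.st_atime),
        "changed_or_created": to_dt(st.st_ctime),
    }

print(mac_times("."))  # MAC times of the current directory
```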

                                                                      NTFS info


 

Based on FileNameCreationDate

Logparser.exe -i:csv -o:datagrid "SELECT Filename AS Source, ParentName, File, FileNameCreationDate, LastModificationDate FROM 'C:\data\exampleDIR\ntfsinfo*.csv' WHERE (FileNameCreationDate >= TO_TIMESTAMP('2016-04-22 01:00:00', 'yyyy-MM-dd HH:mm:ss') AND FileNameCreationDate < TO_TIMESTAMP('2016-04-22 02:00:00', 'yyyy-MM-dd HH:mm:ss'))"

Based on FileNameCreationDate and LastModificationDate

Logparser.exe -i:csv -o:datagrid "SELECT Filename AS Source, ParentName, File, FileNameCreationDate, LastModificationDate FROM 'C:\data\exampleDIR\ntfsinfo*.csv' WHERE (FileNameCreationDate >= TO_TIMESTAMP('2016-04-22 01:00:00', 'yyyy-MM-dd HH:mm:ss') AND FileNameCreationDate < TO_TIMESTAMP('2016-04-22 02:00:00', 'yyyy-MM-dd HH:mm:ss')) OR (LastModificationDate >= TO_TIMESTAMP('2016-04-22 01:00:00', 'yyyy-MM-dd HH:mm:ss') AND LastModificationDate < TO_TIMESTAMP('2016-04-22 02:00:00', 'yyyy-MM-dd HH:mm:ss'))"

File search

Logparser.exe -i:csv -o:datagrid "SELECT Filename AS Source, ParentName, File, FileNameCreationDate, LastModificationDate FROM 'C:\data\exampleDIR\ntfsinfo*.csv' WHERE (ParentName Like '%malwarefileoranyfile%')"

Or

Logparser.exe -i:csv -o:datagrid "SELECT Filename AS Source, ParentName, File, FileNameCreationDate, LastModificationDate FROM 'C:\data\exampleDIR\ntfsinfo*.csv' WHERE (File Like '%malwarefileoranyfile%')"

Or

If you want to know whether PsExec was ever on that machine

Logparser.exe -i:csv -o:datagrid "SELECT Filename AS Source, ParentName, File, FileNameCreationDate, LastModificationDate FROM 'C:\data\exampleDIR\ntfsinfo*.csv' WHERE (File Like '%psexec%')"


 

                                           USN Info (refer: https://en.wikipedia.org/wiki/USN_Journal)


To see whether certain files were deleted, various conditions can be tried, e.g. the full path of the location where you want to look for deleted files.

Output in grid

LogParser "SELECT * FROM 'C:\data\exampleDIR\usninfo*.csv' where (Reason LIKE '%create%') and (FullPath LIKE '%malwarefileoranyfile%')" -o:datagrid

Output in CSV

LogParser "SELECT * FROM 'C:\data\exampleDIR\USNInfo_C_.csv' where (Reason LIKE '%delete%')" -o:csv >deleted_files.csv
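The same delete filter can be sketched in Python when Logparser is not available. The Reason and FullPath column names are assumed to match the USNInfo CSV export queried above, and the sample data is invented for illustration.

```python
# Sketch: filter a USN-journal CSV export for delete records.
# Column names 'Reason' and 'FullPath' are assumed to match the
# usninfo CSVs used in this article; adjust to your export's header.
import csv
import io

def deleted_files(fh, path_hint=""):
    """Return FullPath values whose Reason mentions a delete,
    optionally narrowed to paths containing path_hint."""
    hits = []
    for row in csv.DictReader(fh):
        reason = row.get("Reason", "").lower()
        path = row.get("FullPath", "")
        if "delete" in reason and path_hint.lower() in path.lower():
            hits.append(path)
    return hits

# Invented sample export; a real one would come from open("usninfo.csv").
sample = io.StringIO(
    "Reason,FullPath\n"
    "FILE_DELETE|CLOSE,C:\\temp\\dropper.exe\n"
    "FILE_CREATE,C:\\temp\\notes.txt\n"
)
print(deleted_files(sample))  # ['C:\\temp\\dropper.exe']
```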


I hope those of you in the field of forensics or security incident response / RCA enjoy this. Please add your own tricks and suggestions in the comments section; this could become an open, living document if you like. Thanks for reading.
