Friday, 29 July 2011

Sorting Windows events by UserID

Sorting Windows events by UserID is a critical piece of auditing. In the code and examples below, I concentrate on:


$Logs="System","Application","Microsoft-Windows-GroupPolicy/Operational"


purposefully leaving out the Security log. We can create a simple function that checks all event logs on any machine, sorted by file size:

function CheckEventLogsBySize
{
    get-winevent -listlog * | Sort -desc FileSize |
        ft -auto LogName,@{Label="FileLogSize(MB)"; Expression={$_.FileSize/1MB}},@{Label="MaxLogSize(MB)"; Expression={$_.MaximumSizeInBytes/1MB}},LastWriteTime,IsLogFull
}


This gives us a clear view of each log's file size, maximum size, and last write time:


We can also sort by 'LastWriteTime'.



function CheckEventLogsByLastWrite
{
    get-winevent -listlog * | Sort -desc LastWriteTime |
        ft -auto LogName,LastWriteTime,@{Label="FileLogSize(MB)"; Expression={$_.FileSize/1MB}},@{Label="MaxLogSize(MB)"; Expression={$_.MaximumSizeInBytes/1MB}},IsLogFull
}




We can also build a '-filterhashtable' query that searches multiple logs for event instances with an associated UserID.  N.B.: '-ea 0' is shorthand for '-ErrorAction SilentlyContinue' (or setting '$ErrorActionPreference = "SilentlyContinue"').  '-Max 10000' limits the data collection to the most recent 10,000 events.



$Logs="System","Application","Microsoft-Windows-GroupPolicy/Operational"
$a=get-winevent -ea 0 -filterhashtable @{Logname=@($Logs)} -Max 10000

$host.UI.RawUI.BufferSize = new-object System.Management.Automation.Host.Size(500,1000)
$a | where {$_.UserID} | ft TimeCreated,LogName,RecordID,ID,UserID,Message -auto | more






Following on from the above, we can select all unique UserIDs:



$b=$a | Select TimeCreated,LogName,RecordID,ID,UserID,Message | where {$_.UserID}
$c=$b | Select -unique UserID
$c | more

UserId
------
S-1-5-19
S-1-5-18
S-1-5-21-3270194588-3457409491-1468880025-1004
S-1-5-20


Next we create a list of the UserIDs we wish to search for:

$UserIDs="S-1-5-18","S-1-5-19","S-1-5-20"
$a | where {($_.UserID -eq $UserIDs[0]) -or ($_.UserID -eq $UserIDs[1]) -or ($_.UserID -eq $UserIDs[2])} |ft TimeCreated,LogName,ID,RecordID,UserID,Message -auto | more
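As a shorter alternative to chaining '-or' clauses, the '-contains' operator can test membership in the $UserIDs array directly. This is a minimal sketch, relying on PowerShell coercing the event's UserID (a SecurityIdentifier object) to a string for comparison:

```powershell
# Same filter, using -contains instead of chained -or clauses
$UserIDs="S-1-5-18","S-1-5-19","S-1-5-20"
$a | where {$UserIDs -contains $_.UserID} | ft TimeCreated,LogName,ID,RecordID,UserID,Message -auto | more
```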




We can also create a function (as below) to accomplish the same output:


function UserID-filter
{
$Logs="System","Application","Microsoft-Windows-GroupPolicy/Operational"
$UserIDs="S-1-5-18","S-1-5-19","S-1-5-20"
$Global:Query=get-winevent -ea 0 -filterhashtable @{Logname=@($Logs)} -max 10000
$Query | Select TimeCreated,LogName,ID,RecordID,UserID,Message | where {($_.UserID -eq $UserIDs[0]) -or ($_.UserID -eq $UserIDs[1]) -or ($_.UserID -eq $UserIDs[2])}
}
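A parameterized variant makes the same filter reusable for other logs and SIDs. This is a sketch of my own; the function and parameter names are not from the original code:

```powershell
# Hypothetical reusable version of UserID-filter: logs, SIDs, and the
# event cap become parameters instead of hard-coded values.
function Get-EventsByUserID
{
    param([string[]]$Logs, [string[]]$UserIDs, [int]$Max = 10000)
    Get-WinEvent -ea 0 -FilterHashtable @{Logname=@($Logs)} -Max $Max |
        where {$UserIDs -contains $_.UserID} |
        Select TimeCreated,LogName,ID,RecordID,UserID,Message
}

Get-EventsByUserID -Logs "System","Application" -UserIDs "S-1-5-18","S-1-5-19","S-1-5-20"
```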




Thursday, 28 July 2011

Review of Metasploit: The Penetration Tester's Guide Posted

Amazon.com just posted my four star review of Metasploit: The Penetration Tester's Guide by David Kennedy, Jim O’Gorman, Devon Kearns, and Mati Aharoni. From the review:

Metasploit: The Penetration Tester's Guide (MTPTG), is a great book about the Metasploit Framework. I first tried MSF in April 2004 (noted in one of my blog posts) and have since used it to test detection mechanisms, as well as simulate activity by certain threat groups. I've read MSF coverage in a few other books, but MTPTG really outdoes the competition. While I see areas for improvement to be addressed in a second edition, if you have any interest in Metasploit you should check out this book.

Review of Hacking: The Art of Exploitation, 2nd Ed Posted

Amazon.com just posted my five star review of Hacking: The Art of Exploitation, 2nd Ed by Jon Erickson. From the review:

This is the last in a recent collection of reviews on "hacking" books. Jon Erickson's Hacking, 2nd Ed (H2E) is one of the most remarkable books in the group I just read. H2E is in some senses amazing because the author takes the reader on a journey through programming, exploitation, shellcode, and so forth, yet helps the reader climb each mountain. While the material is sufficiently technical to scare some readers away, those that remain will definitely learn more about the craft.

Review of Gray Hat Hacking, 3rd Ed Posted

Amazon.com just posted my three star review of Gray Hat Hacking, 3rd Ed by Allen Harper, Shon Harris, Jonathan Ness, Chris Eagle, Gideon Lenkey, and Terron Williams. From the review:

Critical reviews are my least favorite aspect of my Amazon experience, but I believe readers expect me to be honest with them. Gray Hat Hacking, 3rd Ed (GHH3E) has a lot of potential, but it needs a reboot and a ruthless editor. I read and reviewed the original edition 6 1/2 years ago but skipped the 2nd Ed. This 3rd Ed (published in Jan 2011) features several exceptionally talented authors (such as Allen Harper and Chris Eagle), so my expectations remained high. Unfortunately, after finishing the book I had collected a pile of notes that I will try to transform into constructive commentary for a 4th Ed, which I would enjoy seeing!


Review of Ninja Hacking Posted

Amazon.com just posted my four star review of Ninja Hacking by Thomas Wilhelm and Jason Andress. From the review:

Ninja Hacking is not a typical digital security book. When I saw the title I expected the use of "Ninja" to be a reference to a style of digital attack. While this is true to a certain extent, Ninja Hacking is about actual Ninja concepts applied to the digital world. The book is an introduction to Ninja history and techniques, applied to the modern digital security context. That was not at all what I expected, but I found the result intriguing.


Review of Managed Code Rootkits Posted

Amazon.com just posted my five star review of Managed Code Rootkits by Erez Matula. From the review:

Managed Code Rootkits (MCR) is one of the best books I've read in 2011. MCR is a one-man tour-de-force through the world of malicious software that leverages managed code for its runtime. Prior to reading the book I was only vaguely aware of the concept and implementation. After reading MCR, I am wondering when we might see more of this technique in the wild. Author Erez Metula does almost everything right in MCR, and I strongly recommend reading it.

Review of Buffer Overflow Attacks Posted

Amazon.com just posted my two star review of Buffer Overflow Attacks, by James C. Foster, et al. From the review:

I read "Buffer Overflow Attacks" as part of a collection of books on writing exploit code (reviewed separately). I have to give credit to the author team for writing one of the first books on this subject; Syngress published BOA in 2005, when the subject received less published coverage. However, better books are available now if you want to learn the sort of material found in BOA.

Risk Modeling, not "Threat Modeling"


Thanks to the great new book Metasploit (review pending), I learned of the Penetration Testing Execution Standard. According to the site, "It is a new standard designed to provide both businesses and security service providers with a common language and scope for performing penetration testing (i.e. security evaluations)." I think this project has a lot of promise given the people involved.

I wanted to provide one comment through my blog, since this topic is one I've covered previously. One of the goals of the standard is to name and explain the steps performed in a penetration test. One of them is currently called "threat modeling," and is partly explained using this diagram:



When I saw elements called "business assets," "threat agents," "business process," and so on, I realized this is more of a risk model, not just a "threat model."

I just tagged a few older posts as discussing threat model vs risk model linguistics, so they might help explain my thinking. This issue isn't life or death, but I think it would be more accurate to call this part of the PTES "Risk Modeling."

Wednesday, 27 July 2011

Noah Shachtman’s Pirates of the ISPs

Two posts in one day? I'm on fire! It's easy to blog when something interesting happens, and I can talk about it.

I wanted to mention the publication of Pirates of the ISPs: Tactics for Turning Online Crooks Into International Pariahs by Noah Shachtman, acting in his capacity as a Nonresident Fellow for Foreign Policy in the 21st Century Defense Initiative at The Brookings Institution. I read and commented on an earlier draft, and I think you will find Noah's paper interesting. From the introduction:

Cybercrime today seems like a nearly insoluble problem, much like piracy was centuries ago. There are steps, however, that can be taken to curb cybercrime’s growth—and perhaps begin to marginalize the people behind it.

Some of the methods used to sideline piracy provide a useful, if incomplete, template for how to get it done. Shutting down the markets for stolen treasure cut off the pirates’ financial lifeblood; similar pushes could be made against the companies that support online criminals.

Piracy was eventually brought to heel when nations took responsibility for what went on within its borders. Based on this precedent, cybercrime will only begin to be curbed when greater authority—and accountability—is exercised over the networks that form the sea on which these modern pirates sail.


I agree with this. My original comments to Noah emphasized that not all malicious activity on the Internet is crime, nor is it conducted by criminals. For example, I wince whenever I see the term APT in the same sentence as crime or criminals (never mind seeing the "cyber" prefix). As long as you keep Noah's emphasis on true crime in mind while you read the paper, I think you will find it compelling. Great work Noah!

SQL Injection Challenge and Time-Based Security

Thanks to this Tweet by @ryancbarnett, I learned of the lessons learned from the Level II component of the ModSecurity SQL Injection Challenge.

As stated on the challenge site, the goal is "To successful execute SQLi against the scanning vendor demo websites and to try and evade the OWASP ModSecurity CRS." The contestants need to identify a SQL injection vector within one of four demo websites, then enumerate certain information from the target.

As also stated on the challenge page, "Winners of this level will be anyone who is able to enumerate the data listed above for each demo app without triggering an Inbound ModSecurity Alert. If ModSecurity sees any inbound attacks or outbound application defects/info leakages, it will prepend a warning banner to the top of the page."

This is interesting, but what caught my attention is the time-based security metrics describing the results of Level II of the challenge. I'll reproduce the relevant section here:

Hacking Resistance (Time-to-Hack)

Many people wrongly assume that installing a Web Application Firewall will make their sites "Hack Proof." Sadly, this is not reality. The real goal of using a web application firewall should be to gain visibility and to make your web applications more difficult to hack meaning that it should take attackers significantly more time to hack a vulnerable web site with a WAF in front in blocking mode vs. if the WAF was not present at all.

The idea is to substantially increase the "Time-to-Hack" metric associated with compromising a site in order to allow for operational security to identify the threat and take appropriate actions...

With this in mind, we analyzed how long it took for each Level II winner to develop a working evasion for the CRS v2.2.0. We are basing this off of the correlated IP address in the logs that was tied to the final evasion payloads submitted to the ModSecurity team. We also saw that many Level II winners actually tested their payloads using the CRS Demo page so we had to correlate test payloads there as well.

Avg. # of Requests to find an evasion: 433
Avg. Duration (Time to find an evasion): 72 hrs
Shortest # of Requests to find an evasion: 118
Shortest Duration (Time to find an evasion): 10 hrs

This data shows that having active monitoring and response capabilities of ongoing web attacks is paramount as it may be only a matter of hours before a determined attacker finds a way through your defenses.

I [Ed: Ryan, not Richard] realize that there are a multitude of variables and conditions involved where people can say that these numbers are off (either too high or too low) depending on your defenses and attacker skill level. Keep in mind that this metric was obtained from the ModSecurity WAF using mainly a negative security model ruleset. The point of presenting this data, however, is to have some form of metric available for active web application monitoring and defense discussions related to exploitation timelines.


What a great use of empirical data to make a point about security! Like Ryan says, you can argue about the rating of the intruder (does 10 hours really reflect a skilled intruder?) or the defenses (is ModSecurity really sufficient?). I'd answer that those aspects of the challenge are sound enough to use as benchmarks for a certain portion of the threat community and the state-of-the-practice for defenses.

Ten hours, then, represents the window of time between when an intruder would first start trying to compromise the Web app, and when he succeeded. That means the IR team has no more than 10 hours to detect the activity and take action to close the window of vulnerability. That's a tall order, but we have a metric now based on more than hand-waving that we can use to start a discussion of capabilities.

On a related note, this is the sort of activity that a red team could undertake to simulate threat action and identify IR team effectiveness.

Tuesday, 26 July 2011

The Difference Between RAW and JPG - A Visual Guide

RAW Image Edited in Darktable
Ever since I started getting more into photography, I have been looking for ways to improve my skills. I am not aspiring to become a professional photographer, but I believe that I can make simple changes to the way I take pictures that will improve the overall quality of my photos.


One of the improvements I made a while ago was shooting everything in the camera's RAW format. I have noticed that there are some pretty big drawbacks to shooting in RAW. Namely, it takes much more memory, it takes longer to edit, and the camera takes longer to process and save the images to my SD card. With all these drawbacks, I decided to do an experiment to convince myself that shooting in RAW is worth the extra effort.



Make your own low (or zero) calorie soda pop



I have always been a huge fan of making homemade root beer. It is super fun and easy, but more importantly, it is super delicious.

The concept for making root beer is this: buy root beer concentrate, mix it with sugar and water, then add dry ice for carbonation. Easy.


So one day while I was making root beer, I decided to try an experiment. I started putting dry ice in other drinks and found that it was amazing!

Want to make zero calorie soda? Find some zero calorie Crystal Light and add dry ice to it. Want to make an amazing fruit drink? Add dry ice to it. Want to make your favorite Kool-Aid flavor into soda? Add dry ice to it.
I think you get the point. Making soda out of different flavors of drink is fun and delicious!

More Ideas for Tutorials

Well, I have been slacking off. I have been doing fewer tutorials because I have less time, but mostly because I have run out of ideas for tutorials.

This is where I need your help! I have created a page for anyone to suggest a tutorial. I am open to any ideas. If you have some open source software you want me to try out to do a review or tutorial, let me know. Any idea would be great! If you have a specific picture or idea you want me to try out you can send me an email with the picture attachment (my Gmail name is mckayc).

Suggest a tutorial for Tutorial Geek.

Saturday, 16 July 2011

Best Video Screen Casting/capturing Software for Linux

Recently I have been doing more screencasts for my blog. Unfortunately, in Linux there are not a whole lot of programs available. I will discuss a few of the more popular programs with their pros and cons.


Friday, 15 July 2011

Lightroom vs Darktable - RAW photo editing



Recently I have been doing a lot more photography, shooting in RAW. Because of this, I have been doing a lot less editing in the Gimp and a lot more editing in Darktable. Recently I did a short review/tutorial on Darktable. I have had a few people ask me to compare or show some of the differences between Lightroom and Darktable. This will be a quick overview of some of the differences and similarities between the two programs.


Thursday, 14 July 2011

Get-Winevent Part III: Querying the Event Log for Logons (Part E)

In Part A of this series ('Get-Winevent Part III Querying the Event Log for Logons'), I worked with the 'where-object' cmdlet to filter through properties of specific logon event types. In Part B, I used '-filterhashtable' and 'findstr' to more quickly dig into the message field of logon events, ultimately producing a spreadsheet or database format of those events. In Part C, I presented code that enumerates all provider types for these events. Then I used '-filterhashtable' with an array of multiple security EventIDs whose select 'Message' fields I searched with 'findstr' for specific properties relating to logons. In Part D, I pulled this all together, creating a timeline of multiple security EventIDs whose select 'Message' fields I pumped into a spreadsheet for further analysis. In Part E (below), I tie in additional auditing events, specifically connections permitted by the Windows Filtering Platform:


'Auditpol' allows the administrator to add additional events to be collected by the Event Viewer. To see all potential categories:


auditpol /get /category:* 


[partial  list:]
System audit policy
Category/Subcategory Setting
System
   Security System Extension Success
   System Integrity Success and Failure
   IPsec Driver Success
   Other System Events Success and Failure
   Security State Change Success
Logon/Logoff
   Logon Success and Failure
   Logoff Success
 ...
A quick trick to set all categories and their subcategories for auditing:


auditpol /set /category:*
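Enabling every category generates a large volume of events. If the goal is only the Windows Filtering Platform connection audit (event 5156, below), a narrower setting should suffice; this is a sketch, with the subcategory name taken from 'auditpol /get /category:*' output:

```powershell
# Enable only the subcategory that produces event 5156
auditpol /set /subcategory:"Filtering Platform Connection" /success:enable /failure:enable
```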


After some time, we query the Security log and notice event 5156, a candidate for further monitoring:


get-winevent Security -max 100 | ft -auto -wrap | more


7/14/2011 6:59:55 PM Microsoft-Windows-Security-Auditing 5156 The Windows Filtering Platform has permitted a connection.
Application Information:
    Process ID: 3588
    Application Name: \device\harddiskvolume3\program files (x86)\opera\opera.exe
Network Information:
    Direction: Outbound
    Source Address: 192.168.0.11
    Source Port: 51199
    Destination Address: 199.59.149.243
    Destination Port: 80
    Protocol: 6
...
$5156=get-winevent -filterhashtable @{logname='security';id=5156} -max 1000
foreach ($event in $5156) {($event | Select TimeCreated,Message | fl * | findstr /G:Search5156.lst) -replace "  ","" -join "," | out-file -append 5156.csv}


where Search5156.lst:
TimeCreated
Source
Destination
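As an alternative to scraping the Message text with 'findstr', each event's XML payload can be parsed directly. This sketch is my own (not part of the original workflow); the field names SourceAddress, SourcePort, DestAddress, and DestPort come from the 5156 event schema:

```powershell
# Parse the 5156 EventData XML instead of scraping the Message text
$5156=get-winevent -filterhashtable @{logname='security';id=5156} -Max 1000
foreach ($event in $5156) {
    $xml=[xml]$event.ToXml()
    $data=@{}
    # Build a name->value table from the event's <EventData><Data> nodes
    foreach ($d in $xml.Event.EventData.Data) {$data[$d.Name]=$d.'#text'}
    "{0},{1}:{2},{3}:{4}" -f $event.TimeCreated,$data['SourceAddress'],$data['SourcePort'],$data['DestAddress'],$data['DestPort'] |
        out-file -append 5156_xml.csv
}
```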

Now we can add some headers and create some filters in our spreadsheet:



Get-WinEvent, EventLogs, ETL, Providers on Win7 Part III

Microsoft has exposed a substantial number of event providers since XP. With Windows 7, Microsoft has increased the number of providers substantially over previous versions of Windows and added 'netsh trace' functionality to enable tracing, conversion, and batching of these kernel-level counters. In the commands below, I mix cmd shell, PowerShell, and Cygwin commands to parse ETL files. In general, parsing ETL files with 'get-winevent' and PowerShell takes a while... You can best understand 'netsh' filtering with 'netsh trace show CaptureFilterHelp'; however, I recommend setting your 'netsh trace start maxSize=' parameter to 150 MB or less. (The default is an almost unworkable 250 MB.)


From cmd.exe, a variable for the date/time (i.e., a timestamp) can be useful:

realtd.cmd
@echo off
set realdate=%date:/=.%
set realdate=%realdate:* =%
set realtime=%time::=.%
set realtime=%realtime:* =%
set timestamp=%realdate%_%realtime%
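The same timestamp can be produced in one line from PowerShell; this is a sketch using 'Get-Date -Format', and the format string is my own choice:

```powershell
# One-line PowerShell equivalent of realtd.cmd's %timestamp%
$timestamp = Get-Date -Format "MM.dd.yyyy_HH.mm.ss"
```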


From cmd.exe we can start the trace:

netsh trace start provider=Microsoft-Windows-Kernel-Network provider=Microsoft-Windows-Kernel-Process provider=Microsoft-Windows-Security-Auditing provider=Microsoft-Windows-Security-Netlogon provider=Microsoft-Windows-TCPIP persistent=yes traceFile=%LOCALAPPDATA%\Temp\NetTraces\NetTrace%timestamp%.etl

and stop the trace:

netsh trace stop

If we choose, we can convert the trace with 'netsh', dumping it to a text or CSV file:

netsh trace convert input=NetTrace07.07.2011_1.38.09.40.etl output=NetTrace07.07.2011_1.38.09.40.txt dump=TXT

Next we can try parsing a particular provider from PowerShell. Here I choose the "Microsoft-Windows-TCPIP" provider. I adjust the screen buffer size to help 'format-table' catch the full trace line:

$host.UI.RawUI.BufferSize = new-object System.Management.Automation.Host.Size(500,1000)
$Providers="Microsoft-Windows-TCPIP"
$FileName="NetTrace07.07.2011_1.38.09.40.etl"
foreach ($ProviderName in $Providers) {get-winevent -path "$FileName" -oldest | where {$_.ProviderName -eq "$ProviderName"} | ft TimeCreated, Message| out-file -encoding ASCII -file "$FileName$ProviderName.txt"}

I find I cannot make GNUWin32 gawk work as advertised inside PowerShell.
The following line does not work in PowerShell:

## grep -i -w "remote" "$FileName$ProviderName.txt" | tr -s ' ' | gawk '{print $1" "$2" "$3","$4" "$5" "$6" "$7" "$8" "$9}' | out-file -encoding ASCII -append "$ProviderName.csv"

But this will work just fine in Cygwin:

grep -i -w "remote" NetTrace07.07.2011_1.38.09.40.etlMicrosoft-Windows-TCPIP.txt | tr -s ' ' | gawk '{print $1" "$2" "$3","$4" "$5" "$6" "$7" "$8" "$9}' >> NetTrace07.07.2011_1.38.09.40.etlMicrosoft-Windows-TCPIP.csv
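Alternatively, the grep/tr/gawk pipeline can be approximated in native PowerShell with 'Select-String' and '-split'. This is a sketch; the field positions mirror the gawk script above:

```powershell
# Native PowerShell approximation of: grep -i -w "remote" | tr -s ' ' | gawk '{print ...}'
Select-String "\bremote\b" "NetTrace07.07.2011_1.38.09.40.etlMicrosoft-Windows-TCPIP.txt" |
    foreach {
        $f = $_.Line.Trim() -split '\s+'                 # squeeze whitespace (tr -s ' ')
        "$($f[0..2] -join ' '),$($f[3..8] -join ' ')"    # fields 1-3, comma, fields 4-9
    } | out-file -encoding ASCII -append "Microsoft-Windows-TCPIP.csv"
```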

and we are looking at a spreadsheet like this:

Tuesday, 05 July 2011

Get-Winevent Part III: Querying the Event Log for Logons (Part D)

In Part A of this series ('Get-Winevent Part III Querying the Event Log for Logons'), I worked with the 'where-object' cmdlet to filter through properties of specific logon event types. In Part B, I used '-filterhashtable' and 'findstr' to more quickly dig into the message field of logon events, ultimately producing a spreadsheet or database format of those events. In Part C, I presented code that enumerates all provider types for these events. Then I used '-filterhashtable' with an array of multiple security EventIDs whose select 'Message' fields I searched with 'findstr' for specific properties relating to logons. In this post (Part D), I pull this all together, creating a timeline of multiple security EventIDs whose select 'Message' fields I pump into a spreadsheet for further analysis.


Here I get the desired 'logon' events into spreadsheet format:

$EventLogonIDs="4611","4624","4625","4634","4647","4648","4672","4774","4775","4908","4964"
$MultipleIDLogEntries=Get-WinEvent -FilterHashtable @{Logname='security';Id=@($EventLogonIDs)}
foreach ($item in $MultipleIDLogEntries) {($item | Select TimeCreated, Message | fl * | findstr /G:search.lst) -replace "  ","" -join "," | out-file -append test5.csv }


where search.lst:

TimeCreated
Security ID:
Account Name:
Account Domain:
Logon ID:
Logon Type:
Logon GUID:
Process Name:


Now I get the desired 'sleep' events into spreadsheet format. (My original concern was understanding why my Windows 7 PC spontaneously "resumes from sleep" by itself and seemingly commences a logon.)

$EventLogonIDs="1","42"
$MultipleIDLogEntries=Get-WinEvent -FilterHashtable @{Logname='system';Id=@($EventLogonIDs)}
foreach ($item in $MultipleIDLogEntries) {($item | Select TimeCreated, Message | fl * | findstr /I /G:search.lst) -replace "  ","" -join "," | out-file -append test6.csv }


where search.lst:

TimeCreated
sleep



Now I mux the two data sets and output the combined csv:

$a=gc .\test5.csv
$b=gc .\test6.csv
$c=$a+$b
$c | out-file test7.csv
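The muxed file can also be sorted chronologically before it reaches the spreadsheet. This is a sketch that assumes each line begins with "TimeCreated : <date>", as produced above:

```powershell
# Sort the muxed CSV by the TimeCreated value at the start of each line
gc .\test7.csv | where {$_} |
    Sort-Object { [datetime](($_ -split ',')[0] -replace 'TimeCreated\s*:\s*','') } |
    out-file test7_sorted.csv
```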

Once I translate the csv to a spreadsheet's native format, add column headers, format the Date/Time column (the unique identifier for our purposes), and sort by Date/Time, I have a storybook of events for the muxed security (e.g. 'logon') and system (e.g. 'sleep') events:


Next, in Part E, we discuss how to add additional Security auditing events to our storybook.

Bejtlich Teaching in Abu Dhabi in December

I'm pleased to announce that on December 12-13 at Black Hat Abu Dhabi I will teach a special two-day edition of TCP/IP Weapons School 3.0.

This class is designed for junior and intermediate security analysts. The "sweet spot" for the potential student is someone working in a security operations center (SOC) or computer incident response team (CIRT), or someone trying to establish one of those organizations. The class is very hands-on, and focuses on labs and discussions. There are fewer than 10 slides at the very beginning of the class, and I build the flow of the class based on what you want to hear.

If you would like details on the class, please see the linked site. You may also find my announcement for my Black Hat sessions on 30-31 July and 1-2 August to be helpful too. I'm looking forward to seeing you learn the investigative mindset needed to detect and respond to digital intrusions!

Black Hat has four remaining price points and deadlines for registration.

  • "Best" ends 15 August

  • "Early" ends 17 August

  • "Late" ends 12 December

  • Onsite starts at the conference


Seats are filling -- it pays to register early!

On a related note, we're almost one month away from my 8-9 August TCP/IP Weapons School 3.0 in San Francisco at USENIX Security 2011. Seats are filling in that class too!

I'm also still working on the details for a northern VA TCP/IP Weapons School 3.0 class. When I have them ready I will post them. Thank you.

Saturday, 02 July 2011

Get-Winevent Part III: Querying the Event Log for Logons (Part C)

To list the Opcodes, Event IDs, and Event Descriptions from any group of providers' events (e.g. Securit*), you can use:

$ProviderNames=get-winevent -listprovider microsoft-windows-Securit* | % {$_.Name}
$ProviderNames | % {((get-winevent -listprovider $_).events) | format-table @{Name="Opcode"; Expression = {$_.Opcode.Name}},ID,Description -auto -wrap}


To create a list of EventIDs from the Security log for which we want more information, we could download "Windows 7 and Windows Server 2008 R2 Security Event Descriptions", or we could select the matching lines from our provider query:

$ProviderNames | % {((get-winevent -listprovider $_).events) | format-table @{Name="Opcode"; Expression = {$_.Opcode.Name}},ID,Description | findstr "win:"} | out-file  SecurityIDs.txt
Select-string "log" -path SecurityIDs.txt | ft -auto Line

           4611 A trusted logon process has been registered with the Local ...
           4624 An account was successfully logged on....
           4625 An account failed to log on....
           4634 An account was logged off....
           4647 User initiated logoff:...
           4648 A logon was attempted using explicit credentials....
           4672 Special privileges assigned to new logon....
           4774 An account was mapped for logon....
           4775 An account could not be mapped for logon....
           4908 Special Groups Logon table modified....
           4964 Special groups have been assigned to a new logon....

Now we query all of the particular IDs in question, using the search list for findstr below:

$LogonIDs="4611","4624","4625","4634","4647","4648","4672","4774","4775","4908","4964"
foreach ($item in $LogonIDs) {(Get-WinEvent -max 100 -FilterHashtable @{Logname='security';Id=$item} | Select TimeCreated,Message | fl * | findstr /G:search.lst) -replace "  "," " | out-file -append "$item.txt" }

where search.lst:

TimeCreated
Security ID:
Account Name:
Account Domain:
Logon ID:
Logon Type:
Logon GUID:
Process Name:

This gives us some hits for the EventID numbers in separate files which contain entries that look like this:

PS C:\ps1> more 4624.txt
TimeCreated : 7/2/2011 7:25:59 PM
                  Security ID:        S-1-5-18
                  Account Name:        RMFVPC$
                  Account Domain:        RMFDEVELOPMENT
                  Logon ID:        0x3e7
              Logon Type:            5
                  Security ID:        S-1-5-18
                  Account Name:        SYSTEM
                  Account Domain:        NT AUTHORITY
                  Logon ID:        0x3e7
                  Logon GUID:        {00000000-0000-0000-0000-000000000000}
                  Process Name:        C:\Windows\System32\services.exe
....
But this type of query isn't very fast. So we can tuck our array of event IDs inside a '-FilterHashtable' query and then search the message field with 'findstr':

$LogonIDs="4611","4624","4625","4634","4647","4648","4672","4774","4775","4908","4964"
$MultipleIDLogEntries=Get-WinEvent -max 100 -FilterHashtable @{Logname='security';Id=@($LogonIDs)}
($MultipleIDLogEntries | Select TimeCreated,Message | fl * | findstr /G:search.lst) -replace "  "," " | out-file -append Events_all.txt

This output also contains entries that look like this:

PS C:\ps1> more Events_all.txt
TimeCreated : 7/2/2011 7:25:59 PM
                  Security ID:        S-1-5-18
                  Account Name:        SYSTEM
                  Account Domain:        NT AUTHORITY
                  Logon ID:        0x3e7
TimeCreated : 7/2/2011 7:25:59 PM
                  Security ID:        S-1-5-18
                  Account Name:        RMFVPC$
                  Account Domain:        RMFDEVELOPMENT
                  Logon ID:        0x3e7
              Logon Type:            5
                  Security ID:        S-1-5-18
                  Account Name:        SYSTEM
                  Account Domain:        NT AUTHORITY
                  Logon ID:        0x3e7
                  Logon GUID:        {00000000-0000-0000-0000-000000000000}
                  Process Name:        C:\Windows\System32\services.exe

But what we really need is a way to parse this output into a csv... so on to  Get-Winevent Part III: Querying the Event Log for Logons (Part D)

Friday, 01 July 2011

Google+ vs Facebook - Who will win?

Google+ is Google's latest attempt to take some of the social networking market from Facebook.


Now that I have Google+, I thought it would be interesting to do a short comparison. It is fairly obvious that Google is trying its hand at social networking and hoping to dive into Facebook's market. It will be interesting to see if that can actually happen.

As an evaluation, I decided to list the features that people use on Facebook and compare them to Google+.


Get-Winevent Part III: Querying the Event Log for Logons (Part B)

This is a long post that I've edited from an answer I gave on Stack Overflow. Although the post is about how to audit logon information in the Security log of Windows 7, it is also about discovering methods to extract critical information from the 'Message' field of a logon event ("Logon Type", ID=4624).




Get-WinEvent -max 100 | where { $_.Message | findstr /C:"Logon Type"}  | Select Message | fl * | findstr /C:"Logon Type"

  Logon Type:            5
  Logon Type:            7
  Logon Type:            7
  Logon Type:            7
  Logon Type:            7
  Logon Type:            5
  ...

Get-WinEvent Security -max 100 | Select ID,Level,Message | where { $_.Message | findstr /C:"Logon Type"}  | ft -auto -wrap | more

  Id Level Message
  -- ----- -------
4624     0 An account was successfully logged on.

           Subject:
               Security ID:        S-1-5-18
               Account Name:        RMFVPC$
               Account Domain:        RMFDEVELOPMENT
               Logon ID:        0x3e7

           Logon Type:            5
....
Now I introduce the '-FilterHashtable' parameter, which greatly speeds up queries. Below are two commands which do essentially the same thing in about the same period of time:

Get-WinEvent -max 100 -FilterHashtable @{Logname='security';ID=4624} | ft TimeCreated,MachineName,Message -auto -wrap | more

Get-WinEvent -max 100 -FilterHashtable @{Logname='security';ID=4624} | Select TimeCreated,MachineName,Message | ft -auto -wrap | more

TimeCreated           MachineName Message
-----------           ----------- -------
6/29/2011 12:36:35 PM rmfvpc      An account was successfully logged on.

                                  Subject:
                                      Security ID:        S-1-5-18
                                      Account Name:        RMFVPC$
                                      Account Domain:        RMFDEVELOPMENT
                                      Logon ID:        0x3e7

                                  Logon Type:            5
...

Get-WinEvent -max 100 -FilterHashtable @{Logname='security';ID=4624} | Select TimeCreated,MachineName,Message | Select-string "Logon Type" | more

@{TimeCreated=06/29/2011 12:36:35; MachineName=rmfvpc; Message=An account was successfully logged on.

Subject:
    Security ID:        S-1-5-18
    Account Name:        RMFVPC$
    Account Domain:        RMFDEVELOPMENT
    Logon ID:        0x3e7

Logon Type:            5
...
This last script allows me to dump pre-selected information from the logon events' Message field into a spreadsheet. Very useful.

 $LogonTypes=Get-WinEvent -FilterHashtable @{Logname='security';Id=4624}
 foreach ($item in $LogonTypes) {($item | Select TimeCreated, Message | fl * | findstr /G:search.lst) -replace "  ","" -join "," | out-file -append test3.csv }

where search.lst:

TimeCreated
Security ID:
Account Name:
Account Domain:
Logon ID:
Logon Type:
Logon GUID:
Process Name:

The result is a spreadsheet that looks like this:


However, what I need is to be able to search the message field of multiple 'logon' event types... so on to

Get-Winevent Part III: Querying the Event Log for Logons (Part C)