So you have a mountain of IIS logs, and you or someone in your organization is asking questions whose answers might be buried in that endless stream of log data. There are many ways to analyze IIS logs, and there is certainly no compelling reason to reinvent the wheel here – it's a well-beaten path. The great thing about Log Parser is that it's feature-rich, relatively easy to learn, and it gets you results very quickly.
Log Parser 2.2 is a command-line tool that supports querying text-based data in many log file formats. That is, familiar SQL-like statements can be executed against log files for fast access to the data within. In the case of Log Parser, the documentation included with the download is very reliable – so much so that Google is likely not the first place to go with questions about the tool (fight the urge!). Page one of the help files provides an overview of basic features such as a simple query of log files, output of query results to HTML, and the generation of a graph representing a query result, all via command-line instructions. No point in rehashing those basics here, as the documentation does a stellar job.
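For a taste of what those basics look like, a one-liner along these lines charts the most-requested URLs from a folder of IIS logs. Treat it as a sketch – the path, TOP count, and chart type are illustrative, and the -o:CHART output format requires the Office Web Components to be installed:

LogParser "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits INTO TopPages.gif FROM C:\Logs\LogRepository\*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C -o:CHART -chartType:Bar3D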
In my situation, I wanted to do some heavier reporting on the IIS log data and, furthermore, wanted to put the log data in the context of the application (which conveniently captures user and IP address data to SQL Server). Log Parser installs with both SQL Server and MS Access support.
To push query results to SQL Server, here is an example:
LogParser "SELECT * INTO Logs FROM C:\Logs\LogRepository\*.log" -iCheckPoint:C:\Logs\LogRepository\LogCheckpoint -o:SQL -server:(local) -database:LogWarehouse -driver:"SQL Server" -username:dev -password:supersecret -createTable:ON -ignoreIdCols:ON
After installing Log Parser you can run it from the Start Menu. Once the console is open, you can type (or paste) the above code at the command line. Log Parser queries (for the most part) begin with LogParser, so be sure to prepend it to your command. The actual query here is the trivial case in that I'm pushing ALL data into SQL Server.
SELECT * INTO ... FROM ...
More complex and/or specific commands are definitely possible!
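As a sketch of something more targeted (the field names follow the W3C logging format; the server, database, and table names are placeholders mirroring the example above), this pushes only server errors into their own table:

LogParser "SELECT date, time, c-ip, cs-uri-stem, sc-status INTO ErrorLogs FROM C:\Logs\LogRepository\*.log WHERE sc-status >= 500" -i:IISW3C -o:SQL -server:(local) -database:LogWarehouse -driver:"SQL Server" -createTable:ON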
The rest of the command is mostly connection-string-like configuration data, such as the server and database names along with credentials. The really interesting pieces are -iCheckPoint and the last two arguments, -createTable and -ignoreIdCols.
-iCheckPoint is a great feature of Log Parser. When this flag is set, a file is created with metadata about the position of the last entry queried. So if you have 100 log entries and you execute a query for the first time (a checkpoint doesn't exist yet), all 100 records are queried and a checkpoint is created. A future query will then only execute against new log entries (and files), and the checkpoint is updated. See the documentation for an explanation of how the checkpoint stores data and handles various logging scenarios.
Because I set the -createTable and -ignoreIdCols flags, a table will be created (if it doesn't exist) and any identity columns found will be ignored. If the table doesn't exist, Log Parser will create it with columns for the entire IIS log file format, regardless of whether the log file contains data for those columns. This is ideal, as you can turn logging options on and off without any added table maintenance.
LogID is not a Log Parser convention, but rather an identity column I added to the table for my own selfish reasons. You can easily set Log Parser to continuously add logs to your table as they come in, and -iCheckPoint gives Log Parser a fighting chance at scalability. I've pushed just over 2 million log entries to a SQL table at once and it was a manageable wait – approximately 10 minutes (Core2Quad, 4GB RAM, under existing load).
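Once the logs land in SQL Server, putting them in the context of the application is plain T-SQL. A sketch, assuming a hypothetical Users table (UserName, LastKnownIP) captured by the application, and that the generated table kept the hyphenated field names – adjust the column names to match whatever Log Parser actually created on your end:

-- Requests per application user, matched on client IP (schema is illustrative)
SELECT u.UserName, COUNT(*) AS Requests
FROM Logs l
JOIN Users u ON u.LastKnownIP = l.[c-ip]
GROUP BY u.UserName
ORDER BY Requests DESC;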
I can definitely say Log Parser is a great go-to tool, especially for prototyping or scenarios where results are needed (or just wanted) very quickly. Set-up is effortless, the documentation is fantastic, and the learning curve is friendly, so don't hesitate to give it a shot!