Log Parser Rocks! More than 50 Examples!

Log Parser is a tool that has been around for quite some time (almost six years, in fact).  I can’t really do any better than the description on the official download page, so here it is: “Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory”.  

Log Parser is a command line (yes, command line!) tool that uses a SQL dialect to extract information from data sources.  In particular, I have found it to be invaluable for extracting information from the web server logs of the sites that I manage and develop.

First, about that SQL syntax Log Parser uses to query the data sources… many developers seem to have a natural aversion to SQL.  In addition, many new data access frameworks attempt to abstract SQL away from the developer.  However, I have always found SQL easy to work with and believe it to be an essential tool that every developer should at least have a working knowledge of.   For Log Parser, all that is necessary is a basic understanding of the core SQL SELECT statement, as implemented within Microsoft’s SQL Server (that is, T-SQL).  That means you should be familiar with the following elements of a SELECT statement: TOP, FROM, INTO, WHERE, ORDER BY, GROUP BY.  That’s all you need to perform most Log Parser operations.
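To illustrate how those elements fit together, here is a skeleton query (hypothetical, using the same [LogFileName] placeholder as the examples below): TOP limits the number of rows returned, INTO redirects the output (here to a CSV file), WHERE filters, GROUP BY aggregates, and ORDER BY sorts.

```sql
SELECT TOP 25
    cs-uri-stem,
    COUNT(*) AS Hits
INTO results.csv
FROM [LogFileName]
WHERE sc-status = 200
GROUP BY cs-uri-stem
ORDER BY Hits DESC
```

That one shape, with different fields and functions plugged in, covers nearly every example in this post.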

Curiously, Log Parser has never received the amount of attention that I think it deserves.  Beyond a flurry of attention when it was first released, it seems to be mentioned rarely in official Microsoft communications or blogs.  Despite that, it remains a viable and valuable tool for parsing not just web server log files, but all types of structured text-based data.

In this post, rather than explaining how to use Log Parser, I’ll give a number of examples of its use.  In addition, I’ll document some useful locations where Log Parser information can be found on the web.

Examples

Keep in mind that most of the examples that I give here are all-in-one command line queries (even though many wrap to multiple lines when displayed here).  However, queries can also be run as

logparser file:XXXXX.sql

where XXXXX is the name of a file containing a logparser-friendly sql query.  There are a couple examples of this in the following list.
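For example, a file named hits.sql (a hypothetical name) might contain:

```sql
SELECT cs-uri-stem, COUNT(*) AS Hits
FROM [LogFileName]
GROUP BY cs-uri-stem
ORDER BY Hits DESC
```

which would then be run as logparser file:hits.sql.  The file form is handy for long queries that would be unwieldy on the command line.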

The examples given here have been obtained from a variety of sources, including the documentation that ships with the tool, blogs and online documentation, and my own experience.  Unfortunately, I don’t have a record of the origin of each individual example, as I’ve compiled these piecemeal over the last two or three years.

I hope you’ll find something useful here and gain an appreciation for just how robust this tool is.

1)  All page hits by a given IP address

logparser "select cs-uri-stem, count(cs-uri-stem) as requestcount from [LogFileName] where c-ip = '000.00.00.000' group by cs-uri-stem order by count(cs-uri-stem) desc"

2) Hits on a particular page by IP address

logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] where cs-uri-stem like '/search.aspx%' group by c-ip order by count(c-ip) desc"

3)  ReverseDNS example.  This attempts to find the domain associated with a given IP address.

logparser "select c-ip, REVERSEDNS(c-ip) from [LogFileName] where c-ip = '000.00.00.000' group by c-ip"

4)  CSV example. All hits on a page, written to a CSV file.

logparser "select * into OUTPUT.CSV from [LogFileName] where cs-uri-stem like '/pagename.aspx'"

5)  Chart example.  All hits on a page by an IP address, displayed on a chart.

logparser "select c-ip, count(c-ip) as requestcount into logparserchart.gif from [LogFileName] where cs-uri-stem like '/pagename.aspx' group by c-ip order by count(c-ip) desc" -o:chart

6)  Hits per hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.000' group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"

7)  Basic list of IP addresses generating traffic

logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] group by c-ip order by count(c-ip) desc"

8)  Basic list of pages being hit

logparser "select cs-uri-stem, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%' group by cs-uri-stem order by count(cs-uri-stem) desc"

9)  Basic list of pages being hit, including which IPs are doing the hitting

logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%' group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"

10)  Pages being hit after a specific date and time

logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') and date='2009-06-04' and time > '15:00:00' group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"

11)  Counts of hits of ASPX/ASHX pages by hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.00' and (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"

12)  Counts of hits against specific pages by hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), cs-uri-stem, count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.00' and (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600)), cs-uri-stem order by numberrequests desc"

13)  Top browsers

logparser "Select top 50 to_int(mul(100.0,PropCount(*))) as Percent, count(*) as TotalHits, cs(User-Agent) as Browser from [LogFileName] group by Browser order by Totalhits desc"

14)  Hourly Bandwidth (chart)

logparser "Select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) As Hour, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByHour.gif From [LogFileName] Group By Hour"

15)  Requests by URI

logparser "SELECT top 80 QUANTIZE(TO_TIMESTAMP(date, time), 3600) as Hour, TO_LOWERCASE(STRCAT('/',EXTRACT_TOKEN(cs-uri-stem,1,'/'))) as URI, COUNT(*) AS RequestsPerHour, SUM(sc-bytes) AS TotBytesSent, AVG(sc-bytes) AS AvgBytesSent, Max(sc-bytes) AS MaxBytesSent, ADD(1,DIV(Avg(time-taken),1000)) AS AvgTime, ADD(1,DIV(MAX(time-taken),1000)) AS MaxTime FROM [LogFileName] GROUP BY Hour, URI Having RequestsPerHour > 10 ORDER BY RequestsPerHour ASC"

16)  Top 10 Images by size

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent From [LogFileName] Where (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('gif';'jpg';'png')) AND (sc-status = 200) Group By To_Lowercase(cs-uri-stem) Order By BytesSent, Hits, MaxTime DESC"

17)  Top 10 URLs for a website, with total hits, max time to serve, and average time to serve

logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent FROM [LogFileName] GROUP BY cs-uri-stem ORDER BY TotalHits DESC"

18)  Top 20 clients

logparser "Select Top 20 c-ip AS Client, Count(*) AS Hits INTO Chart.gif FROM [LogFileName] GROUP BY c-ip ORDER BY Hits Desc"

19)  Referrer Broken Links (i.e. external references to broken links on your site)

logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.html FROM [LogFileName] WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus=0)" -tpl:ReferBrokenLinks.tpl

20)  Status codes

logparser "SELECT sc-status As Status, COUNT(*) As Number INTO StatusCodes.gif FROM [LogFileName] GROUP BY Status ORDER BY Status"

21)  Search the Event Log for W3SVC (IIS) log entries and color-code them as Error, Warning, or Information.  This example writes the output of the query to an HTML file that is generated using a template file.

logparser "SELECT TimeGenerated,EventTypeName,Strings,Message,CASE EventTypeName WHEN 'Error event' THEN 'RED' WHEN 'Warning event' THEN 'YELLOW' WHEN 'Information event' THEN 'WHITE' ELSE 'BLUE' END As Color INTO file.html FROM System WHERE SourceName = 'W3SVC'" -tpl:IISEventLogEntries.tpl

Where IISEventLogEntries.tpl is a file that contains the following:

<LPHEADER>
<HTML>
<HEAD>
  <STYLE>
    TD { font-family: Arial; }
    TH { font-family: Arial; }
  </STYLE>
</HEAD>
<BODY>
<TABLE BORDERCOLOR="BLACK" BORDER="1" CELLPADDING="2" CELLSPACING="2">
<TR>
  <TH COLSPAN=4 BGCOLOR="BLACK"><FONT COLOR=WHITE>New W3SVC Messages in System Event Log</FONT></TH>
</TR>
<TR>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Time Generated</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Event Type</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Strings</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Message</TH>
</TR>
</LPHEADER>
<LPBODY>
<TR bgCOLOR="%Color%">
  <TD>%TimeGenerated%</TD>
  <TD>%EventTypeName%</TD>
  <TD>%Strings%</TD>
  <TD>%Message%</TD>
</TR>
</LPBODY>
</TABLE>
</BODY>
</HTML>

22)  Upload Log Parser query results directly to a table in SQL Server

logparser "select * into LogTable from [LogFileName] where cs-uri-stem like '/folder/filename%'" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]

23)  Top 10 images by size sent.  Note that this example also shows how to query multiple log files at once.

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO TOP10ImagesBySize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('gif';'jpg';'png')) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"

24)  Browser types (two different approaches)

logparser "SELECT distinct cs(User-Agent), count(*) as hits INTO useragentsalltypes.txt FROM logs\iis\ex*.log GROUP BY cs(user-agent) ORDER BY hits DESC"

logparser "SELECT TO_INT(MUL(100.0,PROPCOUNT(*))) AS Percent,  COUNT(*) AS Hits, cs(User-Agent) as Browser INTO  UseragentsHits.txt FROM  logs\iis\ex*.log  GROUP BY Browser ORDER BY HITS DESC"

25)  Unique visitors per day.  This requires two queries.  The first query selects from the IIS logs into a CSV file, and the second selects from that CSV file.

logparser "SELECT DISTINCT cs-username, date INTO tempUniqueVisitorsPerDay.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL Group By Date, cs-username"

logparser "SELECT date, count(cs-username) as UniqueVisitors into test.txt FROM tempUniqueVisitorsPerDay.csv GROUP BY date"

26)  Top 10 largest ASPX pages.

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO top10pagesbysize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('aspx')) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"

27)  Top 10 slowest ASPX pages

logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = 'aspx' GROUP BY cs-uri-stem ORDER BY MaxTime DESC"

28)  Top 10 slowest ASPX pages on a specific day

logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = 'aspx' AND TO_STRING(To_timestamp(date, time), 'MMdd')='1003' GROUP BY cs-uri-stem ORDER BY MaxTime DESC"

29)  Daily bandwidth

logparser "Select To_String(To_timestamp(date, time), 'MM-dd') As Day, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByDay.gif From logs\iis\ex*.log Group By Day"

30)  Bandwidth by hour

logparser "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, SUM(sc-bytes) AS TotalBytesSent INTO BytesSentPerHour.gif FROM logs\iis\ex*.log GROUP BY Hour ORDER BY Hour"

31)  Average page load time per user

logparser "Select Top 20 cs-username AS UserName, AVG(time-taken) AS AvgTime,  Count(*) AS Hits INTO AvgTimePerUser.txt FROM logs\iis\ex*.log WHERE cs-username IS NOT NULL GROUP BY cs-username ORDER BY AvgTime DESC"

32)  Average page load time for a specific user

logparser "Select cs-username AS UserName, AVG(time-taken) AS AvgTime, Count(*) AS Hits INTO AvgTimeOnSpecificUser.txt FROM logs\iis\ex*.log WHERE cs-username = 'CONTOSO\User1234' GROUP BY cs-username"

33)  Error trends.  This query is quite long, and is easier expressed in a text file than on the command line.  So, Log Parser reads and executes the query contained in the specified text file.

logparser file:errortrend.sql

Where errortrend.sql contains the following:

SELECT
  TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
  SUM(c200) AS 200s,
  SUM(c206) AS 206s,
  SUM(c301) AS 301s,
  SUM(c302) AS 302s,
  SUM(c304) AS 304s,
  SUM(c400) AS 400s,
  SUM(c401) AS 401s,
  SUM(c403) AS 403s,
  SUM(c404) AS 404s,
  SUM(c500) AS 500s,
  SUM(c501) AS 501s,
  SUM(c502) AS 502s,
  SUM(c503) AS 503s,
  SUM(c504) AS 504s,
  SUM(c505) AS 505s
USING
  CASE sc-status WHEN 200 THEN 1 ELSE 0 END AS c200,
  CASE sc-status WHEN 206 THEN 1 ELSE 0 END AS c206,
  CASE sc-status WHEN 301 THEN 1 ELSE 0 END AS c301,
  CASE sc-status WHEN 302 THEN 1 ELSE 0 END AS c302,
  CASE sc-status WHEN 304 THEN 1 ELSE 0 END AS c304,
  CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
  CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
  CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
  CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
  CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
  CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
  CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
  CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
  CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
  CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO ErrorChart.gif
FROM
    logs\iis\ex*.log
GROUP BY
  Day
ORDER BY
  Day

34)  Win32 errors

logparser "SELECT sc-win32-status as ErrorNumber, WIN32_ERROR_DESCRIPTION(sc-win32-status) as ErrorDesc, Count(*) AS Total INTO Win32ErrorNumbers.txt FROM logs\iis\ex*.log WHERE sc-win32-status>0 GROUP BY ErrorNumber ORDER BY Total DESC"

35)  Substatus codes

logparser "SELECT sc-status, sc-substatus, Count(*) AS Total INTO 401subcodes.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY sc-status, sc-substatus ORDER BY sc-status, sc-substatus DESC"

36)  Substatus codes per day.  This is another example of executing a query contained in a text file.

logparser file:substatusperday.sql

Where substatusperday.sql contains the following:

SELECT
  TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
  SUM(c1) AS 4011,
  SUM(c2) AS 4012,
  SUM(c3) AS 4013,
  SUM(c4) AS 4014,
  SUM(c5) AS 4015,
  SUM(c7) AS 4017
USING
  CASE sc-substatus WHEN 1 THEN 1 ELSE 0 END AS c1,
  CASE sc-substatus WHEN 2 THEN 1 ELSE 0 END AS c2,
  CASE sc-substatus WHEN 3 THEN 1 ELSE 0 END AS c3,
  CASE sc-substatus WHEN 4 THEN 1 ELSE 0 END AS c4,
  CASE sc-substatus WHEN 5 THEN 1 ELSE 0 END AS c5,
  CASE sc-substatus WHEN 7 THEN 1 ELSE 0 END AS c7
INTO
  401subcodesperday.txt
FROM
  logs\iis\ex*.log
WHERE
  sc-status=401
GROUP BY
  Day
ORDER BY
  Day

37)  Substatus codes per page

logparser "SELECT TOP 20 cs-uri-stem, sc-status, sc-substatus, Count(*) AS Total INTO 401Pagedetails.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY cs-uri-stem, sc-status, sc-substatus ORDER BY Total"

38)  MB sent per HTTP status code

logparser "SELECT sc-status AS StatusCode, DIV(SUM(sc-bytes),1048576) AS MBSent INTO MBSentPerStatusCode.txt FROM logs\iis\ex*.log GROUP BY sc-status ORDER BY MBSent DESC"

39) 500 errors per ASPX and Domain User

logparser "SELECT cs-username, cs-uri-stem, count(*) as Times INTO 500PagesByUserAndPage.txt FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY  cs-username, cs-uri-stem ORDER BY Times DESC"

40)  Percent of 500 errors caused by each user

logparser "SELECT cs-username, count(*) as Times, propcount(*) as Percent INTO 500ErrorsByUser.csv FROM  logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username ORDER BY Times DESC"

41)  Determine what percentage of the total bytes sent are being caused by each page type

logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.txt FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"

42)  Top 20 pages with a specific HTTP return code

logparser "SELECT TOP 20 cs-uri-stem, sc-status, Count(*) AS Total INTO TOP20PagesWith401.txt FROM logs\iis\ex*.log WHERE TO_LOWERCASE(cs-uri-stem) LIKE '%.aspx' and sc-status=401 GROUP BY cs-uri-stem, sc-status ORDER BY Total, cs-uri-stem, sc-status DESC"

43)  Check traffic from IP addresses

logparser "Select c-ip AS Client, Div(Sum(cs-bytes),1024) As IncomingBytes(K), Div(Sum(sc-bytes),1024) As OutgoingBytes(K), MAX(time-taken) as MaxTime, AVG(time-taken) as AvgTime, count(*) as hits INTO errorsperip.txt FROM logs\iis\ex*.log GROUP BY client ORDER BY Hits DESC"

44)  Check errors by IP address

logparser file:errorbyip.sql

Where errorbyip.sql contains the following:

Select
  c-ip AS Client,
  SUM(c400) AS 400s,
  sum(c401) AS 401s,
  SUM(c403) AS 403s,
  SUM(c404) AS 404s,
  SUM(c500) AS 500s,
  SUM(c501) AS 501s,
  SUM(c502) AS 502s,
  SUM(c503) AS 503s,
  SUM(c504) AS 504s,
  SUM(c505) AS 505s
USING
  CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
  CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
  CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
  CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
  CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
  CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
  CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
  CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
  CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
  CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO
  IPNumberFileName.txt
FROM
    logs\iis\ex*.log
WHERE
    c-ip='<IP address goes here>'
GROUP BY
    client

45)  Find broken links

logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.txt FROM logs\iis\ex*.log WHERE cs(Referer) IS NOT NULL AND sc-status=404 AND (sc-substatus IS NULL OR sc-substatus=0)"

46)  Top 10 pages with most hits

logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent INTO Top10Urls.txt FROM logs\iis\ex*.log GROUP BY cs-uri-stem ORDER BY TotalHits DESC"

47)  Unique users per browser type (requires two queries)

logparser "SELECT DISTINCT cs-username, cs(user-agent) INTO UserAgentsUniqueUsers1.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL GROUP BY cs-username, cs(user-agent)"

logparser "SELECT cs(user-agent), count(cs-username) as UniqueUsersPerAgent, TO_INT(MUL(PROPCOUNT(*), 100)) AS Percentage INTO UniqueUsersPerAgent.txt FROM UserAgentsUniqueUsers1.csv GROUP BY  cs(user-agent) ORDER BY UniqueUsersPerAgent DESC"

48)  Bytes sent per file extension

logparser "SELECT EXTRACT_EXTENSION( cs-uri-stem ) AS Extension, MUL(PROPSUM(sc-bytes),100.0) AS PercentageOfBytes, Div(Sum(sc-bytes),1024) as AmountOfMbBytes INTO BytesPerExtension.txt FROM logs\iis\ex*.log GROUP BY Extension ORDER BY PercentageOfBytes DESC"

49)  Domains referring traffic to your site

logparser "SELECT EXTRACT_TOKEN(cs(Referer), 2, '/') AS Domain, COUNT(*) AS [Requests] INTO ReferringDomains.txt FROM logs\iis\ex*.log GROUP BY Domain ORDER BY Requests DESC"

50)  OS types (requires two queries)

logparser "SELECT DISTINCT c-ip, cs(user-agent) INTO UserAgentsUniqueUsers.csv FROM logs\iis\ex*.log WHERE c-ip <> NULL GROUP BY c-ip, cs(user-agent)"

logparser file:getos.sql

Where getos.sql contains the following:

SELECT
  SUM (c70) AS Win7,
  SUM (c60) AS Vista,
  SUM (c52) AS Win2003,
  SUM (c51) AS WinXP,
  SUM (C50) AS Win2000,
  SUM (W98) AS Win98,
  SUM (W95) AS Win95,
  SUM (W9x) AS Win9x,
  SUM (NT4) AS WinNT4,
  SUM (OSX) AS OS-X,
  SUM (Mac) AS Mac-,
  SUM (PPC) AS Mac-PPC,
  SUM (Lnx) AS Linux
USING
  CASE strcnt(cs(User-Agent),'Windows+NT+6.1') WHEN 1 THEN 1 ELSE 0 END AS C70,
  CASE strcnt(cs(User-Agent),'Windows+NT+6.0') WHEN 1 THEN 1 ELSE 0 END AS C60,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.2') WHEN 1 THEN 1 ELSE 0 END AS C52,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.1') WHEN 1 THEN 1 ELSE 0 END AS C51,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.0') WHEN 1 THEN 1 ELSE 0 END AS C50,
  CASE strcnt(cs(User-Agent),'Win98') WHEN 1 THEN 1 ELSE 0 END AS W98,
  CASE strcnt(cs(User-Agent),'Win95') WHEN 1 THEN 1 ELSE 0 END AS W95,
  CASE strcnt(cs(User-Agent),'Win+9x+4.90') WHEN 1 THEN 1 ELSE 0 END AS W9x,
  CASE strcnt(cs(User-Agent),'Winnt4.0') WHEN 1 THEN 1 ELSE 0 END AS NT4,
  CASE strcnt(cs(User-Agent),'OS+X') WHEN 1 THEN 1 ELSE 0 END AS OSX,
  CASE strcnt(cs(User-Agent),'Mac') WHEN 1 THEN 1 ELSE 0 END AS Mac,
  CASE strcnt(cs(User-Agent),'PPC') WHEN 1 THEN 1 ELSE 0 END AS PPC,
  CASE strcnt(cs(User-Agent),'Linux') WHEN 1 THEN 1 ELSE 0 END AS Lnx
INTO
  GetOSUsed.txt
FROM
  UserAgentsUniqueUsers.csv

51)  Get timeout errors from the server Event Log.  Display results in a datagrid.

logparser "select * from \\servername\application where message like '%timeout expired%'" -i:EVT -o:datagrid

52)  Get exceptions from the server Event (Application) Log

logparser "select timegenerated, eventtypename, eventcategoryname, message into webserverlog.csv from \\servername\application where message like '%myapplication%exception%'" -i:EVT

Links

Check out the links below to find more in-depth discussion of Log Parser, as well as even more examples of its usage.


Comments

    1. petarp says:

      Thnx! The BEST list of useful examples for MS Logparser one can find on the Web. The power of logparser is unbelievable. I will only say that I’m also using the Log Parser Lizard GUI from Lizard Labs and I will recommend it to every Logparser user, developer and system administrator.

    2. Great stuff! Thank you!


    6. Praveen says:

      I am unable to run the script. How do you pass the path to the log file?

      • mlichtenberg says:

        Simply include the path to your log file or files in the FROM clause of the log parser query.

        For example, let’s say you have a folder named c:\logs that contains the log files ex120301.log and ex120302.log. To examine one of the log files, the logparser query would look something like "select * from c:\logs\ex120301.log". To examine both log files in a single query, use "select * from c:\logs\ex12*.log".

        If the path to your log files includes spaces, wrap the path in single quotes. For example "select * from 'c:\my logs\ex120301.log'".

        Hope that helps!


    8. rodvars says:

      Tnx for the post, very useful information.

    9. jd says:

      nice blog… I am new to logparser and would like to find out how I can separate the date and the time from a field which shows date/time together.
      e.g. Select Field2 from abc.log
      -> this will return “5/10/2012 6:26:19 PM”
      How can I display the date and time in different fields?

      • mlichtenberg says:

        Look at the TO_DATE and TO_TIME functions. The format for both is FUNCTIONNAME(TIMESTAMP). You may have to convert the date/time values from your log file to timestamps before passing them to the TO_DATE/TO_TIME functions. Use the TIMESTAMP function to do this. The format of that function is TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss'), where the second argument specifies the format of the data in Field2. Putting it all together, your log parser select might look something like "select TO_DATE(TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss')), TO_TIME(TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss')) FROM abc.log".

        Hope that helps. If I haven’t already mentioned it, I highly recommend the book “Microsoft Log Parser Toolkit”. Available from Amazon.com and Barnes and Noble. You can also find the book at other online booksellers, if you prefer.


    11. Sinead says:

      thanks for this… it works brilliantly. any idea how to convert the cs-uri-stem to all one case before you import it into a SQL Server database? I have tried some combinations but the syntax fails. I wonder if it’s possible. thanks

      • mlichtenberg says:

        You should be able to use the TO_LOWERCASE or TO_UPPERCASE functions to convert cs-uri-stem to all lowercase or all uppercase. Here is a modified version of Example 22 that includes the use of the TO_UPPERCASE function:

        logparser "select distinct TO_UPPERCASE(cs-uri-stem) into LogTable from [LogFileName] where cs-uri-stem like '/folder/filename%'" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]

        Hope that helps.


    14. owen buttolph says:

      Thanks for this – really interesting. I’m having problems outputting to SQL. I am running Log Parser 2.2 and have tried outputting to SQL Server but get this error message:

      Task aborted.
      Error connecting to ODBC Server
      SQL State: 08001
      Native Error: 17
      Error Message: [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server
      does not exist or access denied.

      Anyone else experience this?

      • mlichtenberg says:

        I’ve not encountered that with Log Parser, but the error message is a fairly standard SQL Server response. It looks like the servername, username, and/or password you supplied were incorrect. Double-check the values you supplied for the -server, -username, and -password Log Parser parameters, and make sure they are valid for your database server.

    15. Rodrigo says:

      I have found that the field cs-uri-stem can contain whitespace… therefore my logs don’t match the column fields in my SQL Server database, using LogParser 2.2.

      For example, I have found a request like this:

      /images/costa rica.jpg

      On my database I see:
      cs-uri-stem cs-query
      /images/costa rica.jpg

      Has anyone else had this problem?

      I am using the w3c format, to output some custom fields.

      Thanks!


    18. Very nice and useful post!
      Congrats!


    20. yaseen says:

      Hi, in my query I want the top 20 users by application usage, but I want to filter on the time field, where the auto year is 1/1/2000. I tried to use TIMESTAMP, which gives a syntax error. Please help; I need your expertise. Thanks a lot for the help.



    23. Fernando says:

      does logparser support subqueries? I’m trying to parse an XML file. I can get all the items, but I also need another field that is outside the items list and is the same for every item. So I need to add it as another column with a subquery. Is that possible? THANKS A LOT

      • mlichtenberg says:

        Hmmm, I don’t really know. My initial guess is that subqueries are not supported. I’ll see what I can find out… to help me reproduce your situation, can you give an example of the type of query you would like to use?

        • I need to calculate method execution durations. I have the method name, a query identifier, and start/stop events:
          30.09.2013 15:28:05 Start MethodName QueryGuid
          ***** Some logs ****
          30.09.2013 15:58:32 Stop MethodName QueryGuid

          Is there any way to parse this? Currently I’m using Excel for it, but it is rather slow.

        • mlichtenberg says:

          Does this get you what you need?

          SELECT MethodName, QueryGuid,
          SUB(
          MAX(TO_INT(TO_TIMESTAMP(Date, Time))),
          MIN(TO_INT(TO_TIMESTAMP(Date, Time)))
          ) AS SecondsElapsed
          FROM YourLogFile
          GROUP BY MethodName, QueryGuid

          For each “MethodName” and “QueryGuid”, this calculates the difference (SUB) between the integer representations (TO_INT) of the start (MIN) and stop (MAX) timestamps. The result of the query is the number of seconds elapsed between each start and stop event.

          One thing that is probably not obvious is that integer representations of TimeStamp values are the number of seconds elapsed since January 1, year 0.

          Hope that makes sense.

        • /*mlichtenberg says: Does this get you what you need? */

          I cannot reply to your comment directly.
          You helped me a lot! Big thanks! It is really what I need.

      • mlichtenberg says:

        I can confirm that subqueries such as the following are supported by Log Parser…

        logparser -i:EVT "select * from System where TimeGenerated in (select max(TimeGenerated) from Security where EventID = 1100)"

        However, I suspect that’s not the type of subquery that you were asking about. Does this help at all, and did you find a way to do what you needed?

    24. Robert says:

      SELECT TO_Date TIMESTAMP=(DateCreated, ‘MM-dd-yyyy hh:mm:ss’) DateCreated AS DateCreated, ID as EventID, Count (ID) as EventIDCount, LevelDisplayName as Severity, Server as Server,
      FROM c:\temp\Data-Test-JanFull-Test2.csv
      Where Severity LIKE ‘%Critical%’ OR Severity LIKE ‘%Warning%’
      GROUP BY DateCreated,ID,Severity,Server
      Order By DateCreated Asc

      Is there any way that I can separate a field that has “Date and Time” in the same field? I only want to use the date from the field, not the time.

      Thanks

