Log Parser Rocks! More than 50 Examples!

Log Parser is a tool that has been around for quite some time (almost six years, in fact).  I can’t really do any better than the description on the official download page, so here it is: “Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory”.  
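
For a taste of one of those non-web-log data sources, here is a quick sketch that uses the file system input format to list the ten largest files in a folder (the path here is just an example; -i:FS tells Log Parser to read the file system itself):

logparser "SELECT TOP 10 Path, Size FROM C:\Windows\*.* ORDER BY Size DESC" -i:FS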

Log Parser is a command line (yes, command line!) tool that uses a SQL dialect to extract information from data sources.  In particular, I have found it to be invaluable for extracting information from the web server logs of the sites that I manage and develop.

First, about that SQL syntax Log Parser uses to query the data sources… many developers seem to have a natural aversion to SQL.  In addition, many new data access frameworks attempt to abstract SQL away from the developer.  However, I have always found SQL easy to work with and believe it to be an essential tool that every developer should at least have a working knowledge of.   For Log Parser, all that is necessary is a basic understanding of the core SQL SELECT statement, as implemented within Microsoft’s SQL Server (that is, T-SQL).  That means you should be familiar with the following elements of a SELECT statement: TOP, FROM, INTO, WHERE, ORDER BY, GROUP BY.  That’s all you need to perform most Log Parser operations.
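
For reference, here is a schematic query that uses all of those elements at once; the field names come from the IIS W3C log format used throughout this post, and [LogFileName] is a placeholder:

logparser "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits INTO results.txt FROM [LogFileName] WHERE sc-status = 200 GROUP BY cs-uri-stem ORDER BY Hits DESC"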

Curiously, Log Parser has never received the amount of attention that I think it deserves.  Beyond a flurry of attention when it was first released, it seems to be mentioned rarely in official Microsoft communications or blogs.  Despite that, it remains a viable and valuable tool for parsing not just web server log files, but all types of structured text-based data.

In this post, rather than explaining how to use Log Parser, I’ll give a number of examples of its use.  In addition, I’ll document some useful locations where Log Parser information can be found on the web.

Examples

Keep in mind that most of the examples that I give here are all-in-one command line queries (even though many wrap to multiple lines when displayed here).  However, queries can also be run as

logparser file:XXXXX.sql

where XXXXX is the name of a file containing a Log Parser-friendly SQL query.  There are a couple of examples of this in the following list.
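
Query files can also take parameters: Log Parser substitutes %name% tokens in the file with values supplied after the file name, so a single query file can be reused against different logs.  A minimal sketch (the file and parameter names here are just examples):

logparser file:toppages.sql?logfile=ex110101.log

where toppages.sql contains:

SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits
FROM %logfile%
GROUP BY cs-uri-stem
ORDER BY Hits DESC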

The examples given here have been obtained from a variety of sources, including the documentation that ships with the tool, blogs and online documentation, and my own experience.  Unfortunately, I don’t have a record of the origin of each individual example, as I’ve compiled these piecemeal over the last two or three years.

I hope you’ll find something useful here and gain an appreciation for just how robust this tool is.

1)  All page hits by a given IP address

logparser "select cs-uri-stem, count(cs-uri-stem) as requestcount from [LogFileName] where c-ip = ‘000.00.00.000’ group by cs-uri-stem order by count(cs-uri-stem) desc"

2) Hits on a particular page by IP address

logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] where cs-uri-stem like ‘/search.aspx%’ group by c-ip order by count(c-ip) desc"

3)  ReverseDNS example.  This attempts to find the domain associated with a given IP address.

logparser "select c-ip, REVERSEDNS(c-ip) from [LogFileName] where c-ip = ‘000.00.00.000’ group by c-ip"

4)  CSV example.  All hits on a page, written to a CSV file.

logparser "select * into OUTPUT.CSV from [LogFileName] where cs-uri-stem like ‘/pagename.aspx’"

5)  Chart example.  All hits on a page by an IP address, displayed on a chart.

logparser "select c-ip, count(c-ip) as requestcount into logparserchart.gif from [LogFileName] where cs-uri-stem like ‘/pagename.aspx’ group by c-ip order by count(c-ip) desc" -o:chart

6)  Hits per hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.000′ group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"

7)  Basic list of IP addresses generating traffic

logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] group by c-ip order by count(c-ip) desc"

8)  Basic list of pages being hit

logparser "select cs-uri-stem, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ group by cs-uri-stem order by count(cs-uri-stem) desc"

9)  Basic list of pages being hit, including which IPs are doing the hitting

logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"

10)  Pages being hit after a specific date and time

logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ and date=’2009-06-04′ and time > ’15:00:00′ group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"

11)  Counts of hits of ASPX/ASHX pages by hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.00′ and (cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’) group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"

12)  Counts of hits against specific pages by hour from a particular IP address

logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), cs-uri-stem, count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.00′ and (cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’) group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600)), cs-uri-stem order by numberrequests desc"

13)  Top browsers

logparser "Select top 50 to_int(mul(100.0,PropCount(*))) as Percent, count(*) as TotalHits, cs(User-Agent) as Browser from [LogFileName] group by Browser order by Totalhits desc"

14)  Hourly Bandwidth (chart)

logparser "Select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) As Hour, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByHour.gif From [LogFileName] Group By Hour"

15)  Requests by URI

logparser "SELECT top 80 QUANTIZE(TO_TIMESTAMP(date, time), 3600) as Hour, TO_LOWERCASE(STRCAT(‘/’,EXTRACT_TOKEN(cs-uri-stem,1,’/’))) as URI, COUNT(*) AS RequestsPerHour, SUM(sc-bytes) AS TotBytesSent, AVG(sc-bytes) AS AvgBytesSent, Max(sc-bytes) AS MaxBytesSent, ADD(1,DIV(Avg(time-taken),1000)) AS AvgTime, ADD(1,DIV(MAX(time-taken),1000)) AS MaxTime FROM [LogFileName] GROUP BY Hour, URI Having RequestsPerHour > 10 ORDER BY RequestsPerHour ASC"

16)  Top 10 Images by size

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent From [LogFileName] Where (Extract_Extension(To_Lowercase(cs-uri-stem)) IN (‘gif’;’jpg’;’png’)) AND (sc-status = 200) Group By To_Lowercase(cs-uri-stem) Order By BytesSent, Hits, MaxTime DESC"

17)  Top 10 URLs for a website, with total hits, max time to serve, and average time to serve

logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),’/’) AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent FROM [LogFileName] GROUP BY cs-uri-stem ORDER BY TotalHits DESC"

18)  Top 20 clients

logparser "Select Top 20 c-ip AS Client, Count(*) AS Hits INTO Chart.gif FROM [LogFileName] GROUP BY c-ip ORDER BY Hits Desc"

19)  Referrer Broken Links (i.e. external references to broken links on your site)

logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.html FROM [LogFileName] WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus=0)" -tpl:ReferBrokenLinks.tpl

20)  Status codes.  Note that the <2> in the FROM clause is Log Parser shorthand for the log files of the IIS site with ID 2; a log file path works there as well.

logparser "SELECT sc-status As Status, COUNT(*) As Number INTO StatusCodes.gif FROM <2> GROUP BY Status ORDER BY Status"

21)  Search the Event Log for W3SVC (IIS) log entries and color-code them as Error, Warning, or Information.  This example writes the output of the query to an HTML file that is generated using a template file.

logparser "SELECT TimeGenerated,EventTypeName,Strings,Message,CASE EventTypeName WHEN ‘Error event’ THEN ‘RED’ WHEN ‘Warning event’ THEN ‘YELLOW’ WHEN ‘Information event’ THEN ‘WHITE’ ELSE ‘BLUE’ END As Color INTO file.html FROM System WHERE SourceName = ‘W3SVC’"  -tpl:IISEventLogEntries.tpl

Where IISEventLogEntries.tpl is a file that contains the following:

<LPHEADER>
<HTML>
<HEAD>
  <STYLE>
    TD { font-family: Arial }
    TH { font-family: Arial }
  </STYLE>
</HEAD>
<BODY>
<TABLE BORDERCOLOR="BLACK" BORDER="1" CELLPADDING="2" CELLSPACING="2">
<TR>
  <TH COLSPAN=4 BGCOLOR="BLACK"><FONT COLOR=WHITE>New W3SVC Messages in System Event Log</FONT></TH>
</TR>
<TR>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Time Generated</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Event Type</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Strings</TH>
  <TH ALIGN=LEFT BGCOLOR="#C0C0C0">Message</TH>
</TR>
</LPHEADER>
<LPBODY>
<TR bgCOLOR="%Color%">
  <TD>%TimeGenerated%</TD>
  <TD>%EventTypeName%</TD>
  <TD>%Strings%</TD>
  <TD>%Message%</TD>
</TR>
</LPBODY>
</TABLE>
</BODY>
</HTML>

22)  Upload Log Parser query results directly to a table in SQL Server

logparser "select * into LogTable from [LogFileName] where cs-uri-stem like ‘/folder/filename%’" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]

23)  Top 10 images by size sent.  Note that this example also shows how to query multiple log files at once.

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO TOP10ImagesBySize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN  (‘gif’;’jpg’;’png’)) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"

24)  Browser types (two different approaches)

logparser "SELECT distinct cs(User-Agent), count(*) as hits INTO useragentsalltypes.txt FROM logs\iis\ex*.log GROUP BY cs(user-agent) ORDER BY hits DESC"

logparser "SELECT TO_INT(MUL(100.0,PROPCOUNT(*))) AS Percent,  COUNT(*) AS Hits, cs(User-Agent) as Browser INTO  UseragentsHits.txt FROM  logs\iis\ex*.log  GROUP BY Browser ORDER BY HITS DESC"

25)  Unique visitors per day.  This requires two queries.  The first query selects from the IIS logs into a CSV file, and the second selects from that CSV file.

logparser "SELECT DISTINCT cs-username, date INTO tempUniqueVisitorsPerDay.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL Group By Date, cs-username"

logparser "SELECT date, count(cs-username) as UniqueVisitors into test.txt FROM tempUniqueVisitorsPerDay.csv GROUP BY date"

26)  Top 10 largest ASPX pages.

logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS  RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile,  Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO top10pagesbysize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN (‘aspx’)) AND  (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"

27)  Top 10 slowest ASPX pages

logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = ‘aspx’ GROUP BY cs-uri-stem ORDER BY MaxTime DESC"

28)  Top 10 slowest ASPX pages on a specific day

logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = ‘aspx’ AND TO_STRING(To_timestamp(date, time), ‘MMdd’)=’1003’  GROUP BY cs-uri-stem ORDER BY MaxTime DESC"

29)  Daily bandwidth

logparser "Select To_String(To_timestamp(date, time), ‘MM-dd’) As Day, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByDay.gif From logs\iis\ex*.log Group By Day"

30)  Bandwidth by hour

logparser "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, SUM(sc-bytes) AS TotalBytesSent INTO BytesSentPerHour.gif FROM logs\iis\ex*.log GROUP BY Hour ORDER BY Hour"

31)  Average page load time per user

logparser "Select Top 20 cs-username AS UserName, AVG(time-taken) AS AvgTime,  Count(*) AS Hits INTO AvgTimePerUser.txt FROM logs\iis\ex*.log WHERE cs-username IS NOT NULL GROUP BY cs-username ORDER BY AvgTime DESC"

32)  Average page load time for a specific user

logparser "Select cs-username AS UserName, AVG(time-taken) AS AvgTime,  Count(*) AS Hits INTO AvgTimeOnSpecificUser.txt FROM logs\iis\ex*.log WHERE cs-username = ‘CONTOSO\User1234’ GROUP BY cs-username"

33)  Error trends.  This query is quite long, so it is easier to express in a text file than on the command line.  Log Parser reads and executes the query contained in the specified text file.

logparser file:errortrend.sql

Where errortrend.sql contains the following:

SELECT
  TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
  SUM(c200) AS 200s,
  SUM(c206) AS 206s,
  SUM(c301) AS 301s,
  SUM(c302) AS 302s,
  SUM(c304) AS 304s,
  SUM(c400) AS 400s,
  SUM(c401) AS 401s,
  SUM(c403) AS 403s,
  SUM(c404) AS 404s,
  SUM(c500) AS 500s,
  SUM(c501) AS 501s,
  SUM(c502) AS 502s,
  SUM(c503) AS 503s,
  SUM(c504) AS 504s,
  SUM(c505) AS 505s
USING
  CASE sc-status WHEN 200 THEN 1 ELSE 0 END AS c200,
  CASE sc-status WHEN 206 THEN 1 ELSE 0 END AS c206,
  CASE sc-status WHEN 301 THEN 1 ELSE 0 END AS c301,
  CASE sc-status WHEN 302 THEN 1 ELSE 0 END AS c302,
  CASE sc-status WHEN 304 THEN 1 ELSE 0 END AS c304,
  CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
  CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
  CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
  CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
  CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
  CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
  CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
  CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
  CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
  CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO ErrorChart.gif
FROM
    logs\iis\ex*.log
GROUP BY
  Day
ORDER BY
  Day

34)  Win32 errors

logparser "SELECT sc-win32-status as ErrorNumber, WIN32_ERROR_DESCRIPTION(sc-win32-status) as ErrorDesc, Count(*) AS Total INTO Win32ErrorNumbers.txt FROM logs\iis\ex*.log WHERE sc-win32-status>0 GROUP BY ErrorNumber ORDER BY Total DESC"

35)  Substatus codes

logparser "SELECT sc-status, sc-substatus, Count(*) AS Total INTO 401subcodes.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY sc-status, sc-substatus ORDER BY sc-status, sc-substatus DESC"

36)  Substatus codes per day.  This is another example of executing a query contained in a text file.

logparser file:substatusperday.sql

Where substatusperday.sql contains the following:

SELECT
  TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
  SUM(c1) AS 4011,
  SUM(c2) AS 4012,
  SUM(c3) AS 4013,
  SUM(c4) AS 4014,
  SUM(c5) AS 4015,
  SUM(c7) AS 4017
USING
  CASE sc-substatus WHEN 1 THEN 1 ELSE 0 END AS c1,
  CASE sc-substatus WHEN 2 THEN 1 ELSE 0 END AS c2,
  CASE sc-substatus WHEN 3 THEN 1 ELSE 0 END AS c3,
  CASE sc-substatus WHEN 4 THEN 1 ELSE 0 END AS c4,
  CASE sc-substatus WHEN 5 THEN 1 ELSE 0 END AS c5,
  CASE sc-substatus WHEN 7 THEN 1 ELSE 0 END AS c7
INTO
  401subcodesperday.txt
FROM
  logs\iis\ex*.log
WHERE
  sc-status=401
GROUP BY
  Day
ORDER BY
  Day

37)  Substatus codes per page

logparser "SELECT TOP 20 cs-uri-stem, sc-status, sc-substatus, Count(*) AS Total INTO 401Pagedetails.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY cs-uri-stem, sc-status, sc-substatus ORDER BY Total"

38)  Bytes sent per page type (HTML output)

logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.htm FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"

39) 500 errors per ASPX and Domain User

logparser "SELECT cs-username, cs-uri-stem, count(*) as Times INTO 500PagesByUserAndPage.txt FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY  cs-username, cs-uri-stem ORDER BY Times DESC"

40)  Percent of 500 errors caused by each user

logparser "SELECT cs-username, count(*) as Times, propcount(*) as Percent INTO 500ErrorsByUser.csv FROM  logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username ORDER BY Times DESC"

41)  Determine what percentage of the total bytes sent are being caused by each page type

logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.txt FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"

42)  Top 20 pages with a specific HTTP return code

logparser "SELECT TOP 20 cs-uri-stem, sc-status, Count(*) AS Total INTO TOP20PagesWith401.txt FROM logs\iis\ex*.log WHERE TO_LOWERCASE(cs-uri-stem) LIKE ‘%.aspx’ and sc-status=401 GROUP BY cs-uri-stem, sc-status ORDER BY Total, cs-uri-stem, sc-status DESC"

43)  Check traffic from IP addresses

logparser "Select c-ip AS Client, Div(Sum(cs-bytes),1024) As IncomingBytes(K), Div(Sum(sc-bytes),1024) As OutgoingBytes(K), MAX(time-taken) as MaxTime, AVG(time-taken) as AvgTime, count(*) as hits INTO errorsperip.txt FROM logs\iis\ex*.log GROUP BY client ORDER BY Hits DESC"

44)  Check errors by IP address

logparser file:errorbyip.sql

Where errorbyip.sql contains the following:

Select
  c-ip AS Client,
  SUM(c400) AS 400s,
  sum(c401) AS 401s,
  SUM(c403) AS 403s,
  SUM(c404) AS 404s,
  SUM(c500) AS 500s,
  SUM(c501) AS 501s,
  SUM(c502) AS 502s,
  SUM(c503) AS 503s,
  SUM(c504) AS 504s,
  SUM(c505) AS 505s
USING
  CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
  CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
  CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
  CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
  CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
  CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
  CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
  CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
  CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
  CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO
  IPNumberFileName.txt
FROM
    logs\iis\ex*.log
WHERE
    c-ip='<IP address goes here>'
GROUP BY
    client

45)  Find broken links

logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.txt FROM logs\iis\ex*.log WHERE cs(Referer) IS NOT NULL AND sc-status=404 AND (sc-substatus IS NULL OR sc-substatus=0)"

46)  Top 10 pages with most hits

logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),’/’) AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent INTO Top10Urls.txt FROM logs\iis\ex*.log GROUP BY cs-uri-stem ORDER BY TotalHits DESC"

47)  Unique users per browser type (requires two queries)

logparser "SELECT DISTINCT cs-username, cs(user-agent) INTO UserAgentsUniqueUsers1.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL GROUP BY cs-username, cs(user-agent)"

logparser "SELECT cs(user-agent), count(cs-username) as UniqueUsersPerAgent, TO_INT(MUL(PROPCOUNT(*), 100)) AS Percentage INTO UniqueUsersPerAgent.txt FROM UserAgentsUniqueUsers1.csv GROUP BY  cs(user-agent) ORDER BY UniqueUsersPerAgent DESC"

48)  Bytes sent per file extension

logparser "SELECT EXTRACT_EXTENSION( cs-uri-stem ) AS Extension, MUL(PROPSUM(sc-bytes),100.0) AS PercentageOfBytes, Div(Sum(sc-bytes),1024) as AmountOfMbBytes INTO BytesPerExtension.txt FROM logs\iis\ex*.log GROUP BY Extension ORDER BY PercentageOfBytes DESC"

49)  Domains referring traffic to your site

logparser "SELECT EXTRACT_TOKEN(cs(Referer), 2, ‘/’) AS Domain, COUNT(*) AS [Requests] INTO ReferringDomains.txt FROM  logs\iis\ex*.log GROUP BY Domain ORDER BY Requests DESC"

50)  OS types (requires two queries)

logparser "SELECT DISTINCT c-ip, cs(user-agent) INTO UserAgentsUniqueUsers.csv FROM logs\iis\ex*.log WHERE c-ip <> NULL GROUP BY c-ip, cs(user-agent)"

logparser file:getos.sql

Where getos.sql contains the following:

SELECT
  SUM (c70) AS Win7,
  SUM (c60) AS Vista,
  SUM (c52) AS Win2003,
  SUM (c51) AS WinXP,
  SUM (C50) AS Win2000,
  SUM (W98) AS Win98,
  SUM (W95) AS Win95,
  SUM (W9x) AS Win9x,
  SUM (NT4) AS WinNT4,
  SUM (OSX) AS OS-X,
  SUM (Mac) AS Mac-,
  SUM (PPC) AS Mac-PPC,
  SUM (Lnx) AS Linux
USING
  CASE strcnt(cs(User-Agent),'Windows+NT+6.1') WHEN 1 THEN 1 ELSE 0 END AS C70,
  CASE strcnt(cs(User-Agent),'Windows+NT+6.0') WHEN 1 THEN 1 ELSE 0 END AS C60,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.2') WHEN 1 THEN 1 ELSE 0 END AS C52,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.1') WHEN 1 THEN 1 ELSE 0 END AS C51,
  CASE strcnt(cs(User-Agent),'Windows+NT+5.0') WHEN 1 THEN 1 ELSE 0 END AS C50,
  CASE strcnt(cs(User-Agent),'Win98') WHEN 1 THEN 1 ELSE 0 END AS W98,
  CASE strcnt(cs(User-Agent),'Win95') WHEN 1 THEN 1 ELSE 0 END AS W95,
  CASE strcnt(cs(User-Agent),'Win+9x+4.90') WHEN 1 THEN 1 ELSE 0 END AS W9x,
  CASE strcnt(cs(User-Agent),'Winnt4.0') WHEN 1 THEN 1 ELSE 0 END AS NT4,
  CASE strcnt(cs(User-Agent),'OS+X') WHEN 1 THEN 1 ELSE 0 END AS OSX,
  CASE strcnt(cs(User-Agent),'Mac') WHEN 1 THEN 1 ELSE 0 END AS Mac,
  CASE strcnt(cs(User-Agent),'PPC') WHEN 1 THEN 1 ELSE 0 END AS PPC,
  CASE strcnt(cs(User-Agent),'Linux') WHEN 1 THEN 1 ELSE 0 END AS Lnx
INTO
  GetOSUsed.txt
FROM
  UserAgentsUniqueUsers.csv

51)  Get timeout errors from the server Event Log.  Display results in a datagrid.

logparser "select * from \\servername\application where message like ‘%timeout expired%’" -i:EVT -o:datagrid

52)  Get exceptions from the server Event (Application) Log

logparser "select timegenerated, eventtypename, eventcategoryname, message into webserverlog.csv from \\servername\application where message like ‘%myapplication%exception%’" -i:EVT

Links

Check out the links below to find more in-depth discussion of Log Parser, as well as even more examples of its usage.

    145 Responses to Log Parser Rocks! More than 50 Examples!

    1. petarp says:

      Thnx men! The BEST list of useful examples for MS Logparser one can find on the Web. The power of logparser is unbelievable. I will only say that I’m also using Log Parser Lizard GUI from Lizard Labs and I will recommend it to every Logparser user, developer and system administrator.

      • mlichtenberg says:

        I gave Log Parser Lizard a test run. Found some things I liked and disliked. Good for cataloging queries. Good for analyzing query results. Not as helpful with query composition as I thought a GUI might be. OK with small result sets, but performed poorly with a large result set. Full details can be found at https://mlichtenberg.wordpress.com/2011/05/03/a-gui-for-log-parser/.

        • Reinhard says:

          I have developed a tool for log file analysis where you don’t need to use SQL statements, and I think it makes it easier to achieve the same results as in the example list of this blog. It’s commercial, however, though not very expensive. If you gave it a try and compared it with Microsoft’s Log Parser, that would be interesting feedback for me.

        • mlichtenberg says:

          I do not have time in my schedule right now to do an in-depth evaluation and comparison, but I will leave this here for anyone else who might read these comments.

          I did give your tool (Ascolog.com) a quick 10-minute try, thinking that in that time it would be easy to load a web server log file and run a few queries. Unfortunately, the tool did not seem to recognize the format of the file, even though it was a standard W3C extended log file produced by IIS. I assume that means I needed to define a layout before submitting any queries? That was an immediate roadblock for me, and so that was as far as I was able to get in evaluating the tool. (Sorry!)

          Two additional points that apply to me, but certainly not to everyone… 1) As I am quite comfortable working with databases, I do not find that having to use SQL statements is a drawback to Log Parser. In fact, I rather like it. 2) For the amount of time that I spend analyzing log output, the $150 price tag is not justified. If I needed a log analysis tool on a daily or weekly basis, then the cost of Ascolog would be as you say, not very expensive.

    2. Great stuff! Thank you!

    6. Praveen says:

      I am unable to run the script. How do you pass the path to the log file?

      • mlichtenberg says:

        Simply include the path to your log file or files in the FROM clause of the log parser query.

        For example, let’s say you have a folder named c:\logs that contains the log files ex120301.log and ex120302.log. To examine one of the log files, the logparser query would look something like "select * from c:\logs\ex120301.log". To examine both log files in a single query, use "select * from c:\logs\ex12*.log".

        If the path to your log files includes spaces, wrap the path in single quotes. For example "select * from 'c:\my logs\ex120301.log'".

        Hope that helps!

    8. rodvars says:

      Tnx for the post, very useful information.

    9. jd says:

      nice blog… I am new to logparser and would like to find out how I can separate date and time from the field which shows date/time together?
      e.g. Select Field2 from abc.log
      -> this will return "5/10/2012 6:26:19 PM"
      how can I display date and time in different fields?

      • mlichtenberg says:

        Look at the TO_DATE and TO_TIME functions. The format for both is FUNCTIONNAME(TIMESTAMP). You may have to convert the date/time values from your log file to timestamps before passing them to the TO_DATE/TO_TIME functions. Use the TIMESTAMP function to do this. The format of that function is TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss'), where the second argument specifies the format of the data in Field2. Putting it all together, your Log Parser select might look something like "select TO_DATE(TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss')), TO_TIME(TIMESTAMP(Field2, 'MM-dd-yyyy hh:mm:ss')) FROM abc.log".

        Hope that helps. If I haven’t already mentioned it, I highly recommend the book “Microsoft Log Parser Toolkit”. Available from Amazon.com and Barnes and Noble. You can also find the book at other online booksellers, if you prefer.

    11. Sinead says:

      thanks for this… it works brilliantly. any idea how to convert the cs-uri-stem to all one case before you import it into a SQL Server database? I have tried some combinations but the syntax fails. I wonder if it’s possible. thanks

      • mlichtenberg says:

        You should be able to use the TO_LOWERCASE or TO_UPPERCASE functions to convert cs-uri-stem to all lowercase or all uppercase. Here is a modified version of Example 22 that includes the use of the TO_UPPERCASE function:

        logparser "select distinct TO_UPPERCASE(cs-uri-stem) into LogTable from [LogFileName] where cs-uri-stem like '/folder/filename%'" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]

        Hope that helps.

    14. owen buttolph says:

      Thanks for this – really interesting. I’m having problems outputting to SQL. I am running Log Parser 2.2 and have tried outputting to SQL Server, but get this error message:

      Task aborted.
      Error connecting to ODBC Server
      SQL State: 08001
      Native Error: 17
      Error Message: [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server
      does not exist or access denied.

      Anyone else experience this?

      • mlichtenberg says:

        I’ve not encountered that with Log Parser, but the error message is a fairly standard SQL Server response. It looks like the servername, username, and/or password you supplied were incorrect. Double-check the values you supplied for the -server, -username, and -password Log Parser parameters, and make sure they are valid for your database server.

    15. Rodrigo says:

      I have found that the field cs-uri-stem can contain whitespace… therefore my logs don’t match the column fields on my SQL Server, using LogParser 2.2.

      for example i have found a request like this:

      /images/costa rica.jpg

      On my database I see:
      cs-uri-stem cs-query
      /images/costa rica.jpg

      Has anyone else had this problem?

      I am using W3C format to output some custom fields.

      Thanks!

    18. Very nice and useful post!
      Congrats!

    20. yaseen says:

      Hi, in my query I want the top 20 users of application usage, but I want to differentiate the time field where the auto year is 1/1/2000. I tried to use TIMESTAMP, which gives a syntax error. Please help, I need your expertise… Thanks a lot for the help.

    23. Fernando says:

      does logparser support subqueries? I’m trying to parse an XML file. I can get all the items, but I also need another field that is outside the items list and is the same for every item. So I need to add it as another column with a subquery. Is that possible? THANKS A LOT

      • mlichtenberg says:

        Hmmm, I don’t really know. My initial guess is that subqueries are not supported. I’ll see what I can find out… to help me reproduce your situation, can you give an example of the type of query you would like to use?

        • I need to calculate method execution durations. I have the method name, a query identifier, and start/stop events:
          30.09.2013 15:28:05 Start MethodName QueryGuid
          ***** Some logs ****
          30.09.2013 15:58:32 Stop MethodName QueryGuid

          Is there any way to parse it? Currently I’m using Excel for it, but it is rather slow.

        • mlichtenberg says:

          Does this get you what you need?

          SELECT MethodName, QueryGuid,
          SUB(
          MAX(TO_INT(TO_TIMESTAMP(Date, Time))),
          MIN(TO_INT(TO_TIMESTAMP(Date, Time)))
          ) AS SecondsElapsed
          FROM YourLogFile
          GROUP BY MethodName, QueryGuid

          For each "MethodName" and "QueryGuid", it calculates the difference (SUB) between the integer representations (TO_INT) of the start (MIN) and stop (MAX) timestamps. The result of the query is the number of seconds elapsed between each start and stop event.

          One thing that is probably not obvious is that integer representations of TimeStamp values are the number of seconds elapsed since January 1, year 0.

          Hope that makes sense.

        • /*mlichtenberg says: Does this get you what you need? */

          I could not reply to your comment directly.
          You helped me a lot! Big thanks to you! It is really what I need.

      • mlichtenberg says:

        I can confirm that subqueries such as the following are supported by Log Parser…

        logparser -i:EVT "select * from System where TimeGenerated in (select max(TimeGenerated) from Security where EventID = 1100)"

        However, I suspect that’s not the type of subquery that you were asking about. Does this help at all, and did you find a way to do what you needed?

    24. Robert says:

      SELECT TO_Date TIMESTAMP=(DateCreated, 'MM-dd-yyyy hh:mm:ss') DateCreated AS DateCreated, ID as EventID, Count (ID) as EventIDCount, LevelDisplayName as Severity, Server as Server,
      FROM c:\temp\Data-Test-JanFull-Test2.csv
      Where Severity LIKE '%Critical%' OR Severity LIKE '%Warning%'
      GROUP BY DateCreated,ID,Severity,Server
      Order By DateCreated Asc

      Is there any way that I can separate a field that has "Date and Time" in the same field? I only want to use the date from the field and not the time as well.

      Thanks

    26. Avishek says:

      Hi,
      I’m using this query to get an idea of how many requests are coming to IIS every two minutes. The query works like a charm, but the issue is that the aggregate functions are removing the null values, I suppose. I see rows missing, and when I go back to the log I find that during those 2 minutes there were no requests. Any idea how I can still see those rows with null values in the output? The query is below.

      SELECT QUANTIZE(time, 120) as Time, COUNT(*) as Hits
      FROM 'C:\u_ex140605.log'
      WHERE cs-uri-stem NOT LIKE '%verifyhttp%'
      GROUP BY Time
      ORDER BY Time ASC

      • mlichtenberg says:

        Before I investigate (because this could take a little time to figure out), let me make sure I understand you correctly. Are you saying that you want your query output to include time intervals for which there was nothing logged? Finding an example of how to do that in SQL is pretty easy… since the Log Parser query language is a subset of SQL, the trick is figuring out which of those examples will work with Log Parser.

        • Avishek says:

          Yes, that’s right. I want to capture the traffic including the time intervals when there was none, with a specific value or 0 maybe. Right now those rows are just omitted, and analyzing the results in Excel is extremely difficult. I tried SQL-like tricks, like having a CASE statement select something else, or using the ISNULL function. But nothing worked.

        • mlichtenberg says:

          The rows are omitted from the output because they don’t exist in the source files. Case statements and the ISNULL function evaluate the values in columns… but in this case it is the rows themselves that are missing, not the values of the columns within the rows. Having said that, there may still be a way to get this to work. Let me see what I can come up with, and I’ll report back with my findings.

        • mlichtenberg says:

          Here is a solution that I hope will work for you.

          Every SQL trick I found for inserting “missing” data into query output involved joining to a table that contains entries for each missing row. In short, those missing rows have to exist in your sources somewhere… you can’t just produce them from nothing.

          So, let’s say you want a query that will return one row for every 2-minute interval in a 24-hour period. Think of your log as a database table. In order to get the desired result, you need a second helper table that contains every possible 2-minute interval. By joining your log table to the helper table and using the appropriate aggregate functions, you can get the results you want.

          Here is the bad news… Log Parser does not support joins. A similar question to yours was asked on serverfault.com almost five years ago, and no one was able to suggest a solution (http://serverfault.com/questions/7776/logparser-and-quantize-add-0-for-empty-values).

          Here is the good news… I have come up with a way to simulate the join you need. Consider the following solution, which might work for you…

          1) Create a comma-separated file named MidStep.csv that contains every 2-minute interval in a 24-hour period. As an example, here is what the first six entries should look like:

          Time,Hits
          00:00:00,0
          00:02:00,0
          00:04:00,0
          00:06:00,0
          00:08:00,0
          00:10:00,0

          This is the extra “log” file to which you will “join” your log file. You might want to make a backup copy of this file so that it can be easily reused.

          2) Alter your query to be the following:

          SELECT QUANTIZE(time, 120) as Time, COUNT(*) AS Hits
          INTO MidStep.csv
          FROM 'c:\u_ex140605.log'
          WHERE cs-uri-stem NOT LIKE '%verifyhttp%'
          GROUP BY Time

          3) Run your logparser query with the -filemode:0 option, which will append the output of the query to the file (instead of overwriting it). The logparser command should look like this:

          logparser "[ your query goes here ]" -filemode:0

          4) Let’s assume your original query produced the following output:

          Time,Hits
          00:02:00,14
          00:10:00,20

          After running the logparser command, the end of the MidStep.csv file should now look like this:

          Time,Hits
          00:00:00,0
          [ lots of rows not shown ]
          23:56:00,0
          23:58:00,0
          00:02:00,14
          00:10:00,20

          The data from each “log” has now been “joined” into a single temporary file.

          5) Run the following logparser command to produce the final result set:

          logparser "select Time, max(Hits) from MidStep.csv group by Time order by Time"

          The first six rows of output will be:

          Time,Hits
          00:00:00,0
          00:02:00,14
          00:04:00,0
          00:06:00,0
          00:08:00,0
          00:10:00,20

          I hope all of that makes sense. Let me know if I need to clarify anything.

          Full disclosure: I was inspired by a solution in the Log Parser Toolkit, which is available at http://www.amazon.com/Microsoft-Log-Parser-Toolkit-undocumented/dp/1932266526

    27. Avishek says:

      mlichtenberg, what you are saying makes sense. I’m gonna check this and get back. And let me tell you, you are amazing, buddy!

    28. NewVillage says:

      Hello! very informative site for log parser.
      I’m trying to do the following. In a text file, each record looks like:

      dwgname, blockname, attval, coord, datetime, handle

      When a record is added, datetime is the current date and time.
      handle is always the same.

      This creates duplicates, based on handle. I need to keep just the latest record based on datetime+handle.

      Can logparser do this?

      • mlichtenberg says:

        Not sure I understand the problem. Can you provide a few example rows, including duplicates, so that I can better understand the problem you are trying to solve?

    29. NewVillage says:

      Thanks for responding. I appreciate it.

      Header:
      dwgname, blockname, attval, datetime, handle

      Values:
      dwgname1, blockname1, attval1, 20140701.2240431, handleid1
      dwgname1, blockname2, attval5, 20140703.16413269, handleid2
      dwgname1, blockname2, attval1, 20140611.10420890, handleid3
      dwgname1, blockname2, attval2, 20140703.16434659, handleid3

      Lines 3 and 4 represent the same record.
      I need to get rid of Line 3.
      Line 4 is appended by an external program.
      Line 4 has a newer datetime.
      I need lines 1, 2, and 4.
      In the end, the last field, i.e. handleid? is always unique in the file.

      I think the question may be: Is there a way to select based on the newest datetime and handleid?

      • mlichtenberg says:

        Sorry for the delay in getting back to you… I’ve just now had a chance to look at your reply. I’ll have to give this some thought.

        Just to make sure there’s no way to simplify this, I have another question. Is the value of the attval column in any way associated with the row sequence? In other words, in the case of rows like 3 and 4 in your example, will attval2 always be greater than attval1?

    30. NewVillage says:

      No problem. I appreciate your help.

      attval can be any value, including text.

      The key fields as I mentioned are handleid and datetime. For a given handleid, there may be 1 or more datetime. e.g.

      dwgname1, blockname2, attval_randomtext, 20140611.10420890, handleid3
      dwgname1, blockname2, attval_anotherrandomtext, 20140703.16434659, handleid3

      In this case, the record with the latest datetime, 20140703.16434659, is the one to keep.

      I hope that helps.

      • mlichtenberg says:

        I nearly have a solution, but the datetime format in your file is causing me problems. Log Parser is recognizing those datetime values as Real values instead of DateTime, and it automatically rounds Real values to six significant digits. That rounding, which appears to be unchangeable, is causing the loss of what I am sure are significant digits in the DateTime values.

        There is a switch (-iTsFormat) which is supposed to allow one to supply a format string for Log Parser to use to identify date values in a file. For example,

        logparser "select datecolumn from file" -iTsFormat:"yyyyMMdd.hhmmssll"

        Unfortunately, I’m not having much luck getting that to work… logparser still sees the columns as Real values and still rounds them off.

        Let me look at this a little longer. If I have no success I’ll at least post what I have… maybe you’ll be able to do something with it.

        A final question for you: do your files contain a header row, and/or can you easily add one?

    31. NewVillage says:

      Maybe there’s a way to join handleid3+datetime during the query and find the latest/last one….

      • mlichtenberg says:

        FWIW, the STRCAT function is used to concatenate two columns. For example,

        logparser "select STRCAT(column1, column2) from file"

        I didn’t see how that would help us out, though.

    32. NewVillage says:

      Yes, mlichtenberg… A header row can be easily added.

      • mlichtenberg says:

        Following is the best that I could come up with. For the reasons I mentioned in my earlier comment, this method does lose some of the precision from the datetime values (the milliseconds, I believe). If the datetime values were formatted differently, I think it would work perfectly. Hopefully it gives you something to work with.

        Let’s say you have a file named SAMPLE.CSV, with the following contents:

        dwgname, blockname, attval, datetime, handleid
        dwgname1, blockname1, attval1, 20140701.2240431, handleid1
        dwgname1, blockname2, attval5, 20140703.16413269, handleid2
        dwgname1, blockname2, attval1, 20140611.10420890, handleid3
        dwgname1, blockname2, attval2, 20140703.16434659, handleid3

        Use this logparser query to select the contents of the file into a new file named SAMPLEEVAL.CSV.

        logparser "select dwgname, blockname, attval, datetime, handleid into sampleeval.csv from sample.csv"

        Here is what SAMPLEEVAL.CSV looks like:

        dwgname,blockname,attval,datetime,handleid
        dwgname1,blockname1,attval1,20140701.224043,handleid1
        dwgname1,blockname2,attval5,20140703.164133,handleid2
        dwgname1,blockname2,attval1,20140611.104209,handleid3
        dwgname1,blockname2,attval2,20140703.164347,handleid3

        You can see the problem with datetime truncation. Also note that time values like 16595999 will be rounded to 165960 instead of 170000 (Log Parser thinks those are Real values, not DateTime values).

        For the next step, run the following query to append to the file the key values of the rows you want in the final result set.

        logparser "select '' as dwgname, '' as blockname, '' as attval, max(datetime) as datetime, handleid into sampleeval.csv from sampleeval.csv group by handleid" -filemode:0

        Here are the updated contents of SAMPLEEVAL.CSV:

        dwgname,blockname,attval,datetime,handleid
        dwgname1,blockname1,attval1,20140701.224043,handleid1
        dwgname1,blockname2,attval5,20140703.164133,handleid2
        dwgname1,blockname2,attval1,20140611.104209,handleid3
        dwgname1,blockname2,attval2,20140703.164347,handleid3
        ,,,20140701.224043,handleid1
        ,,,20140703.164133,handleid2
        ,,,20140703.164347,handleid3

        In a sense, you now have two “tables” in a single file: the original table of log entries, and a second table containing the key values of the rows to be selected.

        The final Log Parser query will “join” the two tables via the use of a GROUP BY… HAVING clause. The key values of each desired row appear in the file more than once, so a query with a GROUP BY HAVING COUNT(*) > 1 clause will select the correct rows. The query is:

        logparser "select max(dwgname) as dwgname, max(blockname) as blockname, max(attval) as attval, datetime as datetime, handleid from sampleeval.csv group by datetime, handleid having count(*) > 1"

        And here are the results…

        dwgname blockname attval datetime handleid
        -------- ---------- ------- --------------- ---------
        dwgname1 blockname1 attval1 20140701.224043 handleid1
        dwgname1 blockname2 attval5 20140703.164133 handleid2
        dwgname1 blockname2 attval2 20140703.164347 handleid3

        Not perfect, but maybe good enough? If not, hopefully you at least learned a new Log Parser trick or two.

    33. NewVillage says:

      Thank you. I appreciate your time. I’ll try it out.

    35. Steve says:

      Hi, I want to get a list of unique user names per month from an IIS log. Can someone please help me with the command, using Log Parser?

      • mlichtenberg says:

        I assume your web site is NOT using Anonymous access, because if it is, no usernames will be logged. This query is pretty simple. Something like

        logparser "select cs-username from [LogFilePath] group by cs-username"

        should get you close to what you need. Note that you can use wildcard characters in the filenames, so if your IIS log files have names like u_ex20140101.log, then you can use

        logparser "select * from u_ex201401*.log"

        to query all of the logs for January. See example #25 in the main post for another example that uses cs-username.
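
        If you need the results broken out by month in a single result set, something like this (an untested sketch) should also work:

        logparser "select TO_STRING(TO_TIMESTAMP(date, time), 'yyyy-MM') as Month, cs-username, count(*) as Hits from u_ex*.log group by Month, cs-username order by Month"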

    36. ganaysa says:

      wow! great examples. Can I use Logparser to access files on the server & know which files have been modified recently?

    41. sudhir says:

      how can I display two graphs in one? let us say I have hits for brand one and brand two, and both need to be displayed on the same graph, where the x axis is time and the y axis is hits to brand 1 and brand 2

      • mlichtenberg says:

        To be honest, I do not think that I have used the charting/graphing functionality of LogParser for a real-world problem. Maybe I did once, but it would have been a few years ago.

        Just the same, I looked around a bit and was not able to find a way to have LogParser directly produce a chart that includes multiple series. I suspect that it is not possible. In addition, the charting/graphing functionality of LogParser depends on Office components that were removed from the Office products after version 2003! Those components (Office Web Components) were available as an add-on to Office 2007, but I do not know if they remain available today.

        For those reasons, I suggest using LogParser to query your data source and produce one CSV file for each data series that you wish to chart. Then open those files in Excel (or something equivalent) and use the charting functionality there to produce your graph.

        Probably not the answer you hoped for, but maybe still is something you can use?

      • NewVillage says:

        From the log parser help…

        LogParser "SELECT TO_UPPERCASE(EXTRACT_EXTENSION(cs-uri-stem)) AS PageType, MIN(sc-bytes) AS Minimum, AVG(sc-bytes) AS Average, MAX(sc-bytes) AS Maximum INTO BytesChart.gif FROM [LogFileName] GROUP BY PageType ORDER BY Average ASC" -o:CHART -chartType:Column3D

    42. NewVillage says:

      Ignore my last post. I didn’t read closely enough to see that your query has multiple inputs.

    43. Jay says:

      How to do the same thing with JAVA programming??

      • mlichtenberg says:

        Sorry, I do not understand the question. Log Parser is a command line tool; no programming required. What is the “same thing” that you want to do with Java?

      • NewVillage says:

        Jay… Would using the “Log Parser COM API” help you?

        • NewVillage says:

          The Log Parser scriptable COM components are implemented as Automation objects, which means that they can be used from any programming environment supporting automation, including C++, C#, Visual Basic, JScript and VBScript.

    44. Paulo Junior says:

      Hey Mike, good morning !

      First, tks for the tips !

      I need some help, if you could. I’m trying to extract data from a print server log, but I have to consider just the lines that have 2 records with the same date. An example:

      TimeGenerated Strings
      07/07/2015 10:18:40 109|46282|1|2|600|600|600|1|3
      07/07/2015 10:18:40 109|asd|321
      07/07/2015 10:18:56 110|167342|0|0|600|600|600|1|2
      07/07/2015 10:18:56 110|355|3213
      07/07/2015 10:18:59 80|1766408|0|1|0|0|0|1|0
      07/07/2015 10:19:10 111|272284|0|0|600|0|600|1|0
      07/07/2015 10:19:10 111|aaa|271996|1

      In the query I’d like to remove the line with the 10:18:59 time.

      I’m using the following query:

      SELECT timegenerated, strings
      FROM '[LOGFILEPATH]'
      where (eventid=307 or eventid=805)

      The first line is from eventid 307 and the second from 805, but in some cases there’s just one, so in that case I’ll discard the line.

      Any idea ?

      Thank you very much !

      • mlichtenberg says:

        Sorry for the delayed response. I think I have a solution for you. Give this query a try:

        SELECT timegenerated, strings
        FROM '[LOGFILEPATH]'
        WHERE (eventid=307 or eventid=805)
        AND timegenerated IN
        (SELECT timegenerated
        FROM '[LOGFILEPATH]'
        WHERE (eventid=307 or eventid=805)
        GROUP BY timegenerated HAVING COUNT(*) = 2)

        • Paulo Junior says:

          You don’t think, you HAVE!

          No words to thanks, you’re the man ! It worked !

          One more doubt: is it possible to put the results side by side? Aggregate the lines.

          ex:

          07/07/2015 10:18:40 109|46282|1|2|600|600|600|1|3
          07/07/2015 10:18:40 109|asd|321
          07/07/2015 10:18:56 110|167342|0|0|600|600|600|1|2
          07/07/2015 10:18:56 110|355|3213

          Result on this

          07/07/2015 10:18:40 109|46282|1|2|600|600|600|1|3|109|asd|321
          07/07/2015 10:18:56 110|167342|0|0|600|600|600|1|2|110|355|3213

          Once more, thank you very much !

        • mlichtenberg says:

          Try the following:

          SELECT timegenerated,
          MIN(CASE eventid WHEN 307 THEN strings END) as 307,
          MIN(CASE eventid WHEN 805 THEN strings END) as 805
          FROM '[LOGFILEPATH]'
          WHERE (eventid=307 or eventid=805)
          AND timegenerated IN
          (SELECT timegenerated
          FROM '[LOGFILEPATH]'
          WHERE (eventid=307 or eventid=805)
          GROUP BY timegenerated HAVING COUNT(*) = 2)
          GROUP BY timegenerated

          I may not have that exactly right, since I don’t have a file to test with. I think it will get you what you need, though. If you truly need the two strings concatenated together, there is a STRCAT function that can do that for you, i.e. STRCAT(string1, string2).

        • Paulo Junior says:

          Once more, no words to thank you ! Worked like a charm !

        • prasanna says:

          I need some help.
          I am a beginner with SQL and also PowerShell.
          I have tried this link https://gallery.technet.microsoft.com/scriptcenter/PowerShell-Print-Logger-09a6f4e0 but it also did not send the data to SQL.

          If you could help: I’m trying to extract data from a print server log.

          Document 1455, Microsoft Word – WORD owned by tmani on \\10.1.3.18 was printed on A4 through
          Document 1454, Microsoft Word – WORD owned by Administrator on CSG-MAINT-PC was printed on
          Document 1451, Full page photo owned by Administrator on CSG-MAINT-PC was printed on A4
          Document 1450, Full page photo owned by Administrator on CSG-MAINT-PC was printed on A4-4515X
          Document 1449, PDF.pdf owned by Administrator on CSG-MAINT-PC was printed on A4 through port
          Document 1448, Microsoft Word – WORD owned by Administrator on CSG-MAINT-PC was printed on

          I want to move event IDs 307 and 805 to SQL.

          Please give a script.
          Any idea?
          Thank you very much!

        • mlichtenberg says:

          If I understand the question correctly, you want to extract data from a Windows Event Log and insert it into a SQL Server table. Is that correct? I have not used PowerShell myself, so I cannot help you to evaluate the script that you linked to.

          With Log Parser, you can try the following:

          logparser "select * into [Database Table] from [Event Log Path] where EventId=307 OR EventID=805" -i:EVT -o:SQL -server:[Server Name] -database:[Database Name] -username:[Database username] -password:[Database password] -createTable:ON

          -i:EVT tells Log Parser that the input is a Windows Event Log
          -o:SQL tells Log Parser to send the output to a SQL Server table

          The only thing I am not sure about is what value to replace [Event Log Path] with. For the standard Windows logs, you would use “Application”, “System”, “Setup”, or “Security”. I am not sure what value to use for the Print logs.

          Does that help at all?

        • prasanna says:

          Hi mlichtenberg,

          Thanks for your reply,

          I have tried that command, but an error occurred:

          logparser "select * into printtest from %SystemRoot%\system32\winevt\Logs\Application.evtx where EventId=307 OR EventID=805" -i:EVTX -o:SQL -server:10.x.x.10 -database:print -username:administrator -password:microsoft -createTable:ON

          (Error: detected extra argument "*" after query)

          Any other changes for that command?

        • mlichtenberg says:

          It looks like you may have cut-and-pasted the log parser command directly from my blog. When I did that the quotes were copied as Unicode characters, rather than as characters from the default character set used by the command line. I copied-and-pasted and got the same error you did… re-typing the quotes on the command line fixed the problem.

          Besides that, there are a couple other problems. First -i:EVTX should be -i:EVT.

          Next, in my tests, Log Parser was not able to read directly from the %SystemRoot%\system32\winevt\Logs folder. I had to use the Event Viewer to save the log entries to a new *.evtx file in a temporary location before Log Parser was able to access them. I suspect a permissions issue, since %SystemRoot%\system32\winevt\Logs is a system folder (the error persisted even if I ran the command as Administrator).

          Finally, if you are going to be querying one of the primary Windows logs (Application, System, Setup, Security), you can simply specify the name of the log in the Log Parser command. No need to use the full path to the EVTX file. For example:

          logparser "select * from Application" -i:EVT

    45. sdecorme says:

      Great, but so many examples 🙂
      I would like to get all the registry keys containing the string “HID\VID”.
      I’ve tried
      string query = string.Format(@”SELECT Path, ValueName from \HKCR, \HKCU, \HKLM, \HKCC, \HKU like ‘%HID\VID%'”);
      rs = qry.Execute(query, registryFormat);
      without success. Could you help, please?
      Thank you

      • mlichtenberg says:

        Try this as your query: SELECT Path, ValueName FROM HKCR, HKCU, HKLM, HKCC, HKU WHERE Path LIKE ‘%HID\\VID%’

        That should work from the command line. Personally, I have not used Log Parser from within a C# application, so I cannot guarantee that it will work for you exactly as I have written it. In particular, the number of backslashes might need to be different within a C# application.
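
        In full, the command might look something like this (a sketch; -i:REG selects Log Parser’s registry input format):

        logparser -i:REG “SELECT Path, ValueName FROM HKCR, HKCU, HKLM, HKCC, HKU WHERE Path LIKE ‘%HID\\VID%’”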

    46. Pingback: ELMORE IT's Blog » IIS LogParser scripts

    47. Pingback: Confluence: Credant

    48. Pingback: Confluence: Credant

    49. LK says:

      Great page and very useful. I’m trying to use number 50, OS types (requires two queries); however, I’m getting an error when running the SQL file, shown below. The first query, which outputs the source CSV, works fine. Any help would be great.

      C:\Program Files (x86)\Log Parser 2.2>logparser file:c:\temp\getos.sql
      WARNING: Output format not specified – using NAT output format.
      WARNING: Input format not specified – using CSV input format.
      Error: SELECT clause: Syntax Error: unknown field ”Windows+NT+6.1”

      • mlichtenberg says:

        Make sure the CASE statements include single quotes around the user agent strings. In other words, use this:

        CASE strcnt(cs(User-Agent),’Windows+NT+6.1′) WHEN 1 THEN 1 ELSE 0 END AS C70,

        and not this:

        CASE strcnt(cs(User-Agent),Windows+NT+6.1) WHEN 1 THEN 1 ELSE 0 END AS C70,

        Also, be sure that the quotes are simple ASCII characters (decimal code 39), and not a character from an extended or Unicode character set. I believe my blogging client replaced the simple quote characters with Unicode representations, so that may be the problem if you have cut-and-pasted directly from my original post.

        This isn’t directly relevant to your question, but it looks like the GetOS.SQL query could be improved by adding some additional cases to catch Windows 8.x, Windows 10, and mobile operating systems.
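
        For example, Windows 8.1 identifies itself in the user agent as “Windows+NT+6.3” and Windows 10 as “Windows+NT+10.0”, so cases like the following could be added (the column aliases here are just placeholders):

        CASE strcnt(cs(User-Agent),’Windows+NT+6.3′) WHEN 1 THEN 1 ELSE 0 END AS C81,
        CASE strcnt(cs(User-Agent),’Windows+NT+10.0′) WHEN 1 THEN 1 ELSE 0 END AS C100,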

        • LK says:

          That’s sorted it. Thanks a load.

        • LK says:

          Hello again. I’ve added some of the newer operating systems and it’s working a treat. When I finish adding them, I’ll share the SQL script with the community.

    50. Pingback: Event Log Parser | mesotheliomaattorneyillinois.xyz

    51. Henk Beersma says:

      Hi, I am looking for help with the following. I am parsing a file with five fields called EventNumber, EventName, EventType, Timestamp, and UserData. The output is a CSV file. The problem is that the Timestamp field has no milliseconds, whereas I need them. I cannot get it to work with either TO_TIMESTAMP() or TIMESTAMP(). Can you help?

      • mlichtenberg says:

        I am not sure I understand your scenario. Can you give examples of 1) the data in the file you are parsing (just two or three lines would be fine), and 2) the output you are trying to produce? A bonus would be examples of a couple of queries you have already tried.

        • Henk Beersma says:

          Hi Mike,

          Thanks for helping me out.
          My problem in a nutshell: I have an ETL file that has its timestamps in 7-decimal resolution.
          The ETL file can be viewed with a viewer called ETWViewer.exe. See Example 1.
          The file can also be exported from this viewer as a CSV file. See Example 2.
          After export it still contains the 7-decimal timestamp.
          I am using this data to analyze problems. To do so, I import it into my own Excel sheet and combine it with other data by running some macros from the sheet.
          This involves first opening the ETL file in ETWViewer.exe, then exporting it as a CSV file, and finally importing the CSV file into my own Excel sheet. This is rather cumbersome, so I looked for a way to import the file directly into my own Excel sheet.
          LogParser does the trick by using:
          LogParser “SELECT EventNumber, EventName, EventTypeName, TimeStamp, UserData INTO c:\TestLOG1.csv FROM C:\TestLOG.etl”
          Unfortunately this strips off the decimals. See Example 3
          So I tried this:
          LogParser “SELECT EventNumber, EventName, EventTypeName, TO_TIMESTAMP(‘Timestamp’, ‘yyyy-mm-dd hh:mm:ss’), UserData INTO c:\TestLOG1.csv FROM C:\TestLOG.etl”
          I then get this error:
          WARNING: Input format not specified – using ETW input format.
          Error: SELECT clause: Syntax Error: unknown field ”Timestamp”.

          I also tried this, with Timestamp unquoted:
          LogParser “SELECT EventNumber, EventName, EventTypeName, TO_TIMESTAMP(Timestamp, ‘yyyy-mm-dd hh:mm:ss’), UserData INTO c:\TestLOG1.csv FROM C:\TestLOG.etl”
          Then I get:
          Error: SELECT clause: Semantic Error: arguments of function TO_TIMESTAMP must be both TIMESTAMP or STRING

          So I am at a loss at the moment. Any help would be greatly appreciated. A resolution of 7 decimals would be nice, but 3 would already be great.

          Example 1
          FSM Event Connecting to PIM using USB 2016/03/17, 22:49:35.3850352
          FSM Error Starting Attempt Failed for PIM : 2016/03/17, 22:49:35.3850320
          FSM Event Connecting to PIM using USB 2016/03/17, 22:49:33.3849648
          FSM Error Starting Attempt Failed for PIM : 2016/03/17, 22:49:33.3849632
          FSM Event Connecting to PIM using USB 2016/03/17, 22:49:31.3850768

          Example 2

          FSM Event Connecting to PIM using USB 3/17/2016 22:49:35.3850352
          FSM Error PIM_WD_ERROR_DISCONNECTION 3/17/2016 22:49:35.3850320
          FSM Event Connecting to PIM using USB 3/17/2016 22:49:33.3849648
          FSM Error PIM_WD_ERROR_DISCONNECTION 3/17/2016 22:49:33.3849632
          FSM Event Connecting to PIM using USB 3/17/2016 22:49:31.3850768

          Example 3
          FSM Event 2016/03/18 22:49:34.995 Connecting to PIM using USB
          FSM Error 2016/03/17 22:49:33.000 PIM_WD_ERROR_DISCONNECTION
          FSM Event 2016/03/17 22:49:33.000 Connecting to PIM using USB
          FSM Error 2016/03/17 22:49:31.000 PIM_WD_ERROR_DISCONNECTION
          FSM Event 2016/03/17 22:49:31.000 Connecting to PIM using USB

        • mlichtenberg says:

          In your last query…

          SELECT EventNumber, EventName, EventTypeName, TO_TIMESTAMP(Timestamp, ‘yyyy-mm-dd hh:mm:ss’), UserData INTO c:\TestLOG1.csv FROM C:\TestLOG.etl

          …try replacing TO_TIMESTAMP with TO_STRING, and changing the date format to ‘yyyy-mm-dd hh:mm:ss.nnn’. For example…

          SELECT EventNumber, EventName, EventTypeName, TO_STRING(Timestamp, ‘yyyy-mm-dd hh:mm:ss.nnn’), UserData INTO c:\TestLOG1.csv FROM C:\TestLOG.etl

          Does that work and get you what you need? (I apologize that I have not had time to try it myself.)

          If the date format still isn’t quite right, maybe the Log Parser timestamp format information at http://words.strivinglife.com/post/Microsoft-Log-Parser-timestamp-formats/ will help.

        • Henk Beersma says:

          Hi Mike,
          Unfortunately this does not work. This is the result.
          LogParser “SELECT EventNumber, EventName,EventTypeName, TO_STRING(Timestamp, ‘yyyy-mm-dd hh:mm:ss.nnn’) INTO c:\GPCLOG1.csv FROM C:\GPCLOG.etl”
          Error: Syntax Error: : cannot find closing parenthesis for function TO_STRING

          I think the problem may be caused by the fact that it considers Timestamp as a function rather than a field. Would you agree?

        • mlichtenberg says:

          I was able to reproduce the error by cutting and pasting the SELECT statement from your comment. I assume you cut-and-pasted as well? WordPress encodes the quote characters, which causes problems when cutting-and-pasting to the Windows command line. Type the SELECT statement instead of cutting-and-pasting, and you should be ok.

          Having said that, if you DO run into a problem with a field name matching a Log Parser keyword, surround the field name with square brackets (i.e. [timestamp]).

          Also, I was mistaken about using “nnn” in the format string; instead, just use “nn”. So, what I should have suggested is this:

          SELECT EventNumber, EventName, EventTypeName, TO_STRING([Timestamp], ‘yyyy-mm-dd hh:mm:ss.nn’) INTO c:\GPCLOG1.csv FROM C:\GPCLOG.etl

          Does that help?

    52. Henk Beersma says:

      Mike, sorry, please ignore the first line of Example 3 (which does show decimals). I changed that manually.

    53. Henk Beersma says:

      Hi Mike, thanks so much. It works! Manually typing the command instead of copying and pasting did the trick. Thanks again.

    54. Amjath says:

      Hi. Thanks. But I’m not sure why I am getting errors for almost all of the commands you put together here. Am I doing something wrong? Also, is it possible to pull all the URLs/pages hit by Googlebot? Thanks again!

      • mlichtenberg says:

        I would have to say yes, you must be doing something wrong. Can you provide an example of a simple query that is failing for you, including the exact error message?

        For GoogleBot your best bet is to query by IP address (see examples 1 and 2). I am not sure if Google publishes a list of IP addresses used by the GoogleBot. You might need to first query for all IP addresses associated with a user-agent that contains “googlebot”, and then use that list to filter your results to just hits from those IP addresses. Note that this may not be ideal, as anyone can spoof the user agent and identify themselves as Google. Using a known, verified set of IP addresses is the best approach.
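
        For example, something like the following (untested) should list the IP addresses that claim a GoogleBot user agent:

        logparser “SELECT DISTINCT c-ip FROM [LogFileName] WHERE TO_LOWERCASE(cs(User-Agent)) LIKE ‘%googlebot%’”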

    55. Hi peeps,

      I am brand new to Log Parser, and I have been tasked with splitting data contained in a log field.

      Here is an example of what the string looks like:

      ip=00.000.00.00 date=2016_06_20 time=06:00:00 device_id=ABC123456789 log_id=0000000013

      I need to parse the log file so that I have the IP address, date, time, device_id, and log_id each in a separate field.

      I have close to a million strings, and I need help with a query to separate these fields. Can anyone assist me with this, please?

      • mlichtenberg says:

        I will try to take a look at this in the next 24 hours. In your file are the fields separated by spaces or tabs?

        • mlichtenberg says:

          Here you go…

          Assumptions:
          1) Fields in the original log file are separated by spaces
          2) There is no header in the original log file
          3) Desired output is another space-separated log file
          4) The original log file is named MyLog.log

          Solution:

          Step 1)
          logparser -i:TSV -iSeparator:space -headerRow:OFF
          “select * into ‘MyLogTemp.log’ from ‘MyLog.log'”
          -o:TSV -oSeparator:space -headers:ON

          Step 2)
          logparser -i:TSV -iSeparator:space -headerRow:ON
          “select REPLACE_STR(Field1, ‘ip=’, ”) AS ip,
          REPLACE_STR(Field2, ‘date=’, ”) AS date,
          REPLACE_STR(Field3, ‘time=’, ”) AS time,
          REPLACE_STR(Field4, ‘device_id=’, ”) AS device_id,
          REPLACE_STR(Field5, ‘log_id=’, ”) AS log_id
          into ‘MyLogTransformed.log’
          from ‘MyLogTemp.log'”
          -o:TSV -oSeparator:space -headers:ON

          Details: see Log Parser’s built-in help.
          1) logparser -h
          2) logparser -h -i:TSV
          3) logparser -h -o:TSV

          Hope this helps.

    56. Pingback: Log Parser – Transforming Plain Text Files | LichtenBytes

    57. ms says:

      Great site!

    58. Joe Yusko says:

      This is a great site! Sadly, I can’t seem to figure out how to do what I need.

      I have some IIS sites that answer requests to multiple subdomains, for example:

      a.zzz.com
      bda.zzz.com
      bnoaeecz.zzz.com
      etc…

      I’m trying to figure out how to parse the log so that I can get a count of the number of times calls were made to each subdomain. Is this ridiculously hard to do?

      Thanks,
      joe

      • mlichtenberg says:

        Without seeing an example of your log files, this is my guess. I believe that you need to be logging the host header name. I do not think this field is logged by default, so in the IIS manager you may need to turn on logging of the “Host (cs-host)” field. Once you have done that and have verified that the data is being collected, a simple Log Parser query should get you what you need.
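
        For example, something along these lines (untested, and assuming the cs-host field is present in your logs) should produce a per-subdomain count:

        logparser “SELECT cs-host, COUNT(*) AS Hits FROM [LogFileName] GROUP BY cs-host ORDER BY Hits DESC” -i:IISW3C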

        Does that help?

    59. sheikvara says:

      Reblogged this on Cloud & Virtualization and commented:
      Awesome article

    60. Pingback: Log Parser – Querying IIS Logs – Coding for a Living

    61. Pingback: Free log parser | Linux, Windows, Heklanje, Kuhinja

    62. AJ says:

      I set up Log Parser 2.1 on my 2012 R2 server, which is networked to a couple of nodes running Windows 10, and I am not able to collect the logs or report from each node. The error returned is “451 object not a collection”.

      Can someone please assist?

      • mlichtenberg says:

        Need a clarification… is it correct that you are trying to query the Windows Event Logs on the Windows 10 machines? And you are trying to do that from the Windows 2012 R2 server?

        • AJ says:

          I am trying to run the tool from the Windows 2012 R2 server and pull/query the Windows Event Logs from the Windows 10 machines.

        • mlichtenberg says:

          Hmmm… querying the Event Logs on a Windows 10 machine is no problem, and neither is querying a machine remotely. My guess is a syntax error in the query or a permissions issue, but I have not encountered this error before. Do you have a domain network or a workgroup network? Can you post the query you are attempting to execute?

    63. Shaan says:

      hi
      logparser -i:iisw3c “select count(distinct cs-username) from *.log where cs-username like ‘sample%’ ”
      I get a result like this:
      Users:
      20

      Static:
      XXX

      How should I access the number of users?
      Basically, I want to store 20 in some variable.

      • mlichtenberg says:

        I’m not sure exactly what your use case looks like here, so I may be misunderstanding… but here are two options. 1) Write the output of the log parser command to a file, and then read the value from the file. 2) Log Parser ships with scriptable COM components that might be useful. They are quite old, and I’ve not used them myself, so I cannot guarantee that they will work. Might be worth a look, though.
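
        For option 1, a rough sketch in a Windows batch file might look like this (the file name users.csv and the alias UserCount are just placeholders):

        logparser -i:IISW3C -o:CSV “select count(distinct cs-username) as UserCount into users.csv from *.log where cs-username like ‘sample%’”
        for /f “skip=1 delims=” %%u in (users.csv) do set USERCOUNT=%%u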

    64. Shaan says:

      Thanks bro!!! I have used the first option and it’s working fine. A simple solution, but a smart one.

    65. Pingback: Tip: Microsoft LogParser [Studio] superfast SQL-like querying of any log file | HAVIT Knowledge Base

    66. Pingback: Tip: Microsoft LogParser [Studio] – rychlé SQL dotazování do textových log-souborů | HAVIT Knowledge Base

    67. Anil says:

      Hi… I need help with Log Parser code. If anyone is an expert on that, please email me at anilkumar.btechece@gmail.com

    68. Nicki Blaese says:

      Very good information. Lucky me, I found your site by chance (StumbleUpon). I’ve saved it as a favorite for later!

    69. J. says:

      Hi there. I could greatly use your assistance… I have many log files on a Titan FTP Server, and I’d like to weed out all the usernames from these files, along with their timestamps. A couple of examples of login entries within these log files are:
      2019-06-11 00:00:09 [2/4212/d2c] User “WN007” authenticated successfully.
      2019-06-10 23:00:44 [2/5508/11d0] User “XXppy2N” authentication failed.

      I’d like to create an output CSV file with Date, Time, User “loginID”, and Authenticated Successfully or Authentication Failed.
      Will I need to create a batch file on the FTP server, or export the logs to my local machine and work with them there using a batch file or PowerShell? A step-by-step process would be a great help.

      Any help is greatly appreciated. Thx.

      • mlichtenberg says:

        Very sorry for the delayed response.

        If your log file is a tab-separated text file that has headers, then there are two simple Log Parser queries that will return the successful and failed authentication attempts.

        Assuming your log file is named ftplog.csv and looks something like this:

        Date Time Code Message
        2019-06-11 00:00:09 [2/4212/d2c] User “WN007” authenticated successfully.
        2019-06-10 23:00:44 [2/5508/11d0] User “XXppy2N” authentication failed.

        Then here is a query to return all successful authentications:

        logparser -i:TSV “select Date, Time, REPLACE_STR(REPLACE_STR(Message, ‘User \”‘, ”), ‘\” authenticated successfully.’, ”) AS Message into ‘output.csv’ from ‘ftplog.csv’ where Message like ‘%successfully%’”

        For this example file, the output is:

        Date Time Message
        ———- ——– ——-
        2019-06-11 00:00:09 WN007

        The corresponding query for failed authentication attempts is this:

        logparser -i:TSV “select Date, Time, REPLACE_STR(REPLACE_STR(Message, ‘User \”‘, ”), ‘\” authentication failed.’, ”) AS Message into ‘output.csv’ from ‘ftplog.csv’ where Message like ‘%failed%’”

        As far as where to run the Log Parser queries, that depends on a number of factors. Can you access the FTP server from your local machine? If so, you can use Log Parser to query the files in place. If not, are you comfortable installing Log Parser and running log analysis on your live FTP server? If so, you can run the Log Parser analysis on the FTP server. Otherwise, exporting the logs to your local machine is probably the way to go.

        Hope this helps!

    70. Pingback: Application logs are your friend (and how to read them) - The Seeley Coder

    71. Pingback: 各种日志分析方式汇总 – 沧海月明珠有泪

    72. Pingback: Confluence: Campaigner Operations

    73. Jithender Jithu says:

      Hi all, I need a Log Parser query to count requests that took between 0 and 5 seconds, 6 and 10 seconds, 10 and 20 seconds, and more than 20 seconds.

      Below is the query I have written, but it is not working. Kindly help us.

      !D!SELECT EXTRACT_SUFFIX(cs-uri-stem,0,’/’) AS API,SUM(CASE WHEN (time-taken)>=0 AND (time-taken)<=5000 THEN 1 ELSE 0 END) AS ‘0-5 sec’,SUM(CASE WHEN (time-taken)>=6000 AND (time-taken)<=10000 THEN 1 ELSE 0 END) AS ‘6-10 sec’,SUM(CASE WHEN (time-taken)>=10000 AND (time-taken)<=20000 THEN 1 ELSE 0 END) AS ‘10-20 sec’,SUM(CASE WHEN (time-taken)>=20000 THEN 1 ELSE 0 END) AS ‘>=20 sec’,COUNT(time-taken) INTO !REPORTFILE! FROM !LOGFILE! WHERE cs-uri-stem like ‘%GetGamePlayDetails%’ group by API

      • mlichtenberg says:

        If I understand what you are looking for, you want a query that will return the name of an API endpoint, along with the total number of requests for that endpoint, the number of requests that took 0-6 seconds, the number that took 6-10 seconds, the number that took 10-20 seconds, and the number that took more than 20 seconds. Is that correct?

        This is tricky. Ideally, you would write something like this:

        SELECT EXTRACT_SUFFIX(cs-uri-stem,0,’/’) AS API,
        SUM(CASE WHEN [time-taken]>=0 AND [time-taken]<=6000 THEN 1 ELSE 0 END) AS [0-6 sec],
        SUM(CASE WHEN [time-taken]>6000 AND [time-taken]<=10000 THEN 1 ELSE 0 END) AS [6-10 sec],
        SUM(CASE WHEN [time-taken]>10000 AND [time-taken]<=20000 THEN 1 ELSE 0 END) AS [10-20 sec],
        SUM(CASE WHEN [time-taken]>20000 THEN 1 ELSE 0 END) AS [>20 sec],
        COUNT(time-taken) AS [Total]
        INTO !REPORTFILE!
        FROM !LOGFILE!
        WHERE cs-uri-stem like ‘%GetGamePlayDetails%’
        GROUP BY EXTRACT_SUFFIX(cs-uri-stem,0,’/’)

        Unfortunately, that will NOT work. LogParser only accepts simple case statements that check a value for equality with some other value. So you cannot use something like “[time-taken]>6000 AND [time-taken]<=10000” as either the CASE or WHEN expression.

        However, you can use a function that returns a distinct value as the CASE or WHEN expression. For example, DIV(time-taken, 6000) returns 0 for any value of time-taken that is less than 6000, and a value of 1 or greater otherwise. So, you can use something like “CASE DIV([time-taken], 6000) WHEN 0 THEN 1 ELSE 0 END” to look for requests that took between 0 and 6 seconds (technically, between 0 and 5.999 seconds).

        Putting it all together, you can create a query like the following:

        SELECT EXTRACT_SUFFIX(cs-uri-stem,0,’/’) AS API,
        SUM(CASE DIV([time-taken], 6000) WHEN 0 THEN 1 ELSE 0 END) AS [0-6 sec],
        SUM(CASE DIV([time-taken], 1000)
        WHEN 6 THEN 1
        WHEN 7 THEN 1
        WHEN 8 THEN 1
        WHEN 9 THEN 1
        ELSE 0 END
        ) AS [6-10 sec],
        SUM(CASE DIV([time-taken], 1000)
        WHEN 10 THEN 1
        WHEN 11 THEN 1
        WHEN 12 THEN 1
        WHEN 13 THEN 1
        WHEN 14 THEN 1
        WHEN 15 THEN 1
        WHEN 16 THEN 1
        WHEN 17 THEN 1
        WHEN 18 THEN 1
        WHEN 19 THEN 1
        ELSE 0 END) AS [10-20 sec],
        SUM(CASE DIV([time-taken], 20000) WHEN 0 THEN 0 ELSE 1 END) AS [>20 sec],
        COUNT(time-taken) AS [Total]
        INTO !REPORTFILE!
        FROM !LOGFILE!
        WHERE cs-uri-stem like ‘%GetGamePlayDetails%’
        GROUP BY EXTRACT_SUFFIX(cs-uri-stem,0,’/’)

        If I understood your need correctly, that should get you what you want. Add a new comment if that doesn’t make sense, and I will try to explain further.

        By the way, I recommend saving that query to a file and running it using the syntax “logparser file:XXXXX.sql”, where XXXXX.sql is the file containing the query.

    74. Pingback: Using Log Parser with SharePoint to Generate Interesting Reports - SharePoint Diary

    75. Pingback: Using Log Parser with SharePoint to Generate Interesting Reports – My WordPress

    76. Pingback: I LOVE LogParser - PowerCram Blog

    77. Pingback: Using Log Parser with SharePoint to Generate Interesting Reports - SharePoint Diary

    78. Vinod says:

      Hi mlichtenberg,

      By using the following query I am able to convert milliseconds into seconds; however, I want the results to be returned as decimal numbers. For example, 0.16 instead of 0.

      SELECT TOP 20 cs-uri-stem,
      COUNT(*) AS Total,
      Div(Avg(time-taken),1000) AS AvgTime,
      Div(Min(time-taken),1000) AS MinTime,
      Div(Max(time-taken),1000) AS MaxTime
      FROM ‘[LOGFILEPATH]’ WHERE cs-method = ‘POST’
      GROUP BY cs-uri-stem
      ORDER BY Total DESC

      Can you please help me get this done?

      Thanks,
      Vinod

      • mlichtenberg says:

        Try converting the values to the REAL data type before applying the DIV function. For example:

        SELECT TOP 20 cs-uri-stem,
        COUNT(*) AS Total,
        Div(TO_REAL(Avg(time-taken)),1000) AS AvgTime,
        Div(TO_REAL(Min(time-taken)),1000) AS MinTime,
        Div(TO_REAL(Max(time-taken)),1000) AS MaxTime
        FROM ‘[LOGFILEPATH]’ WHERE cs-method = ‘POST’
        GROUP BY cs-uri-stem
        ORDER BY Total DESC

        BTW, https://logparserplus.com/functions/ provides a great resource for looking up what functions are available for use with Log Parser.

    79. Pingback: Windows日志分析工具 – Venhow's Blog

    80. Pingback: IIS Log Files: How To Interpret, Customize, Query, and Aggregate
