Up Log Creek Without a Paddle – Part 2: IIS Log File Investigations

By Douglas Rathbone · Aug. 24, 2012

When it comes to reviewing visitor site usage, server bandwidth usage, or forensic security investigations, IIS log files often hold the answers. As I'm sure you're more than aware, though, gigantic text files can be hard to view, let alone pull intelligence from, and investigating a website attack can be really daunting with log files as your information source. In my previous post I covered a tool to help with Windows security logs. Lucky for us, it's just as awesome when dealing with huge IIS logs.

This is part 2 in a two-part series on reviewing actual, or investigating potential, website security intrusions on IIS-hosted websites. View part 1 here.

Image credit: Jim B L

Information Overload…

IIS log files are great because they store a timeline of all usage of your website. If you know where to look, you can retrieve information that helps draw conclusions about many different visitor scenarios. In a security context, if you're armed with the knowledge of what to look for, IIS log files are a great tool for forensically reviewing an attacker's behaviour, whether they've been passive in their prodding of your application (just sniffing around) or active (actually trying to break things).

Fields of Gold

When setting up your website, IIS can log a significant set of data, but not every option is turned on by default. To turn on full logging, open IIS Manager, select your server's root node, open "Logging", and then click the "Select Fields" button.


Select every option to include as much data as possible.
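If you'd rather script this than click through the dialog, an appcmd one-liner along these lines should switch on the same fields as server-wide defaults (a hedged sketch for IIS 7 and later; the logExtFileFlags property and its flag names come from the system.applicationHost/sites schema, so verify them against your own IIS version before relying on this):

%windir%\system32\inetsrv\appcmd.exe set config /section:sites /siteDefaults.logFile.logExtFileFlags:"Date,Time,ClientIP,UserName,SiteName,ComputerName,ServerIP,Method,UriStem,UriQuery,HttpStatus,Win32Status,BytesSent,BytesRecv,TimeTaken,ServerPort,UserAgent,Cookie,Referer,ProtocolVersion,Host,HttpSubStatus" /commit:apphost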


Once you have this turned on, you'll end up with a sea of information in these log files. On a busy website, the files will probably be too large to open in Notepad.


Log Parser to the Rescue

As I covered in my previous post, opening and then scanning huge logs can be a daunting task. Luckily, Microsoft Log Parser 2.2 solves this problem with a tool that makes it easy to query multiple huge log files using well-known SQL, in record time!

Log Parser Lizard takes this one step further by letting you do it all from a GUI.

To prove the point, load up Log Parser Lizard and write a simple query against the whole log folder (not just a single large file, but many) for one of your IIS sites:

SELECT TOP 100 * FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'


Pretty cool, eh?

So you've got log files, and you've got a kick-ass tool to query them, but what queries will help you make sense of the data or track down the evildoers?

First Look: Security Queries

To start with, you need to think about the context in which you're querying your log files. If someone has attacked your site in the past, you're probably searching for answers on how.

If so, do you have an incident time from which to start tracking down how it happened?

Getting a Feel for When/How

Maybe try a query for a certain date/time:

SELECT *
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE date = '2012-07-26' AND time BETWEEN TIMESTAMP('02:03:00','hh:mm:ss') AND TIMESTAMP('02:07:00','hh:mm:ss')

Or maybe you just want to begin by looking for server errors, to see if an attacker was testing your website for vulnerabilities.

A query looking for HTTP response code 500 (server error):

SELECT *
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE sc-status = 500
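If that returns a flood of rows on a busy site, a grouped variant (my own sketch using standard Log Parser SQL) shows which URLs are erroring most often:

SELECT cs-uri-stem, COUNT(*) AS errors
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE sc-status = 500
GROUP BY cs-uri-stem
ORDER BY errors DESC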

Another handy behaviour to look for involves your secure admin section, if you have one. Most admin sections on websites redirect you if you aren't logged in, usually with a 302 redirect.

So let's look for visitors trying to access your site using the relative URL "/admin/*" who are being redirected with 302 headers. While this will show some false positives from whenever you've tried to log in and forgotten your password, it'll also give you an idea of whether someone is sniffing around.

SELECT *
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE cs-uri-stem LIKE '/admin/%' AND sc-status = 302
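To help separate a forgetful legitimate user from a persistent prober, you can group that same filter by client address (a variant of my own; c-ip is the client IP field in the IISW3C format):

SELECT c-ip, COUNT(*) AS attempts
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE cs-uri-stem LIKE '/admin/%' AND sc-status = 302
GROUP BY c-ip
ORDER BY attempts DESC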

Following a Security Trail

Once you've found what looks like an intruder poking around, you'll want to see what else they've been doing. You should have their IP address from the previous query, so you can query the logs for it (note that c-ip is the client's address; s-ip is your own server's IP, so c-ip is the field to filter on):

SELECT *
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE c-ip = '*ip address of attacker*'
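A full SELECT * can be noisy, so here's a trimmed-down variant of the same trail query (my own sketch) that pulls just the columns that tell the story of what they touched:

SELECT date, time, cs-method, cs-uri-stem, cs-uri-query, sc-status, cs(User-Agent)
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE c-ip = '*ip address of attacker*'
ORDER BY date, time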

You can also see how much traffic is coming from each individual IP address by counting its requests per hour; this might show someone attempting to brute-force your site over time.

SELECT c-ip, TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) AS hour, COUNT(*) AS numberrequests
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
GROUP BY c-ip, TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600))
ORDER BY numberrequests DESC

You can then query your logs for how many requests each IP address is making to your "admin" section, again to spot brute-forcing attempts.

SELECT c-ip, TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) AS hour, COUNT(*) AS numberrequests
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE cs-uri-stem LIKE '/admin/%'
GROUP BY c-ip, TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600))
ORDER BY numberrequests DESC

Non-Security-Related Queries

Checking for incoming broken referral links:

SELECT DISTINCT cs(Referer) AS referer, cs-uri-stem AS url
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus = 0)
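If you want to know which broken inbound links matter most, a counted variant (again my own sketch) ranks them by how often they're hit:

SELECT cs(Referer) AS referer, cs-uri-stem AS url, COUNT(*) AS hits
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus = 0)
GROUP BY cs(Referer), cs-uri-stem
ORDER BY hits DESC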

Top 10 slowest pages/files to load:

SELECT TOP 10 cs-uri-stem, MAX(time-taken) AS maxtime, AVG(time-taken) AS avgtime
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
GROUP BY cs-uri-stem
ORDER BY maxtime DESC
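Static assets tend to drown out the interesting rows here, so if your site is ASP.NET you could restrict the query to dynamic pages (an assumption on my part; swap 'aspx' for whatever extension your platform serves, and note EXTRACT_EXTENSION is a built-in Log Parser function):

SELECT TOP 10 cs-uri-stem, MAX(time-taken) AS maxtime, AVG(time-taken) AS avgtime
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
WHERE EXTRACT_EXTENSION(cs-uri-stem) = 'aspx'
GROUP BY cs-uri-stem
ORDER BY maxtime DESC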

Traffic by day:

SELECT TO_STRING(TO_TIMESTAMP(date, time), 'MM-dd') AS day,
    DIV(SUM(cs-bytes),1024) AS incoming(k),
    DIV(SUM(sc-bytes),1024) AS outgoing(k)
FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'
GROUP BY day

Summary

Log Parser is an awesome tool for querying large log files, and Log Parser Lizard makes it even easier with the addition of a GUI. For extremely large files I prefer to use the command-line client for speed, but using the GUI to build your queries makes life just so easy. All of a sudden, your information overload becomes a high signal-to-noise-ratio sweet symphony that can help you either get a better feel for how your site is being used or, in the worst-case scenario, help you track down an evildoer and how they got in.
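For reference, running the very first query through the command-line client instead of the GUI looks something like this (a minimal sketch; -i:IISW3C tells Log Parser to parse IIS W3C logs and -o:DATAGRID shows the results in a pop-up grid, both standard Log Parser 2.2 options):

LogParser.exe -i:IISW3C -o:DATAGRID "SELECT TOP 100 * FROM 'c:\inetpub\logs\logfiles\*website log folder*\*.log'"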
