Re: statistic tool for httpd ?

Bryon Propst (bryon@meaddata.com)
Wed, 14 Dec 1994 00:03:53 +0100

Isn't this written for the NCSA httpd? Will it work with CERN's? If
not, are there any similar programs for the CERN httpd logs?

Bryon Propst

> Date: Tue, 13 Dec 1994 14:57:41 +0100
> From: "Kurt Westh Nielsen" <kwn@ingenioeren.dk>
>
> Does anybody know of any Unix statistics tools well suited to
> working on the access_log file that NCSA's httpd creates?
>
> /Kurt
> _______________________
> Kurt Westh Nielsen email: kwn@ingenioeren.dk
> -------------------------
>
> Roy Fielding's wwwstat program is in use at a lot of sites, including
> here; see
>
> http://www.ics.uci.edu/WebSoft/wwwstat/
>
> The only problem I've had with it is that it produces too much
> information to peruse on a daily basis (wwwstat summaries are still
> pretty big).
>
> I finally wrote a program of my own which takes the wwwstat summary,
> and produces a two-page metasummary whose major content is a breakdown
> of traffic by directory (cumulative over subdirectories), with traffic
> in images broken out from other files (to give some idea how much
> traffic comes from inline images), and with low-traffic directories
> suppressed. This is a perl5 script; see
>
> http://www.ai.mit.edu/tools/usum/usum.html
>
> for pointers to a sample of the output, and a copy of the script, if
> you're interested.
>
> rst
>
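The directory rollup described above (traffic credited cumulatively up the directory tree, inline-image hits tallied separately, low-traffic directories suppressed) can be sketched roughly as follows. This is a hypothetical Python illustration under stated assumptions, not the actual perl5 usum script; the regex assumes Common Log Format lines, and the image-extension list is a guess at period-typical inline formats.

```python
# Rough sketch (NOT the actual usum code) of a cumulative
# per-directory traffic breakdown from an httpd access_log.
import posixpath
import re

# Common Log Format: host ident user [date] "request" status bytes
CLF = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)[^"]*" \d+ \S+')

# Assumed inline-image extensions of the era.
IMAGE_EXTS = {".gif", ".jpg", ".jpeg", ".xbm"}

def summarize(log_lines, min_hits=2):
    """Return {directory: (total_hits, image_hits)}, cumulative over
    subdirectories, with directories under min_hits suppressed."""
    stats = {}
    for line in log_lines:
        m = CLF.match(line)
        if not m:
            continue
        path = m.group(1)
        is_image = posixpath.splitext(path)[1].lower() in IMAGE_EXTS
        # Credit the hit to every ancestor directory (cumulative rollup).
        d = posixpath.dirname(path)
        while True:
            entry = stats.setdefault(d or "/", [0, 0])
            entry[0] += 1
            if is_image:
                entry[1] += 1
            if d in ("", "/"):
                break
            d = posixpath.dirname(d)
    # Suppress low-traffic directories.
    return {d: tuple(v) for d, v in stats.items() if v[0] >= min_hits}
```

With three requests (two under /docs, one of them an image), /docs shows 2 hits of which 1 is an image, while a directory seen only once is dropped from the report.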