Re: WWW Usage (sample from two hosts) (Lou Montulli)
From: (Lou Montulli)
Message-id: <>
Subject: Re: WWW Usage (sample from two hosts)
Date: Wed, 25 Aug 93 18:23:07 CDT
In-reply-to: <9308251838.AA29424@austin.BSDI.COM>; from "Tony Sanders" at Aug 25, 93 1:38 pm
X-Mailer: ELM [version 2.3 PL2]
Status: RO
> > Your work is useful for me, as I will go on an evangelising tour of  
> > Europe soon. I would like a breakdown hierarchically: total for uk,  
> > then total for ac inside uk, and so on. Thus we would get something  
> Ok, how about the format below?  It goes two levels down for all
> domains that end with a two letter code, and one level down for the rest.
> For the online version it will sort each top level domain into separate
> files and generate an index summary with pointers to the detailed lists.
> Should I limit the three level reports to domains with .ac .edu or .com
> subdomains (i.e., .at .au .be .il .jp .kr .nz .pl .sg .tr .tw .uk .us .za)?
> so .ch would look like this sample.  On the other hand I could have it
> so that it only details known subdomain names (.edu .ac .co .gv .gov .com
> any others?).  Speak now.
> The detailed reports are sorted alphabetically but the index will be
> sorted by count.  Does that sound ok?

I fail to see the usefulness of this data.  The number of different
IP addresses accessing servers is very misleading.  For a good example
of why, consider that 1000 users a day use our anonymous access
account, yet they only ever have one IP address.  The only thing
these numbers tell you is how many different computers are being
used.  Why don't you collect the number of hits from each IP address
and merge them to find the total number of hits per domain?
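The merging step being proposed can be sketched roughly as below. This is not Lou's or Tony's actual code, just a minimal illustration under the assumption that the server log yields one hostname per hit; the hostnames in the example are invented.

```python
from collections import Counter

def domain_of(host, levels=2):
    """Return the last `levels` labels of a hostname, e.g. 'ac.uk'."""
    parts = host.lower().split(".")
    return ".".join(parts[-levels:])

def hits_per_domain(log_hosts, levels=2):
    """Count every hit per host, then merge hit counts by domain,
    rather than counting distinct hosts."""
    per_host = Counter(log_hosts)      # total hits from each host
    per_domain = Counter()
    for host, hits in per_host.items():
        per_domain[domain_of(host, levels)] += hits
    return per_domain

# Three hits from one host still count as three hits per domain,
# not as one "different computer".
sample = ["www.cs.ac.uk", "www.cs.ac.uk", "www.cs.ac.uk", "mail.mit.edu"]
totals = hits_per_domain(sample)   # {'ac.uk': 3, 'mit.edu': 1}
```

With per-domain hit totals in hand, the index sorted by count falls out directly from `totals.most_common()`.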