The traffic on my website has now increased to the point that awstats can no longer process my logs.
My logfile for this month is 197 MB, and awstats now crashes with out-of-memory errors. I know my server has enough memory (1 GB of RAM plus a 1 GB swapfile), but it seems the data file has grown too big to work with (splitting up my logs doesn't help). The only ulimits it runs under are datasize 524288 kB and stacksize 65536 kB, which I don't seem to be able to change.
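For what it's worth, if those are only soft limits set by the shell, they can sometimes be raised before launching awstats.pl. A minimal sketch, assuming a bash-like shell; the config name "mysite" is a placeholder:

```shell
# Show the current per-process limits awstats.pl would inherit (values in kB).
ulimit -d   # data segment size
ulimit -s   # stack size

# Try to raise the soft data-segment limit; this fails if it would exceed
# the hard limit (check the hard limit with `ulimit -H -d`).
ulimit -d unlimited 2>/dev/null || echo "cannot raise datasize past hard limit"

# Then run AWStats in the same shell so it inherits the new limit.
# perl awstats.pl -config=mysite -update
```

If the hard limit itself is the problem, it has to be raised by root (e.g. in the system's limits configuration) rather than from a user shell.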
Even if I do change my awstats settings to store less data, is there any way to rebuild the data file without having to reprocess 11 days of logs?
I doubt that 197 MB of logfile is really too much for AWStats. I'm using it on a site that yields a tad more than 30 GB (yes, no typo here) of logs per month; the logs are split into per-server and per-day chunks, ranging from around 105 MB up to 183 MB, and I process them in one go using the logresolvemerge.pl tool that comes with AWStats. The logs are processed with AWStats 6.3 on a Windows 2000 server running Cygwin with Perl 5.8.4, on a P4 2.4 GHz CPU with 1 GB of RAM. Which version of AWStats are you using, and which version of Perl? Have you tried to find exactly which line of your logs AWStats chokes on? Maybe there's a malformed log record that produces strange results (I ran into that once).
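To find a record AWStats chokes on, one low-tech approach is to split the log into chunks and feed each chunk to awstats.pl until one fails. A sketch, assuming GNU coreutils `split`; the paths, the config name "mysite", and the tiny chunk size are made up for illustration (the stand-in log is only there so the sketch runs on its own):

```shell
# Work on a copy of the monthly log (path is a placeholder).
LOG=access.log
# Make a tiny stand-in log so this sketch is self-contained.
[ -f "$LOG" ] || printf '%s\n' rec1 rec2 rec3 rec4 > "$LOG"

# Split into fixed-size chunks; smaller chunks narrow the search faster.
split -l 2 "$LOG" chunk_

# Run AWStats on each chunk; the first failing chunk holds the bad record.
for f in chunk_*; do
  echo "processing $f"
  # perl awstats.pl -config=mysite -LogFile="$f" -update \
  #   || { echo "bad record somewhere in $f"; break; }
done
```

Once the bad chunk is found, a second round of splitting (or a plain look at the lines in it) usually pinpoints the exact record.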
FYI, you really should upgrade from 6.3 to 6.4, because it fixes security holes.
None of the features or bugfixes in 6.4 is of any concern in my case: I'm not using it on a public server, it can only be run from the command line, and in fact I'm the only one who has access to it anyway. And since I've included a small hack of my own, I'm not too keen on applying each and every available upgrade when I don't see any benefit for my case. I do agree, though, that keeping one's server software up to date is extremely important nowadays.