From: Strahan, B. <bob...@hp...> - 2007-11-05 22:36:07
Mike - I have more questions for you.. sorry - you've opened the floodgates
by being helpful the first time :)
I'd like to set up my multi-process perl app to support chainsaw as a log viewer.
I followed the instructions in the FAQ, and it worked.. but only if chainsaw
was up and running.
But if chainsaw wasn't started, log4perl would die, complaining it couldn't
establish the connection.
I discovered the 'silent_recovery' flag - but although this keeps the
application from dying if chainsaw isn't running, it slows things down very
significantly, as each log call attempts (and fails) to establish a connection
to the non-existent chainsaw port.
Any ideas how to configure things so I can use chainsaw, but avoid impacting
the application's performance when it's not running?
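For reference, my socket appender config looks roughly like the sketch below
(reconstructed from memory - the appender name, address and port are
placeholders, and the layout is whatever the FAQ recipe suggested):

    # sketch only - placeholder names/ports
    "log4perl.appender.chainsaw"                 => "Log::Log4perl::Appender::Socket",
    "log4perl.appender.chainsaw.PeerAddr"        => "localhost",
    "log4perl.appender.chainsaw.PeerPort"        => "4445",
    "log4perl.appender.chainsaw.silent_recovery" => 1,
    "log4perl.appender.chainsaw.layout"          => "Log::Log4perl::Layout::XMLLayout",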
Ideally I'd like to leverage chainsaw's 'SocketHub' receiver, in order to
support multiple/remote chainsaw connections to my perl service..
I suspect that I'll need some sort of process that serves as a socket hub -
accepting multiple connections on port A from my log4perl app, and zero or
more connections on port B from chainsaw - and routing log4perl messages to
any/all chainsaw connections?
Have you done anything like this before? Any existing modules I can reuse for
this? A very rough sketch of the kind of relay I have in mind follows below.
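(Untested sketch only - the ports, buffer size, and lack of framing/error
handling are all my own placeholders, not a real implementation:)

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IO::Socket::INET;
    use IO::Select;

    # Accept log4perl senders on one port and chainsaw viewers on another,
    # and fan each chunk of log data out to all connected viewers.
    my $senders = IO::Socket::INET->new(
        LocalPort => 4445, Listen => 5, ReuseAddr => 1) or die "senders: $!";
    my $viewers = IO::Socket::INET->new(
        LocalPort => 4446, Listen => 5, ReuseAddr => 1) or die "viewers: $!";

    my $sel = IO::Select->new($senders, $viewers);
    my %is_viewer;    # connected sockets that belong to chainsaw

    while (my @ready = $sel->can_read) {
        for my $fh (@ready) {
            if ($fh == $senders or $fh == $viewers) {
                my $client = $fh->accept or next;      # new connection
                $sel->add($client);
                $is_viewer{$client} = ($fh == $viewers);
            }
            else {
                my $n = sysread $fh, my $buf, 8192;
                if (!$n) {                             # peer went away
                    $sel->remove($fh);
                    delete $is_viewer{$fh};
                    close $fh;
                    next;
                }
                next if $is_viewer{$fh};               # ignore viewer input
                # relay log4perl data to every connected chainsaw viewer
                print {$_} $buf for grep { $is_viewer{$_} } $sel->handles;
            }
        }
    }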
All pointers gratefully received..
Thanks
Bob
-----Original Message-----
From: Mike Schilli [mailto:m...@pe...]
Sent: Sunday, November 04, 2007 3:55 PM
To: Strahan, Bob
Cc: Mike Schilli; log...@li...
Subject: RE: [log4perl-devel] log4perl causing perl process to die (fwd)
On Sun, 4 Nov 2007, Strahan, Bob wrote:
> We do use the 'close_after_write' option... As I mentioned, there are
> multiple concurrent processes continually being spawned by the
> service, each using log4perl to log to the same logfile. So we
> figured we needed to use File::Locked along with close_after_write to
> ensure each process got an exclusive lock on the logfile before
> writing to it.
I see -- the recommended ways of synchronizing access to an appender are
listed in the Log4perl FAQ:
http://log4perl.sourceforge.net/d/Log/Log4perl/FAQ.html#23804
I'm not sure how well they work on Windows, but give the 'syswrite' option
a try; that should be the easiest.
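Roughly along these lines, adapting the appender from your config below
(untested on my end, so treat it as a sketch):

    "log4perl.appender.myapp"          => "Log::Log4perl::Appender::File",
    "log4perl.appender.myapp.filename" => "D:/Program Files (x86)/My App/logs/logfile.txt",
    "log4perl.appender.myapp.mode"     => "append",
    # syswrite sends each message in a single syswrite() call, so
    # concurrent writers don't interleave within a message
    "log4perl.appender.myapp.syswrite" => 1,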
-- Mike
Mike Schilli
m...@pe...
> Let me know if there is a better (more efficient) way to handle
> multiple concurrent processes logging to the same file e.g. Would
> using socket appenders to route log messages to single log server
> process which handles file i/o from one process be a better option?
>
>
> > Which version of Windows are you running by the way? On regular XP, it
> > seems to work as expected.
>
> Windows 2003 64-bit server.. I haven't tried it on other flavors of Windows.
>
>
> For now I have worked around the problem by inserting the open() call into
> a retry loop..
>
>     # original code in Log::Dispatch::File's _open_file():
>     # open $fh, "$self->{mode}$self->{filename}"
>     #     or die "Cannot write to '$self->{filename}': $!";
>
>     # workaround: keep retrying until the open() succeeds instead of dying
>     while (1) {
>         last if open $fh, "$self->{mode}$self->{filename}";
>     }
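>
> I'm also thinking of capping the retries and sleeping briefly between
> attempts so the loop doesn't spin at 100% CPU - something like this
> (untested):
>
>     use Time::HiRes qw(usleep);
>
>     my $tries = 0;
>     until (open $fh, "$self->{mode}$self->{filename}") {
>         die "Cannot write to '$self->{filename}': $!" if ++$tries > 50;
>         usleep(100_000);    # wait 100ms, then retry the open()
>     }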
>
> -----Original Message-----
> From: log...@li... [mailto:log4perl-devel-bo...@li...] On Behalf Of Mike Schilli
> Sent: Saturday, November 03, 2007 6:32 PM
> To: Mike Schilli
> Cc: log...@li...
> Subject: Re: [log4perl-devel] log4perl causing perl process to die (fwd)
>
> On Fri, 2 Nov 2007, Bob Strahan wrote:
>
> > However, it seems that if certain filesystem operations are
> > performed on the logfile it can cause the logger to execute die(),
> > causing my service to die, with the following error
> >
> > Cannot write to 'D:/Program Files (x86)/My App/logs/logfile.txt':
> > Permission denied at D:\Program Files (x86)\My
> > App\lib\perllibs\lib/Log/Dispatch/File.pm line 86.
>
> Hmm, this is Log::Dispatch::File's _open_file() function complaining
> that an open() failed. Does your service re-open the file after it's been
> running for a while? Typically, Log::Dispatch::File(::Locked) opens the
> file only once unless 'close_after_write' is given.
>
> Which version of Windows are you running by the way? On regular XP, it
> seems to work as expected.
>
> -- Mike
>
> Mike Schilli
> m...@pe...
>
> > I am using log4perl in a Win32 service that needs to run forever.. However,
> > I have encountered a situation where the logger call is executing a die()
> > and causing my service to die...
> >
> >
> > The service spawns multiple child processes which run concurrently but all
> > log to the same logfile.. We're using File::Locked to avoid contention..
> > Extract from our logger config below..
> >
> > "log4perl.appender.myapp" =3D> "Log::Dispatch::File::Locked",
> > "log4perl.appender.myapp.filename" =3D> "D:/Program Files (x86)/My App/=
logs/logfile.txt",
> > "log4perl.appender.myapp.mode" =3D> "append",
> > "log4perl.appender.myapp.close_after_write" =3D> "true",
> > "log4perl.appender.myapp.permissions" =3D> "0660",
> > Etc..
> >
> >
>
> > I can reproduce the problem sporadically by simply opening the logfile in
> > Wordpad..
> > I can reproduce it reliably by repeatedly copying the logfile using the
> > test script below
> >
> > #!perl -w
> > use File::Copy;
> >
> > # keep copying the logfile in a tight loop to provoke the sharing conflict
> > while (1) {
> >     copy("D:/Program Files (x86)/My App/logs/logfile.txt",
> >          "D:/Program Files (x86)/My App/logs/logfileCOPY.txt");
> >     print ".";
> > }
> >
> >
> > Any suggestions on how to defend against users copying or opening the
> > logfile? We should block and retry until open() succeeds, rather than
> > die(), I think.
> >
> > Please let me know if you can help with a patch, workaround, or suggestion.
> >
> > Regards
> >
> >
> >
> > Bob Strahan
>
> _______________________________________________
> log4perl-devel mailing list
> log...@li...
> https://lists.sourceforge.net/lists/listinfo/log4perl-devel
>