From: Mike S. <m...@pe...> - 2007-11-02 23:12:00
Forwarded with permission:

---------- Forwarded message ----------
From: "Strahan, Bob" <bob...@hp...>
To: "log...@pe..." <log...@pe...>
Subject: log4perl causing perl process to die
Date: Fri, 2 Nov 2007 20:05:37 +0000

Hi,

I am using log4perl in a Win32 service that needs to run forever. However, I have encountered a situation where the logger call is executing a die() and causing my service to die.

The service spawns multiple child processes which run concurrently but all log to the same logfile. We're using File::Locked to avoid contention. Extract from our logger config below:

    "log4perl.appender.myapp"                   => "Log::Dispatch::File::Locked",
    "log4perl.appender.myapp.filename"          => "D:/Program Files (x86)/My App/logs/logfile.txt",
    "log4perl.appender.myapp.mode"              => "append",
    "log4perl.appender.myapp.close_after_write" => "true",
    "log4perl.appender.myapp.permissions"       => "0660",
    Etc.

However, it seems that certain filesystem operations performed on the logfile can cause the logger to execute die(), killing my service, with the following error:

    Cannot write to 'D:/Program Files (x86)/My App/logs/logfile.txt':
    Permission denied at D:\Program Files (x86)\My
    App\lib\perllibs\lib/Log/Dispatch/File.pm line 86.

I can reproduce the problem sporadically by simply opening the logfile in Wordpad. I can reproduce it reliably by repeatedly copying the logfile using the test script below:

    #!perl -w
    use File::Copy;
    while (1) {
        copy("D:/Program Files (x86)/My App/logs/logfile.txt",
             "D:/Program Files (x86)/My App/logs/logfileCOPY.txt");
        print ".";
    }

Any suggestions on how to defend against users copying or opening the logfile? We should block and retry until open() succeeds, rather than die(), I think.

Please let me know if you can help with a patch, workaround, or suggestion.

Regards

Bob Strahan
HP Software, R&D
703.579.1929 office | 702.967.5228 mobile | 702.579.1929 fax | bob...@hp...<mailto:bob...@hp...>
10700 Parkridge Blvd. #500 | Reston | VA 20191
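For illustration, the block-and-retry behavior Bob asks for could be wrapped around the failing open() along these lines. This is a minimal sketch, not the shipped Log::Dispatch::File code; retry_open() is a hypothetical helper, and the retry count and delay are made-up knobs:

    use strict;
    use warnings;
    use Time::HiRes qw(usleep);

    # Hypothetical helper: retry open() a bounded number of times
    # before giving up, instead of die()ing on the first failure.
    sub retry_open {
        my ($mode_and_file, $tries, $delay_us) = @_;
        $tries    ||= 20;        # made-up defaults
        $delay_us ||= 50_000;    # 50 ms between attempts

        for my $attempt (1 .. $tries) {
            open(my $fh, $mode_and_file) and return $fh;
            usleep($delay_us);   # sharing violations from copy/Wordpad
                                 # on Win32 are usually transient
        }
        die "Cannot open '$mode_and_file' after $tries attempts: $!";
    }

    # Usage, mirroring the open() call quoted later in this thread:
    # my $fh = retry_open(">>D:/Program Files (x86)/My App/logs/logfile.txt");

Bounding the retries still dies eventually, which keeps a permanent permission problem from hanging the service silently.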
From: Mike S. <m...@pe...> - 2007-11-03 22:32:02
On Fri, 2 Nov 2007, Bob Strahan wrote:

> However, it seems that certain filesystem operations performed on the
> logfile can cause the logger to execute die(), killing my service,
> with the following error:
>
> Cannot write to 'D:/Program Files (x86)/My App/logs/logfile.txt':
> Permission denied at D:\Program Files (x86)\My
> App\lib\perllibs\lib/Log/Dispatch/File.pm line 86.

Hmm, this is Log::Dispatch::File's _open_file() function complaining that an open() failed. Does your service open files after it's been running for a while? Typically, Log::Dispatch::File(::Locked) opens the file only once, unless 'close_after_write' is given.

Which version of Windows are you running, by the way? On regular XP, it seems to work as expected.

-- Mike

Mike Schilli
m...@pe...
From: Strahan, B. <bob...@hp...> - 2007-11-04 14:06:40
Hi Mike,

> Does your service open files after it's been running for a while?
> Typically, Log::Dispatch::File(::Locked) opens the file only once,
> unless 'close_after_write' is given.

We do use the 'close_after_write' option... As I mentioned, there are multiple concurrent processes continually being spawned by the service, each using log4perl to log to the same logfile. So we figured we needed to use File::Locked along with close_after_write to ensure each process got an exclusive lock on the logfile before writing to it.

Let me know if there is a better (more efficient) way to handle multiple concurrent processes logging to the same file. E.g. would using socket appenders to route log messages to a single log server process, which handles file i/o from one process, be a better option?

> Which version of Windows are you running, by the way? On regular XP,
> it seems to work as expected.

Windows 2003 64-bit server. I haven't tried it on other flavors of Windows.

For now I have worked around the problem by inserting the open() call into a retry loop:

    #open $fh, "$self->{mode}$self->{filename}"
    #    or die "Cannot write to '$self->{filename}': $!";
    while (1) {
        last if open $fh, "$self->{mode}$self->{filename}";
    }
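Bob's single-log-server idea is workable with stock Log4perl parts: each child logs through a socket appender, and one process owns the file. A minimal sketch, with illustrative port and path, that ignores partial-line reads and reconnect handling:

    #!perl -w
    # Single-writer log server: children connect with
    # Log::Log4perl::Appender::Socket; only this process touches the file.
    use strict;
    use IO::Socket::INET;
    use IO::Select;

    my $listen = IO::Socket::INET->new(
        LocalPort => 12345,                    # illustrative port
        Listen    => 5,
        ReuseAddr => 1,
    ) or die "listen: $!";

    open my $log, ">>", "logfile.txt" or die "open: $!";   # illustrative path
    select((select($log), $| = 1)[0]);                     # autoflush

    my $sel = IO::Select->new($listen);
    while (my @ready = $sel->can_read) {
        for my $fh (@ready) {
            if ($fh == $listen) {
                $sel->add($listen->accept);    # a child process connected
            }
            elsif (defined(my $line = <$fh>)) {
                print $log $line;              # single writer, no locking needed
            }
            else {
                $sel->remove($fh);             # child went away
                close $fh;
            }
        }
    }

On the child side, the appender config would point at the server instead of the file (appender name 'srv' is illustrative):

    log4perl.appender.srv          = Log::Log4perl::Appender::Socket
    log4perl.appender.srv.PeerAddr = localhost
    log4perl.appender.srv.PeerPort = 12345
    log4perl.appender.srv.layout   = Log::Log4perl::Layout::SimpleLayout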
From: Mike S. <m...@pe...> - 2007-11-04 20:55:34
On Sun, 4 Nov 2007, Strahan, Bob wrote:

> We do use the 'close_after_write' option... As I mentioned, there are
> multiple concurrent processes continually being spawned by the
> service, each using log4perl to log to the same logfile. So we
> figured we needed to use File::Locked along with close_after_write to
> ensure each process got an exclusive lock on the logfile before
> writing to it.

I see -- the recommended ways of synchronizing access to an appender are listed in the Log4perl FAQ:

http://log4perl.sourceforge.net/d/Log/Log4perl/FAQ.html#23804

I'm not sure how well they work on Windows, though, but give the 'syswrite' option a try -- that should be the easiest.

-- Mike

Mike Schilli
m...@pe...
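Note that 'syswrite' is an option of Log4perl's native file appender, Log::Log4perl::Appender::File, so trying it means swapping the appender class in the config. A minimal sketch, reusing the appender name and path from earlier in the thread (untested on Win32 here):

    log4perl.logger                  = DEBUG, myapp
    log4perl.appender.myapp          = Log::Log4perl::Appender::File
    log4perl.appender.myapp.filename = D:/Program Files (x86)/My App/logs/logfile.txt
    log4perl.appender.myapp.mode     = append
    log4perl.appender.myapp.syswrite = 1
    log4perl.appender.myapp.layout   = Log::Log4perl::Layout::SimpleLayout

With syswrite, each rendered message goes out in a single write() call on an append-mode filehandle, which is what keeps concurrent writers from interleaving.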
From: Strahan, B. <bob...@hp...> - 2007-11-05 22:36:07
Mike - I have more questions for you... sorry - you've opened the floodgates by being helpful the first time :)

I'd like to set up my multi-process perl app to support Chainsaw as a log viewer. I followed the instructions in the FAQ, and it worked... but only if Chainsaw was up and running. If Chainsaw wasn't started, log4perl would die, complaining it couldn't establish the connection.

I discovered the 'silent_recovery' flag - but although this keeps the application from dying if Chainsaw isn't running, it slows things down very significantly, as each log call is attempting (and failing) to establish a connection to the non-existent Chainsaw port.

Any ideas how to configure things so I can use Chainsaw, but avoid impacting the performance of the application when it's not running?

Ideally I'd like to leverage Chainsaw's 'SocketHub' receiver, in order to support multiple/remote Chainsaw connections to my perl service. I suspect that I'll need some sort of process that serves as a socket hub - to accept multiple connections on port A from my log4perl app, and zero or more connections on port B from Chainsaw - and route log4perl messages to any/all Chainsaw connections?

Have you done anything like this before? Any existing modules I can reuse for this? All pointers gratefully received.

Thanks

Bob
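The hub Bob describes can be sketched as a small fan-out relay: one port for the app, one for viewers, and every app line copied to every attached viewer. Ports are illustrative, and the sketch ignores Chainsaw's expected XML event framing and partial-line reads -- a starting point, not a finished receiver:

    #!perl -w
    # Fan-out hub sketch: accept log4perl senders on one port, Chainsaw
    # viewers on another, and relay each incoming log line to all viewers.
    use strict;
    use IO::Socket::INET;
    use IO::Select;

    my $app_listen    = IO::Socket::INET->new(LocalPort => 4560,   # illustrative
                            Listen => 5, ReuseAddr => 1) or die "listen: $!";
    my $viewer_listen = IO::Socket::INET->new(LocalPort => 4561,   # illustrative
                            Listen => 5, ReuseAddr => 1) or die "listen: $!";

    my $sel = IO::Select->new($app_listen, $viewer_listen);
    my @viewers;

    while (my @ready = $sel->can_read) {
        for my $fh (@ready) {
            if ($fh == $app_listen) {
                $sel->add($app_listen->accept);          # log4perl sender connected
            }
            elsif ($fh == $viewer_listen) {
                push @viewers, $viewer_listen->accept;   # Chainsaw viewer connected
            }
            elsif (defined(my $line = <$fh>)) {
                # relay to every viewer; drop viewers whose sockets
                # have gone away (print returns false)
                @viewers = grep { print {$_} $line } @viewers;
            }
            else {
                $sel->remove($fh);                       # sender disconnected
                close $fh;
            }
        }
    }

Decoupling the app from the viewers this way also addresses the performance question: the app always has a listener to connect to, whether or not any Chainsaw instance is attached.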
From: Kevin M. G. <cp...@go...> - 2007-11-06 16:29:48
Mike Schilli wrote:

> I see -- the recommended ways of synchronizing access to an appender
> are listed in the Log4perl FAQ:
>
> http://log4perl.sourceforge.net/d/Log/Log4perl/FAQ.html#23804

Something apropos that I learned recently and have been meaning to mention: on Linux, you don't have to worry about interleaving messages when writing to the same file, as long as the messages themselves are smaller than PIPE_BUF, which on my FC5 machine here is defined as 4096 bytes:

    $ grep -r PIPE_BUF /usr/include/
    /usr/include/linux/limits.h:#define PIPE_BUF 4096 /* # bytes in atomic write to a pipe */
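If grepping headers is inconvenient, the same limit can be queried at runtime with POSIX::pathconf() -- a small sketch; the directory argument should be wherever the logfile lives:

    use strict;
    use warnings;
    use POSIX ();

    # PIPE_BUF is a per-filesystem limit; query it for the log directory.
    my $pipe_buf = POSIX::pathconf("/var/log", &POSIX::_PC_PIPE_BUF);
    printf "atomic write limit: %s bytes\n",
           defined $pipe_buf ? $pipe_buf : "unknown";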
From: Strahan, B. <bob...@hp...> - 2007-11-05 22:00:26
Thanks Mike.

Looks like the syncer appender has a set of additional module dependencies we don't have set up yet in our environment. I'll try it out when I get a chance to get things set up.
From: Mike S. <m...@pe...> - 2007-11-06 07:22:21
On Mon, 5 Nov 2007, Strahan, Bob wrote:

> Looks like the syncer appender has a set of additional module
> dependencies we don't have set up yet in our environment. I'll try it
> out when I get a chance to get things set up.

Actually, I was referring to the 'syswrite' option, which is often the easiest way to get non-interleaving log messages without further synchronization.

-- Mike

Mike Schilli
m...@pe...
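At the system-call level, what the 'syswrite' option buys is one write() per log message on an append-mode filehandle -- the combination Kevin's PIPE_BUF note relies on. A stripped-down sketch of the mechanism (path and message illustrative; this is not Log4perl's actual appender code):

    use strict;
    use warnings;
    use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

    # Append mode makes the kernel do seek-to-end + write atomically,
    # so short messages from concurrent processes don't interleave.
    sysopen(my $fh, "logfile.txt", O_WRONLY | O_APPEND | O_CREAT, 0660)
        or die "open: $!";

    my $msg = "2007-11-06 12:00:00 INFO worker 4711: job done\n";
    my $written = syswrite($fh, $msg);
    die "write failed: $!"
        unless defined $written && $written == length $msg;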