cgiwrap-users Mailing List for CGIWrap (Page 22)
Brought to you by: nneul
From: Tuc <tu...@tt...> - 2001-12-04 13:42:19
> So, by *NOT* including the --with-rlimit-nproc=XX switch, am I allowing
> an unlimited number of scripts to be run by CGIWrap? I looked in the
> configure.h file, but it just seemed to be undefined, as Tuc pointed
> out. I'm hoping undefined = unlimited. :)

To the best of MY knowledge, yes... And I want to STOP THIS since we have a
few hackers pounding CGIs creating a DOS on our machine. But, from what I
can read of the man page, setting it won't help either. I can't really tell.
Nathan?

Tuc/TTSG Internet Services, Inc.
From: Kyle <ky...@cc...> - 2001-12-04 13:27:38
> It's a standard O/S facility. See setrlimit man page on your O/S.

Um, I can't get that to work in RedHat Linux, Nathan. There is ulimit, but
it is set to 'unlimited' by default, which is how my server is set up.

To expand on my original question, I want my server unlimited on the number
of CGIs it can process. I don't care if it slows down to a snail's pace, I
want them ALL to run when called.

So, by *NOT* including the --with-rlimit-nproc=XX switch, am I allowing an
unlimited number of scripts to be run by CGIWrap? I looked in the
configure.h file, but it just seemed to be undefined, as Tuc pointed out.
I'm hoping undefined = unlimited. :)

-Kyle

Nathan Neulinger wrote:
>
> Well, you ran "./configure"... it did exactly what it's supposed to. None
> of those rlimit limits are turned on unless you specifically request them
> to be turned on.
>
> You need to specify --with-rlimit-nproc or --with-rlimit-nproc=XX
>
> As far as what it does, it simply issues the setrlimit() system call prior
> to executing your script. It's a standard O/S facility. See setrlimit man
> page on your O/S.
>
> -- Nathan
>
> On Mon, Dec 03, 2001 at 06:41:43PM -0500, Tuc wrote:
> > > On Mon, Dec 03, 2001 at 05:01:25PM -0500, Tuc wrote:
> > > > > Here's a code snip from the cgiwrap web site:
> > > > >
> > > > >   --with-rlimit-nproc=COUNT
> > > > >     limit number of processes with setrlimit
> > > > >
> > > > > That seems to be the entire documentation on this setting that I
> > > > > can find. So, what is the default value if I don't specify COUNT?
> > > > >
> > > > UNLIMITED from what I can tell. It **LOOKS** like the doco says
> > > > 32, but I checked configure and it seems to imply unlimited.
> > >
> > > I just looked at configure, and it should be 32. At least that is what
> > > the source of configure.in says for nproc, but I have not tried it.
> >
> > From "./configure"
> >
> > checking for limit rlimit-cpu... none
> > checking for limit rlimit-vmem... none
> > checking for limit rlimit-as... none
> > checking for limit rlimit-fsize... none
> > checking for limit rlimit-data... none
> > checking for limit rlimit-stack... none
> > checking for limit rlimit-core... none
> > checking for limit rlimit-rss... none
> > checking for limit rlimit-nproc... none
> > checking for limit rlimit-nofile... none
> > checking for limit rlimit-memlock... none
> >
> > from config.h
> >
> > /* #undef CONF_USE_RLIMIT_ANY */
> > /* #undef CONF_USE_RLIMIT_AS */
> > /* #undef CONF_USE_RLIMIT_CPU */
> > /* #undef CONF_USE_RLIMIT_VMEM */
> > /* #undef CONF_USE_RLIMIT_FSIZE */
> > /* #undef CONF_USE_RLIMIT_DATA */
> > /* #undef CONF_USE_RLIMIT_STACK */
> > /* #undef CONF_USE_RLIMIT_CORE */
> > /* #undef CONF_USE_RLIMIT_RSS */
> > /* #undef CONF_USE_RLIMIT_NPROC */
> > /* #undef CONF_USE_RLIMIT_NOFILE */
> > /* #undef CONF_USE_RLIMIT_MEMLOCK */
> >
> > Nathan, can you tell us more what the NPROC actually
> > is all about?
> >
> > Tuc/TTSG Internet Services, Inc.
>
> ------------------------------------------------------------
> Nathan Neulinger                  EMail: nn...@um...
> University of Missouri - Rolla    Phone: (573) 341-4841
> Computing Services                Fax: (573) 341-4216
From: Tuc <tu...@tt...> - 2001-12-04 00:27:18
> Well, you ran "./configure"... it did exactly what it's supposed to. None
> of those rlimit limits are turned on unless you specifically request them
> to be turned on.
>
> You need to specify --with-rlimit-nproc or --with-rlimit-nproc=XX

Understandable... The issue is if you DON'T specify, then it's almost like
saying --with-rlimit-nproc=999999999 . 8-)

> As far as what it does, it simply issues the setrlimit() system call prior
> to executing your script. It's a standard O/S facility. See setrlimit man
> page on your O/S.

DISCLAIMER: I'm not a C/C++ programmer. My man page says:

    Limits on the consumption of system resources by the a process and each
    process it creates may be obtained with the getrlimit() or getprlimit()
    call, and set with the setrlimit() or setprlimit() call.

My take would be that the CGIWRAP process is something that starts, runs the
underling, and stops. Wouldn't it be useless for 100 CGIWRAPS to set it to
32?

    RLIMIT_NPROC    The maximum number of simultaneous processes for this
                    user id.

and later on:

    Because this information is stored in the per-process information, this
    system call must be executed directly by the shell if it is to affect
    all future processes created by the shell. Most shells have a limit,
    ulimit or unlimit built-in command.

Or is it just that NPROC applies to ANY running process? And what if someone
else set it to 100? Are you setting the hard/soft to be the same?

Thanks, Tuc/TTSG Internet Services, Inc.
From: Tuc <tu...@tt...> - 2001-12-03 23:42:07
> On Mon, Dec 03, 2001 at 05:01:25PM -0500, Tuc wrote:
> > > Here's a code snip from the cgiwrap web site:
> > >
> > >   --with-rlimit-nproc=COUNT
> > >     limit number of processes with setrlimit
> > >
> > > That seems to be the entire documentation on this setting that I can
> > > find. So, what is the default value if I don't specify COUNT?
> > >
> > UNLIMITED from what I can tell. It **LOOKS** like the doco says
> > 32, but I checked configure and it seems to imply unlimited.
>
> I just looked at configure, and it should be 32. At least that is what
> the source of configure.in says for nproc, but I have not tried it.

From "./configure"

    checking for limit rlimit-cpu... none
    checking for limit rlimit-vmem... none
    checking for limit rlimit-as... none
    checking for limit rlimit-fsize... none
    checking for limit rlimit-data... none
    checking for limit rlimit-stack... none
    checking for limit rlimit-core... none
    checking for limit rlimit-rss... none
    checking for limit rlimit-nproc... none
    checking for limit rlimit-nofile... none
    checking for limit rlimit-memlock... none

from config.h

    /* #undef CONF_USE_RLIMIT_ANY */
    /* #undef CONF_USE_RLIMIT_AS */
    /* #undef CONF_USE_RLIMIT_CPU */
    /* #undef CONF_USE_RLIMIT_VMEM */
    /* #undef CONF_USE_RLIMIT_FSIZE */
    /* #undef CONF_USE_RLIMIT_DATA */
    /* #undef CONF_USE_RLIMIT_STACK */
    /* #undef CONF_USE_RLIMIT_CORE */
    /* #undef CONF_USE_RLIMIT_RSS */
    /* #undef CONF_USE_RLIMIT_NPROC */
    /* #undef CONF_USE_RLIMIT_NOFILE */
    /* #undef CONF_USE_RLIMIT_MEMLOCK */

Nathan, can you tell us more what the NPROC actually is all about?

Tuc/TTSG Internet Services, Inc.
From: Tuc <tu...@tt...> - 2001-12-03 22:01:33
> Here's a code snip from the cgiwrap web site:
>
>   --with-rlimit-nproc=COUNT
>     limit number of processes with setrlimit
>
> That seems to be the entire documentation on this setting that I can
> find. So, what is the default value if I don't specify COUNT?

UNLIMITED from what I can tell. It **LOOKS** like the doco says 32, but I
checked configure and it seems to imply unlimited.

Tuc/TTSG Internet Services, Inc.

(I also have this question out, no replies)
From: Kyle <ky...@cc...> - 2001-12-03 21:41:03
Here's a code snip from the cgiwrap web site:

    --with-rlimit-nproc=COUNT
        limit number of processes with setrlimit

That seems to be the entire documentation on this setting that I can find.
So, what is the default value if I don't specify COUNT?

-Kyle
From: Nick D. -- US C. -- N. LLC. <ce...@no...> - 2001-12-01 04:14:31
Right now we use a method similar to that of:

    http://www.engelschall.com/pw/apache/rewriteguide/#ToC35

for sub domains. Anyway, the point being that it seems that when sites are
set up this way, ScriptAlias and Alias don't work for these sites even when
globally enabled. The reason this creates a problem is that the way cgi is
set up on our server is:

    ScriptAlias /cgi-bin /home/usr/cgiwrap/cgiwrap
    # To use CGI Wrapped scripts:
    Action cgi-wrapper /cgi-bin
    AddHandler cgi-wrapper .cgi .pl

And this doesn't work on these sites since ScriptAlias doesn't work. I tried
doing something like

    AddHandler /home/usr/cgiwrap/cgiwrap .cgi .pl

but that didn't work, as it did /home/sites/web/test.com/home/usr/cgiwrap/cgiwrap
etc. I tried putting ScriptAlias in a .htaccess file; however, the problem
with that was that ScriptAlias cannot be put in .htaccess.

So does someone know how to allow .htaccess to use ScriptAliases? Or how to
use cgiwrap without it? Or a substitute for ScriptAlias?
From: Joe H. <on...@dc...> - 2001-11-29 17:52:02
On Wed, 28 Nov 2001, Andrew Vernon wrote:

> I host a number of virtual domains that use cgiwrap, each owned by a
> different user.
>
> In many cases, cgiwrap incorrectly guesses the username the script
> should run under.
>
> To keep them out of the rest of my system, I disallow shell and chroot
> their FTP session into their home directories.
>
> Each virtual domain is defined like this:
>
> NameVirtualHost 208.15.218.26
> <VirtualHost 208.15.218.26>
>     ServerAdmin web...@mc...
>     DocumentRoot /home/ataylor/public_html
>     ServerName www.mcfumbler.com
>     AddHandler cgi-wrapper .cgi
>     Action cgi-wrapper /cgiwrap/cgiwrap/ataylor
>     <Directory /home/ataylor/public_html>
>         Options Indexes Includes ExecCGI
>     </Directory>
>     <Directory /home/ataylor/public_html/havers/dross>
>         Options Indexes Includes ExecCGI
>     </Directory>
>     CustomLog /var/log/httpd/mcfumbler.com-access_log combined
> </VirtualHost>
>
> This should allow the domain owner (ataylor in this case) to run
> scripts through the wrapper anywhere in their web space.
>
> However, this fails for http://www.mcfumbler.com/havers/dross/gm.cgi
>
> It's as though Apache or cgiwrap ignores the "ataylor" part of the action
> directive.

The 'cgiwrapd' option is your friend. Set up a virtual host calling cgiwrapd
instead of cgiwrap, and check what it's passing in.

-----
Joe Hourcle
From: Andrew V. <av...@dr...> - 2001-11-29 05:46:39
I host a number of virtual domains that use cgiwrap, each owned by a
different user.

In many cases, cgiwrap incorrectly guesses the username the script should
run under.

To keep them out of the rest of my system, I disallow shell and chroot their
FTP session into their home directories.

Each virtual domain is defined like this:

    NameVirtualHost 208.15.218.26
    <VirtualHost 208.15.218.26>
        ServerAdmin web...@mc...
        DocumentRoot /home/ataylor/public_html
        ServerName www.mcfumbler.com
        AddHandler cgi-wrapper .cgi
        Action cgi-wrapper /cgiwrap/cgiwrap/ataylor
        <Directory /home/ataylor/public_html>
            Options Indexes Includes ExecCGI
        </Directory>
        <Directory /home/ataylor/public_html/havers/dross>
            Options Indexes Includes ExecCGI
        </Directory>
        CustomLog /var/log/httpd/mcfumbler.com-access_log combined
    </VirtualHost>

This should allow the domain owner (ataylor in this case) to run scripts
through the wrapper anywhere in their web space.

However, this fails for http://www.mcfumbler.com/havers/dross/gm.cgi

It's as though Apache or cgiwrap ignores the "ataylor" part of the Action
directive.

I used the following config to build the wrapper:

    ../configure --with-local-contact-name=Webmaster \
        --with-local-contact-email=web...@dr... \
        --with-wall \
        --with-perl=/usr/bin/perl \
        --with-cgi-dir=public_html \
        --with-install-dir=/home/httpd/cgiwrap \
        --with-httpd-user=nobody \
        --with-minimum-uid=500 \
        --with-logging-file=/var/log/httpd/cgiwrap.log \
        --with-rlimit-cpu=15
From: Tuc <tu...@tt...> - 2001-11-27 23:52:21
Hi,

We are running into a problem with hackers. They beat the crap out of a
specific CGI trying to get it to release information it shouldn't. In a
matter of seconds it's gotten a system up to a load average of 200, taking
20 minutes to telnet into it. This is also after we have instituted a
program that, once the load average is over 12 (it runs every 2 minutes),
will chmod 000 the CGI!

It's currently wrapped by an older copy of cgiwrap (3.6.4). It's compiled
with:

    ./configure --with-perl=/usr/local/bin/perl \
        --with-install-dir=/var/www/cgi-bin \
        --with-cgi-dir='' \
        --with-httpd-user=www \
        --with-logging-syslog='' \
        --with-install-group=wheel

Is there a way to stop, or at least curb, this insanity? I saw
--with-rlimit-nproc=, which I don't quite understand (I'm not a programmer).
Does this mean only 32 can run at one time? Was this always the default,
since we can easily have more than 32 running? Are there some other items we
can tweak to help? I don't know if limiting the CPU seconds works with a
default either, since it seems to allow it to have more than 10 (on another
machine we have an issue... it has actually let a CGI run for HOURS...).

Any help/hints/pointers are appreciated.

Thanks, Tuc/TTSG Internet Services, Inc.
From: Joe H. <on...@dc...> - 2001-11-27 14:07:51
On Mon, 26 Nov 2001, Ian 'Ivo' Veach wrote:

> We have users who want to organize their "many" cgi into subdirectories
> under their personal cgi-bin/ directory. At a glance, this seems like a
> bad idea [one I'm not thrilled about anyway]. However, given our user
> requests and since cgiwrap can check for symlinks and ../, is allowing
> users the capability to store subdir cgi a bad thing? Are there other
> issues at hand? Can someone give a practical counter example?

The only issue that I can think of occurs when your cgi-bin directory is a
subdirectory of your public_html directory (which unfortunately is the
default). On servers like that, I'll touch index.html in their directory, so
that the server won't give a listing of their files. With multiple
directories, you'd want to do this for every directory.

Unfortunately, this is only fixing one symptom of a larger problem -- anyone
can still read your files if they know the name of them; it just makes it
harder for them to figure out the names of the files. So, yes, there's sort
of a problem, but if you have that problem, you have an even bigger issue to
deal with.

-----
Joe Hourcle
From: Neulinger, N. <nn...@um...> - 2001-11-26 22:12:24
No real reason against it... When I first wrote cgiwrap, the 'possible
danger' thought popped into my head, so I didn't allow it initially. After I
thought it through more, it really isn't any more of an issue than allowing
cgi in the first place.

> -----Original Message-----
> From: Ian 'Ivo' Veach [mailto:iv...@sc...]
> Sent: Monday, November 26, 2001 4:08 PM
> To: cgi...@li...
> Subject: [cgiwrap-users] are cgi-bin subdirs okay?
>
> We have users who want to organize their "many" cgi into subdirectories
> under their personal cgi-bin/ directory. At a glance, this seems like a
> bad idea [one I'm not thrilled about anyway]. However, given our user
> requests and since cgiwrap can check for symlinks and ../, is allowing
> users the capability to store subdir cgi a bad thing? Are there other
> issues at hand? Can someone give a practical counter example?
>
> cheers and thanks,
> ________________________________________________________________________
> Ian 'Ivo' Veach, im...@ne...      UCCSN System Computing Services
> http://www.nevada.edu/~ivo          postmaster/webmaster/sysadmin
> ________________________________________________________________________
>
> _______________________________________________
> cgiwrap-users mailing list
> cgi...@li...
> https://lists.sourceforge.net/lists/listinfo/cgiwrap-users
From: Ian 'I. V. <iv...@sc...> - 2001-11-26 22:07:55
We have users who want to organize their "many" cgi into subdirectories
under their personal cgi-bin/ directory. At a glance, this seems like a bad
idea [one I'm not thrilled about anyway]. However, given our user requests
and since cgiwrap can check for symlinks and ../, is allowing users the
capability to store subdir cgi a bad thing? Are there other issues at hand?
Can someone give a practical counter example?

cheers and thanks,
________________________________________________________________________
Ian 'Ivo' Veach, im...@ne...        UCCSN System Computing Services
http://www.nevada.edu/~ivo            postmaster/webmaster/sysadmin
________________________________________________________________________
From: Nick D. -- US C. -- N. LLC. <ce...@no...> - 2001-11-26 21:15:59
Ok, here is the deal. I own a Cobalt, a RaQ4i, and have been reprogramming
these babies for weeks now. Yesterday I went to conquer the cgi wrap and
standard cgi. We have always given our users a choice of using either one.
The way it is set up on our old servers is: .cgi and .pl, along with
anything in the cgi-bin, are standard cgi. Then anything with .scgi or .spl,
or that is in the scgi-bin, is run wrapped.

Well, after quite a bit of hacking, some rewrite rules, and a mis-use of the
case sensitivity, I was able to get the scgi-bin to work. But here is my
problem. If any extension other than .pl or .cgi is associated with the cgi
wrapper (or if the script is in the scgi-bin with a non-default extension),
then it doesn't work. It does try to run it with cgi wrapper, since I get a
pretty cgi-wrapper error page and it says Script Not Found. This is quite
weird since the path it gives to the script is correct.

Anyway, I downloaded the binary of cgi wrap off my Cobalt to see if I could
find anything, and sure enough, in there it referenced .pl and .cgi, and
through some hex editing I was able to change the extension to something
else. But as you may or may not know, in a binary you can't really add to
the file without corrupting it, so this didn't truly solve my problem.

So then I asked the owner, and he said Cobalts were a mess and he didn't
support them, so I installed his version. But he said on Cobalt it was
impossible, for whatever reason, to get the user to run the scripts from the
file permissions; it had to get it from the URL, like whatever/cgiwrap/user,
which I didn't like.

So what I'm here to ask is either:

A) How to convert my Cobalt version to work with all file extensions, not
   just .pl or .cgi

Or

B) How to get his version to work but not need to get the login from the
   URL, but rather who owns the file :-)

Thank You For Your Time,

Nick Daum
US CEO
Novanix, LLC.
http://www.Novanix.com

This message was sent using a Digital Certificate to verify the sender. If
you have an attachment with a .p7s extension, that means your email client
does not support digital signatures and you should ignore it. This message
was also sent with a vCard attachment (ending in .vcf); if your email client
supports it, you can import this into your Address Book.
From: Joe H. <on...@dc...> - 2001-11-26 16:26:36
I remember years ago, the issue of CGIWrap under SSI would come up every
month or so (much like the .htaccess issues do these days), and I think I
may have found a solution. I've tested this under Netscape Enterprise
4.1sp7, but not yet through Apache.

First off, set up SSI with a noexec option (hacking the object file directly
seems to no longer be documented on Netscape's site, so you might want to
use the GUI), or you can try adding the following.

In your init section:

    Init funcs="shtml_init,shtml_send" shlib="/full/path/libShtml.so" NativeThread="no" fn="load-modules"
    Init LateInit="yes" fn="shtml_init"

Add in the object:

    ObjectType fn="shtml-hacktype"
    Service fn="shtml_send" type="magnus-internal/parsed-html" method="(GET|HEAD)" opts="noexec"

Now, you can call the SSI through:

    <!--#include virtual="/cgi-bin/cgiwrap/user/script"-->

which will cause the webserver to process the page independently, calling
CGIWrap, and without the exec option, keeping them from being able to
directly call the CGI script.

[This may have already been discussed, but well, I didn't see it in the
docs, so I just thought I'd share.]

-----
Joe Hourcle
From: Joe H. <on...@dc...> - 2001-11-26 16:08:57
On Mon, 26 Nov 2001 ce...@sm... wrote:

> Hi,
>
> I am using CGIWRAP seamlessly with the AddHandler / Action directives with
> apache. I just discovered that it was possible to execute a script that is
> located in a password-protected directory using a .htaccess file.
>
> When accessing http://www.website.com/protected/script.cgi, Apache won't
> allow unauthorized access because there is a .htaccess file with the right
> directives in the protected directory. When supplying the right password,
> apache will execute the script by calling CGIWRAP. This is all good.
>
> Although, you guessed it, if one calls
> http://www.website.com/cgibin/cgiwrap/username/protected/script.cgi, it is
> easy to breach in.
>
> I would like to prevent cgiwrap from being accessed directly, like in the
> second example. Is there a way to do this?

Well, I would think that you'd have different environmental variables set
when called through the two different paths. You might check if there's
something that you can key off of, so that you can reject it if it's not
being called through the way which you prefer.

-----
Joe Hourcle
From: <ce...@sm...> - 2001-11-26 06:02:22
Hi,

I am using CGIWRAP seamlessly with the AddHandler / Action directives with
apache. I just discovered that it was possible to execute a script that is
located in a password-protected directory using a .htaccess file.

When accessing http://www.website.com/protected/script.cgi, Apache won't
allow unauthorized access because there is a .htaccess file with the right
directives in the protected directory. When supplying the right password,
apache will execute the script by calling CGIWRAP. This is all good.

Although, you guessed it, if one calls
http://www.website.com/cgibin/cgiwrap/username/protected/script.cgi, it is
easy to breach in.

I would like to prevent cgiwrap from being accessed directly, like in the
second example. Is there a way to do this?

Thank you for your help,
Cédric
From: Neulinger, N. <nn...@um...> - 2001-11-21 20:21:54
Correct. That's why it is important to be very careful what you put in that
directory and why it isn't enabled by default. As long as you are careful to
only put scripts that you have hand-verified to be secure in there, you
should be perfectly safe.

-- Nathan

------------------------------------------------------------
Nathan Neulinger                  EMail: nn...@um...
University of Missouri - Rolla    Phone: (573) 341-4841
Computing Services                Fax: (573) 341-4216

> -----Original Message-----
> From: Gennadi Umanski [mailto:um...@ti...]
> Sent: Wednesday, November 21, 2001 10:51 AM
> To: Neulinger, Nathan
> Subject: cgiwrap & option --with-multiuser-cgi-dir=PATH
>
> Hi,
>
> I have a question about cgiwrap and the option
> "--with-multiuser-cgi-dir=PATH":
>
> > --with-multiuser-cgi-dir=PATH
> >   define a central cgi script directory that is searched if the script
> >   is not found in a user directory. This can be used to make a single
> >   script available that will run as any user, however, this can be very
> >   dangerous if you're not extremely careful designing your script. Do
> >   not enable this unless you know what you're doing. It is not needed
> >   for normal usage.
>
> We want to use a shared cgi-directory with certain cgi-scripts that root
> installed. These cgi-scripts should run under the user-id. We don't need
> execution of user-scripts placed in user homes.
>
> My question is: what are the dangers of such a solution? These common
> scripts run with user permissions, and they may do the same as a "normal"
> user-home-script may do too, don't they?
>
> TIA,
> G.Umanski
>
> +----------------------------------------------------------------+
> | Dipl.Inform. G. Umanski       | Phone: +49 651 201 2840        |
> | Dept. Computer Science        | Fax  : +49 651 201 3954        |
> | University of Trier / Germany | Room : V214                    |
> | http://www.informatik.uni-trier.de/~umanskij/                  |
> +----------------------------------------------------------------+
From: Nathan N. <nn...@um...> - 2001-11-16 12:57:18
rm...@tr... wrote:
>
> Hello,
>
> We have got a Cobalt RaQ with cgiwrap installed on it, which is excellent
> because it allows me to up-load files without contacting our ISP.
>
> However we can't get the attached file to work because of cgiwrap and no
> one knows why.
>
> Can you help?
>
> Many Thanks
> Rob Martin
> Transair UK Ltd
>
>                  Name: encrypt.pl
>     encrypt.pl   Type: Perl Program (application/x-perl)
>              Encoding: base64

Since you don't provide any diagnostic information or error messages, and
since you're running on a RaQ -- not really. Does the perl script work when
run directly from the shell? Are you getting any other errors? Are you able
to run other perl scripts via cgi?

-- Nathan

------------------------------------------------------------
Nathan Neulinger                  EMail: nn...@um...
University of Missouri - Rolla    Phone: (573) 341-4841
Computing Services                Fax: (573) 341-4216
From: Joe H. <on...@dc...> - 2001-11-15 15:16:26
On Wed, 14 Nov 2001, Neulinger, Nathan wrote:

> Hacking around things should not be needed. Something is wrong with the
> setup on that machine if regular forks/system/etc. are not working.

The one machine that I've seen that had issues forking was calling _a_lot_
of CGIs (they had 2 SSIs on the main site, which called a CGI to do browser
detection and output the correct CSS w/ HTML header, and again for the HTML
footer). Every so often, you'd get a zombie process, and they'd slowly build
up until the system wasn't able to fork any more.

Now, when running things from the shell, they ran just fine, so I'd assume
that it was an issue with the number of children allowed to a parent process
(the webserver).

Just to check if it's even an issue with perl, have you tried the equivalent
shell script?

#####
#!/bin/sh
echo "Content-type: text/html"
echo
uptime
#####

-----
Joe Hourcle
From: Steven H. <st...@ha...> - 2001-11-15 04:35:22
On 15/11/2001 05:04, Ralph Huntington wrote:

> I don't think he's using php; it's a perl script. I think he's just got a
> perl usage issue. He makes a system call, but does nothing with it -- or
> so it seems. He needs to capture the result of the system call.

However, the `uptime' command prints its result to the standard output. It
should show in the browser (given the proper http headers have been sent
previously).

-- sh
From: Nathan N. <nn...@um...> - 2001-11-15 00:17:45
Ralph Huntington wrote:
>
> I don't think he's using php; it's a perl script. I think he's just got a
> perl usage issue. He makes a system call, but does nothing with it -- or
> so it seems. He needs to capture the result of the system call.

There's nothing wrong with that, except maybe an output buffering timing
issue, and running under cgiwrapd should eliminate that as a problem. If
he's not getting output w/ cgiwrapd, there is something wrong with the
machine; that is not a cgiwrap issue.

I don't see any output from the script in those commands -- he should be
getting at least the content-type header output, although it may still be
buffered.

> On Wed, 14 Nov 2001, Neulinger, Nathan wrote:
>
> > Hacking around things should not be needed. Something is wrong with the
> > setup on that machine if regular forks/system/etc. are not working.
> >
> > Do you have php configured properly, i.e. with the
> > --enable-discard-path option?
> >
> > -- Nathan
> >
> > > -----Original Message-----
> > > From: Ralph Huntington [mailto:rj...@mo...]
> > > Sent: Wednesday, November 14, 2001 3:52 PM
> > > To: Marten Lehmann
> > > Cc: cgi...@li...
> > > Subject: Re: [cgiwrap-users] forking not allowed?
> > >
> > > Have you tried something like this
> > >
> > > #!/usr/bin/perl
> > > $uptime = `uptime`;
> > > print "Content-type: text/plain\n\n";
> > > print "$uptime";
> > >
> > > Note the back-quotes around the shell-escaped command (not single
> > > quotes). (This is not really a cgiwrap question; it is a perl
> > > question.)
> > >
> > > On Wed, 14 Nov 2001, Marten Lehmann wrote:
> > >
> > > > Hello,
> > > >
> > > > even a small perl script showing the uptime is hanging:
> > > >
> > > > #!/usr/bin/perl
> > > > print "Content-type: text/plain\n\n";
> > > > system("uptime");
> > > >
> > > > I started the script in debug-mode:
> > > >
> > > > ...
> > > >
> > > > argv[0] = 'cgiwrapd'
> > > > Executing '/vrmd/http/web/test.pl'
> > > > Output of script follows:
> > > > =====================================================
> > > > <nothing follows, but connection stays open>
> > > >
> > > > How can I configure cgiwrap to execute "uptime" or any other
> > > > command?
> > > >
> > > > Regards
> > > > Marten

-- Nathan

------------------------------------------------------------
Nathan Neulinger                  EMail: nn...@um...
University of Missouri - Rolla    Phone: (573) 341-4841
Computing Services                Fax: (573) 341-4216
From: Ralph H. <rj...@mo...> - 2001-11-14 22:04:46
I don't think he's using php; it's a perl script. I think he's just got a
perl usage issue. He makes a system call, but does nothing with it -- or so
it seems. He needs to capture the result of the system call.

On Wed, 14 Nov 2001, Neulinger, Nathan wrote:

> Hacking around things should not be needed. Something is wrong with the
> setup on that machine if regular forks/system/etc. are not working.
>
> Do you have php configured properly, i.e. with the --enable-discard-path
> option?
>
> -- Nathan
>
> > -----Original Message-----
> > From: Ralph Huntington [mailto:rj...@mo...]
> > Sent: Wednesday, November 14, 2001 3:52 PM
> > To: Marten Lehmann
> > Cc: cgi...@li...
> > Subject: Re: [cgiwrap-users] forking not allowed?
> >
> > Have you tried something like this
> >
> > #!/usr/bin/perl
> > $uptime = `uptime`;
> > print "Content-type: text/plain\n\n";
> > print "$uptime";
> >
> > Note the back-quotes around the shell-escaped command (not single
> > quotes). (This is not really a cgiwrap question; it is a perl question.)
> >
> > On Wed, 14 Nov 2001, Marten Lehmann wrote:
> >
> > > Hello,
> > >
> > > even a small perl script showing the uptime is hanging:
> > >
> > > #!/usr/bin/perl
> > > print "Content-type: text/plain\n\n";
> > > system("uptime");
> > >
> > > I started the script in debug-mode:
> > >
> > > ...
> > >
> > > argv[0] = 'cgiwrapd'
> > > Executing '/vrmd/http/web/test.pl'
> > > Output of script follows:
> > > =====================================================
> > > <nothing follows, but connection stays open>
> > >
> > > How can I configure cgiwrap to execute "uptime" or any other command?
> > >
> > > Regards
> > > Marten
From: Neulinger, N. <nn...@um...> - 2001-11-14 21:56:16
|
Hacking around things should not be needed. Something is wrong with the
setup on that machine if regular forks/system/etc. are not working.

Do you have php configured properly, i.e. with the --enable-discard-path
option?

-- Nathan

------------------------------------------------------------
Nathan Neulinger                EMail: nn...@um...
University of Missouri - Rolla Phone: (573) 341-4841
Computing Services             Fax: (573) 341-4216

> -----Original Message-----
> From: Ralph Huntington [mailto:rj...@mo...]
> Sent: Wednesday, November 14, 2001 3:52 PM
> To: Marten Lehmann
> Cc: cgi...@li...
> Subject: Re: [cgiwrap-users] forking not allowed?
>
> Have you tried something like this
>
> #!/usr/bin/perl
> $uptime = `uptime`;
> print "Content-type: text/plain\n\n";
> print "$uptime";
>
> Note the back-quotes around the shell-escaped command (not
> single quotes)
> (This is not really a cgiwrap question; it is a perl question.)
>
> On Wed, 14 Nov 2001, Marten Lehmann wrote:
>
> > Hello,
> >
> > even a small perl script showing the uptime is hanging:
> >
> > #!/usr/bin/perl
> > print "Content-type: text/plain\n\n";
> > system("uptime");
> >
> > I started the script in debug-mode:
> >
> > ...
> >
> > argv[0] = 'cgiwrapd'
> > Executing '/vrmd/http/web/test.pl'
> > Output of script follows:
> > =====================================================
> > <nothing follows, but connection stays open>
> >
> > How can I configure cgiwrap to execute "uptime" or any
> other command?
> >
> > Regards
> > Marten
> >
> > _______________________________________________
> > cgiwrap-users mailing list
> > cgi...@li...
> > https://lists.sourceforge.net/lists/listinfo/cgiwrap-users
> >
>
> _______________________________________________
> cgiwrap-users mailing list
> cgi...@li...
> https://lists.sourceforge.net/lists/listinfo/cgiwrap-users
>
|
From: Ralph H. <rj...@mo...> - 2001-11-14 21:52:27
|
Have you tried something like this

#!/usr/bin/perl
$uptime = `uptime`;
print "Content-type: text/plain\n\n";
print "$uptime";

Note the back-quotes around the shell-escaped command (not single quotes)
(This is not really a cgiwrap question; it is a perl question.)

On Wed, 14 Nov 2001, Marten Lehmann wrote:

> Hello,
>
> even a small perl script showing the uptime is hanging:
>
> #!/usr/bin/perl
> print "Content-type: text/plain\n\n";
> system("uptime");
>
> I started the script in debug-mode:
>
> ...
>
> argv[0] = 'cgiwrapd'
> Executing '/vrmd/http/web/test.pl'
> Output of script follows:
> =====================================================
> <nothing follows, but connection stays open>
>
> How can I configure cgiwrap to execute "uptime" or any other command?
>
> Regards
> Marten
>
> _______________________________________________
> cgiwrap-users mailing list
> cgi...@li...
> https://lists.sourceforge.net/lists/listinfo/cgiwrap-users
>
|
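The capture-then-print pattern suggested in this thread -- run the command, collect its output, then emit the CGI header and body together -- is not Perl-specific. A minimal sketch of the same idea as a shell CGI (not from the original thread; `$(...)` plays the role of Perl's backticks, and the script name is hypothetical):

```shell
#!/bin/sh
# uptime.cgi -- hypothetical shell CGI sketching the pattern from the
# thread: capture the command's output first, then print the CGI
# header followed by the captured body.
out=$(uptime)
printf 'Content-type: text/plain\n\n'
printf '%s\n' "$out"
```

As in the Perl version, capturing the output rather than letting the child write directly to stdout keeps the header and body ordering under the script's control.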