From: Jamie C. <jca...@we...> - 2007-02-01 01:19:50
On 31/Jan/2007 16:26 Chris Mckenzie wrote ..

> Thanks for the pointer Jamie.
>
> So I funked around with my own version of ReadParseMime() and found
> that it was quite easy to get my own version going. The attached file
> read/write needed to be handled in the ReadParseMime() function.
>
> In web-lib-funcs.pl, around line 392, there's a $in{$name} concatenation
> of each line read from the posted file. Is it because of the
> CONTENT_TYPE boundary checking? This could probably be worked around.
> Anyway, it seems to be the culprit. Once the concatenation wasn't
> factored in, memory usage was hovering around 5.8%.

Yes, it needs to check for the boundary on each line.

I'd be interested to see your code change that reduced memory use .. I
didn't see anything attached to your posting.

> I could speculate that the <STDIN> read used is reading in browser-sent
> blocks of data, and isn't controlled from the function. If a sysread
> from <STDIN> was used, we could probably control the size of data read.
> I don't necessarily like that 5.8% number; I'm not sure where the block
> size is determined (based on something in the file/data that the Perl
> read decided was a line, or something the browser blocked itself).

A read from stdin will just get a single line at a time, which shouldn't
be too large.

The only way I can see to really reduce memory use is for the caller of
ReadParseMime to specify a file that should be written to for a
particular named input. That way a large upload could be saved directly
to disk on the server.

 - Jamie

> Please let me know what you think, though I'll probably write my own
> version of ReadParseMime(). You're more than welcome to the result, but
> I have a less Perl-ish style of coding, so you'd probably end up
> re-writing it yourself anyway.
>
> I really like the way you've used callback functions; it's going to
> factor into my upload progress.
>
> Thanks!
>
> - Chris
>
> -----Original Message-----
> From: web...@li... [mailto:web...@li...] On Behalf Of Jamie Cameron
> Sent: Wednesday, January 31, 2007 6:30 PM
> To: Webmin development list
> Subject: Re: [webmin-devel] Webmin 1.320 file upload improvements question
>
> On 31/Jan/2007 14:33 Chris Mckenzie wrote ..
>
> > Hi all.
> >
> > Long time Webmin tinkerer, short time webmin-devel list troller.
> >
> > I had a question regarding the improved file upload module and
> > progress bar window:
> >
> >   * Improved handling of large file uploads so that they are no
> >     longer read into memory by the Webmin webserver. Also added a
> >     progress bar window for tracking uploads.
> >
> > Firstly, my tests have shown that when attempting a file upload to
> > miniserv, the main miniserv process may not gobble up memory for the
> > upload, but the spawned CGI that's being posted to does.
>
> Yes, that is still unfortunately true - when using the default
> ReadParseMime function for reading uploaded files, the entire content
> is read into memory.
>
> > I wrote a quick test to see if the fix would be suitable for a module
> > I'm looking at doing (a large file upload). It starts with a simple
> > form:
> >
> >   <form method="POST" action="/test/handleUpload.cgi" enctype="multipart/form-data">
> >     <input type="file" name="file" id="file" size="20">
> >   </form>
> >
> > My handleUpload.cgi is pretty straightforward too:
> >
> >   &ReadParseMime();
> >   ...
> >   my $fh = new IO::File ">$write_file";
> >   if (defined($fh)) {
> >     binmode $fh;
> >     print $fh $in{'file'};
> >     $fh->close();
> >   }
> >   ...
>
> The only way to avoid consuming memory like this would be to write your
> own replacement for ReadParseMime to write data directly to a file as
> it is read by the CGI. I haven't done this yet, but it would be at
> least theoretically possible now that the miniserv.pl process doesn't
> read the whole upload into memory too :-)
>
>  - Jamie
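The approach discussed above — checking each line against the MIME boundary, as web-lib-funcs.pl does, but writing the data straight to a file instead of concatenating it into $in{$name} — can be sketched as a small helper. This is only an illustrative sketch, not Webmin code: the function name read_part_to_file and its arguments are hypothetical, and a real replacement for ReadParseMime would also have to parse the part headers and the Content-Disposition name before reaching the body.

```perl
#!/usr/bin/perl
# Hypothetical sketch: stream one multipart body part to disk instead of
# building it up in memory. Reads a line at a time and stops when the
# boundary line is seen, so peak memory stays around one line's worth.
use strict;
use warnings;

# read_part_to_file($src_fh, $boundary, $dest_path)
# Copies lines from $src_fh into $dest_path until the boundary (or the
# final "--boundary--" terminator) is reached. Returns bytes written.
sub read_part_to_file {
    my ($src_fh, $boundary, $dest_path) = @_;
    open(my $out, '>', $dest_path) or die "open $dest_path: $!";
    binmode $out;
    my ($bytes, $pending) = (0, undef);
    while (my $line = <$src_fh>) {
        # A boundary line ends this part
        last if $line =~ /^--\Q$boundary\E(--)?\r?\n?$/;
        # The line break before the boundary belongs to the boundary,
        # not the body, so hold each line back until the next one arrives.
        if (defined $pending) {
            print $out $pending;
            $bytes += length $pending;
        }
        $pending = $line;
    }
    if (defined $pending) {
        $pending =~ s/\r?\n$//;   # strip the CRLF that precedes the boundary
        print $out $pending;
        $bytes += length $pending;
    }
    close $out;
    return $bytes;
}
```

A sysread()-based variant, along the lines Chris speculates about, would additionally let the caller control the block size rather than depending on line lengths, at the cost of having to scan a rolling buffer for boundaries that may straddle two reads.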