From: Chris M. <Chr...@en...> - 2007-02-01 00:27:04
Thanks for the pointer Jamie.
So I fooled around with my own version of ReadParseMime() and found that it
was quite easy to get my own version going. The attached file's read/write
needed to be handled inside the ReadParseMime() function itself.
In web-lib-funcs.pl, around line 392, there's a $in{$name} concatenation of
each line read from the posted file. Is that because of the CONTENT_TYPE
boundary checking? This could probably be worked around. Anyways, it seems to
be the culprit. Once the concatenation wasn't factored in, memory usage was
hovering around 5.8%.
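To illustrate, the pattern I'm talking about is roughly this (a simplified
sketch, not the actual web-lib-funcs.pl code; $boundary, $name and %in are
just stand-ins for whatever the real function uses):

# Simplified sketch of the in-memory accumulation: every line of the
# uploaded part gets appended to $in{$name}, so memory grows with the upload.
while(my $line = <STDIN>) {
        last if (index($line, $boundary) == 0);   # hit the MIME boundary
        $in{$name} .= $line;                      # the concat in question
}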
I could speculate that the <STDIN> read being used is pulling in browser-sent
blocks of data, and isn't controlled from the function. If a sysread from
<STDIN> was used, we could probably control the size of the data read. I don't
necessarily like that 5.8% number either; I'm not sure where the block size is
determined (whether it's based on something in the file/data that the Perl
read treated as a line end, or on how the browser blocked the data itself).
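Something along these lines, for example (just a rough sketch under my
assumptions: the 32 KB chunk size and $write_file are made up, and a real
version would still have to scan each chunk for the MIME boundary, including
one split across two chunks):

# Rough sketch: pull the POST body from STDIN in fixed-size chunks and
# stream it straight to disk instead of building one big scalar.
my $chunk_size = 32768;                    # assumed 32 KB read size
my $remaining = $ENV{'CONTENT_LENGTH'};
open(my $out, ">", $write_file) || die "open $write_file failed: $!";
binmode($out);
while($remaining > 0) {
        my $buf;
        my $want = $remaining < $chunk_size ? $remaining : $chunk_size;
        my $got = sysread(STDIN, $buf, $want);
        last if (!$got);                   # EOF or read error
        print $out $buf;
        $remaining -= $got;
}
close($out);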
Please let me know what you think, though I'll probably write my own version
of ReadParseMime(). You're more than welcome to the result, but I have a
less Perl style of coding, so you'd probably end up re-writing yourself
anyways.
I really like the way you've used callback functions, it's going to factor
into my upload progress.
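For what it's worth, the shape I have in mind is something like the following
(purely hypothetical; upload_progress_cb and the /tmp status file are my own
invention, not Webmin's callback API):

# Hypothetical progress callback - just the idea, not Webmin's actual API.
# The reader would call this with bytes read so far and the expected total,
# and the progress window could poll the status file it writes.
sub upload_progress_cb
{
my ($bytes_read, $total_bytes) = @_;
my $pct = $total_bytes ? int(100 * $bytes_read / $total_bytes) : 0;
if (open(my $p, ">", "/tmp/upload-progress.$$")) {
        print $p "$pct\n";
        close($p);
        }
}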
Thanks!
- Chris
-----Original Message-----
From: web...@li...
[mailto:web...@li...] On Behalf Of Jamie
Cameron
Sent: Wednesday, January 31, 2007 6:30 PM
To: Webmin development list
Subject: Re: [webmin-devel] Webmin 1.320 file upload improvements question
On 31/Jan/2007 14:33 Chris Mckenzie wrote ..
Hi all.
Long time webmin tinkerer, short time webmin-devel list troller.
I had a question regarding the improved file upload module and progress
bar window.
* Improved handling of large file uploads so that they are no longer read
into memory by Webmin webserver. Also added a progress bar window for
tracking uploads.
Firstly, my tests have shown that when attempting a file upload to miniserv,
the main miniserv process may not gobble up memory for the upload, but the
spawned CGI that's being posted to does.
Yes, that is still unfortunately true - when using the default ReadParseMime
function for reading uploaded files, the entire content is read into memory.
I wrote a quick test to see if the fix would be suitable for a module I'm
looking at doing. (large file upload) It starts with a simple form:
<form method="POST" action="/test/handleUpload.cgi"
      enctype="multipart/form-data">
<input type="file" name="file" id="file" size="20">
<input type="submit" value="Upload">
</form>
My handleUpload.cgi is pretty straightforward too:
&ReadParseMime();
...
# At this point $in{'file'} holds the entire uploaded file in memory
use IO::File;
my $fh = new IO::File ">$write_file";
if (defined($fh)) {
        binmode $fh;
        print $fh $in{'file'};
        $fh->close();
        }
...
The only way to avoid consuming memory like this would be to write your own
replacement for ReadParseMime that writes data directly to a file as it is
read by the CGI. I haven't done this yet, but it would be at least
theoretically possible now that the miniserv.pl process doesn't read the whole
upload into memory too :-)
- Jamie