From: Vasiljevic Z. <zv...@ar...> - 2007-10-23 19:51:53
On 23.10.2007, at 21:26, Vlad Seryakov wrote:

> We are building a web site which will accept uploads of very big files;
> 1-2 GB is average. The spooler works fine, but the problem comes after
> the upload. With a big number of users, all my virtual memory will be
> exhausted very quickly by mmap, and after the upload I need to copy
> those files somewhere, because they are temporary and already deleted.
>
> This makes the whole spooling thing useless on its own. I think it
> needs to be extended so I can somehow tell that those URLs need to go
> into normal files, not mmapped, without parsing the multipart data at
> all. For such big files, it is better to do it offline than on the
> web server.

Your dilemma is: how do you tell the server that it should NOT preparse the uploaded multipart data?
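
[Editor's note: a minimal sketch of the kind of mechanism being asked for here, assuming a driver parameter named `maxupload` that spools large request bodies to a temp file, and an `ns_conn contentfile` command that returns that file's path. Both names are assumptions about a later server version, not something established in this thread.]

```tcl
# Hypothetical config: request bodies larger than the threshold are
# spooled to a temp file on disk instead of being mmapped (assumed
# "maxupload" driver parameter).
ns_section "ns/server/myserver/module/nssock"
ns_param   maxupload  [expr {1024 * 1024}]   ;# 1 MB threshold

# Hypothetical handler: if the body was spooled, take the temp file
# as-is and move it aside for offline multipart parsing, so the
# server never preparses the content and the file survives the
# request's temp-file cleanup.
proc upload_handler {} {
    set tmpfile [ns_conn contentfile]   ;# assumed API: path of spooled body, or ""
    if {$tmpfile ne ""} {
        # rename is cheap (no copy) when source and target share a filesystem
        file rename -force $tmpfile /var/spool/uploads/upload-[clock clicks]
    }
    ns_return 200 text/plain "upload accepted"
}
```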