
#53 Upload of Big Files

Milestone: Next_Release
Status: closed
Owner: None
Priority: 1
Updated: 2015-04-20
Created: 2012-11-15
Creator: Armin
Private: No

Hi,
we're running Filelocker 2.6 with source code from September 2012. Filelocker runs fine.

We now wanted to upload an 8 GB file. This didn't work.
We tried IE9, Firefox (latest), Chrome (latest) and Opera (latest).

Now I would like to know whether the upload of big files (several GB) can be enabled via a setting in Python. And if so, it would be good to have this setting enabled by default.
The quota for the users was set to 10 GB, so that wasn't the problem...

Thanks for a short answer.

Best Regards

Armin

Discussion

  • Byron

    Byron - 2013-01-08

    FYI, we have run into an issue with uploading large files on Filelocker 2.4 because of an issue with ClamAV. On 32-bit systems, clamscan seems to be limited to scanning files of 2 GB or smaller; I've not tested on 64-bit. Our uploads of large files would fail because the virus scan failed. If we disable the virus scan, the upload works.
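
A possible middle ground, instead of disabling virus scanning entirely, is to skip the scan only for files that exceed the limit clamscan can handle. This is an illustrative sketch, not Filelocker's actual code; the function name and threshold constant are hypothetical:

```python
# Hypothetical workaround sketch: skip clamscan only for files above
# the ~2 GB limit observed on 32-bit systems, rather than disabling
# virus scanning for all uploads.
import os

CLAMSCAN_MAX_BYTES = 2 * 1024**3 - 1  # assumed 32-bit clamscan limit

def should_virus_scan(path):
    """Return True if the file is small enough for clamscan to handle."""
    return os.path.getsize(path) <= CLAMSCAN_MAX_BYTES
```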

     
  • Mike Burns

    Mike Burns - 2013-01-11

    I've had mixed success testing large files. I'm using Filelocker 2.4.5, RHEL 6.3 64-bit, CherryPy 3.1.2, and Apache 2.

    I was encountering an Apache timeout issue when scanning and encrypting the uploaded file takes more than 60 seconds. The file does complete the scan-and-encrypt process, so it's more of a warning than an error, but Apache logs it as an error and I didn't like seeing these messages cluttering up the log file.

    The messages Apache logged were of the form

    [error] (70007)The timeout specified has expired: proxy: error reading status line from remote server localhost
    [error] proxy: Error reading from remote server returned by /file_interface/upload,

    The fix was to set Apache's ProxyTimeout directive to some value greater than 60 seconds that's long enough to scan and encrypt the largest file you expect to upload.
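
For example, a mod_proxy configuration along these lines would raise the timeout; the 1800-second value and the backend address are illustrative, not values from this thread:

```apache
# Give the backend enough time to scan and encrypt the largest
# expected upload before Apache gives up on the proxied request.
ProxyTimeout 1800
ProxyPass        /  http://localhost:8080/
ProxyPassReverse /  http://localhost:8080/
```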

    The problem I'm still having is that with a 15 GB file (but not with a 10 GB or smaller file), Filelocker isn't freeing the disk space associated with the raw tmp file (fltmp885854.tmp). An "ls" shows that the file has been removed from the directory, but "df" shows the disk space hasn't been freed/reclaimed. "lsof" shows that python/webFilelocker2.py still has the tmp file open. I've even waited a few days and the space still isn't freed. I have to stop webFilelocker2.py to free the space.

     
    • Mike Burns

      Mike Burns - 2013-01-22

      I resolved my problem with uploading files larger than 10 GB. The largest file I've tried so far is a 40 GB file. The fix was to close() the temp file. The attachment has the diff that I'm using.
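
The fix matters because on Unix a deleted file's blocks are only reclaimed once the last open handle is closed, which is why "ls" showed the file gone while "df" did not. A minimal sketch of the pattern (function name and chunk size are illustrative, not Filelocker's actual code):

```python
# Illustrative sketch: write an upload stream to a temp file and make
# sure the handle is closed, so that deleting the file later actually
# frees the disk space instead of leaving an open deleted inode.
import tempfile

def save_upload(stream, chunk_size=1024 * 1024):
    tmp = tempfile.NamedTemporaryFile(prefix="fltmp", suffix=".tmp",
                                      delete=False)
    try:
        while True:
            chunk = stream.read(chunk_size)
            if not chunk:
                break
            tmp.write(chunk)
    finally:
        tmp.close()  # without this close(), unlinking never frees the blocks
    return tmp.name
```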

       

      Last edit: Mike Burns 2013-01-22
  • David Hutchins

    David Hutchins - 2015-04-20
    • status: open --> closed
    • assigned_to: David Hutchins
     
  • David Hutchins

    David Hutchins - 2015-04-20

    This works in 2.6. I added the code for closing the temp file. There may still be some problems behind Apache/Nginx proxies; this isn't really a problem with Filelocker, but more of a configuration/capability issue with the proxy due to the way they handle uploaded files before passing them to Filelocker.
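
For nginx specifically, settings along these lines are the usual knobs for large proxied uploads; the values are illustrative, not recommendations from this thread:

```nginx
# Illustrative nginx settings for large uploads through a proxy.
client_max_body_size 20g;        # raise the 1m default request-body limit
proxy_read_timeout   1800s;      # wait for the backend's scan/encrypt pass
proxy_request_buffering off;     # stream the upload instead of spooling it
```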

     
