[Web-ftp] Uploading/downloading recursively and others
From:
<jmf...@cn...> - 2002-07-17 19:49:57
Hi everybody, I've just installed the latest version of WebFTP today, and it's very, very good. But while testing it I realized there is no easy way to transfer large numbers of files to or from the FTP server.

These are my thoughts about a 'recursive' implementation. I remembered the trick wuftpd implements; from the WebFTP point of view it would mean downloading all the selected directories and/or files and sending the compressed archive back to the web client (not all FTP servers are wuftpd). Uploading could work similarly, but the other way around: the server receives a compressed file (.tar.gz or .zip, for instance), decompresses its content into a tmp dir, and then transfers all of it to a given directory (or the current one) on the FTP server.

Then, looking at the code, I realized that all the content WebFTP serves has to be stored in memory before it is sent. I've been developing CGIs for more than 3 years, and one thing I've learnt is that you can feed the web server (and thus the web client) through a pipe, so you can send/receive content on the fly, as long as you are able to process it on the fly, with no rewinds. Both zip and tar/gzip can write to a pipe, but whether unzip can read from one depends on how the archive was generated.

A security problem can arise if 'recursive' uploading is done with compressed archives: a member may use an absolute path, or a relative path that reaches a parent directory. The general sandbox is to work in a chroot'ed directory, but chroot is a privileged operation...

José María Fernández

-- 
José María Fernández González          e-mail: jmf...@cn...
Tlfn: (+34) 91 585 49 21               Fax: (+34) 91 585 45 06
Grupo de Diseño de Proteinas           Protein Design Group
Centro Nacional de Biotecnología       National Center of Biotechnology
C.P.: 28049                            Zip Code: 28049
Campus Universidad Autónoma. Cantoblanco, Madrid, Spain.
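The streaming idea above (feed the client through a pipe, process on the fly, no rewinds) can be sketched in Python; the function name `stream_tar` and the CGI usage are illustrative assumptions, not part of the WebFTP code:

```python
# Sketch: stream selected files/directories as a gzip'd tar archive
# straight to an output pipe, with no temporary file and no rewind.
# stream_tar is a hypothetical helper, not from WebFTP itself.
import tarfile

def stream_tar(paths, out):
    """Write paths (files or directories, recursively) to `out` as .tar.gz.

    Mode "w|gz" selects tarfile's stream mode: bytes are emitted strictly
    sequentially, so `out` may be a non-seekable pipe such as
    sys.stdout.buffer in a CGI.
    """
    with tarfile.open(fileobj=out, mode="w|gz") as tar:
        for p in paths:
            tar.add(p)  # directories are added recursively by default
```

In a CGI, one would print a `Content-Type: application/x-gzip` header and then call `stream_tar(selected_paths, sys.stdout.buffer)`; the client starts receiving data before the archive is complete.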
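The path problem on upload can be checked before extraction; this is a minimal sketch of the test the mail describes (absolute paths, parent-directory escapes), with `is_safe_member` as an illustrative name rather than anything in WebFTP:

```python
# Sketch: decide whether an archive member name is safe to extract,
# i.e. cannot escape the extraction directory via an absolute path
# or a ".." component. is_safe_member is a hypothetical helper.
import posixpath

def is_safe_member(name):
    """Return True if extracting `name` stays inside the target directory."""
    if posixpath.isabs(name) or name.startswith("\\"):
        return False  # absolute path: would land outside the tmp dir
    # normalize separators and reject any parent-directory component
    parts = name.replace("\\", "/").split("/")
    return ".." not in parts
```

Unsafe members would be skipped (or the whole upload rejected) before anything is passed on to the FTP server; this avoids needing chroot, which, as noted, is a privileged operation.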