[Ssh-sftp-perl-users] Infelicity in Net::SFTP for large remote directories
From: Phillip K. <Phi...@no...> - 2004-08-26 18:24:58
Distinguished List:

We are happy users of the Net::SFTP module, but if we could clear up one small infelicity we would be even happier. What we are seeing is that when Net::SFTP is used to list a large remote directory, the entire Perl session dies on a PARI error. The script appended below reproduces the condition with Net::SFTP version 0.08 (and, we believe, very recent versions of all the subsidiary modules; full list on request).

A long directory listing on the remote end is an essential element of reproducing the condition: I was able to fit a full year's worth of filenames on "/tmp", but only 12 weeks' worth on "/diskb2/data1/goes/sst/gridfiles". The error looks like this:

    PARI: *** the PARI stack overflows !
    current stack size: 4.0 Mbytes
    [hint] you can increase GP stack with allocatemem()

Any ideas? Many thanks,

-PBK-

Phillip Keegstra
SPS Inc for NOAA Coastwatch
WWB 601
5200 Auth Rd.
Camp Springs, MD 20746

++++++++++++++++++++++++++++++++++++++++++++++++++
I couldn't tell whether the list accepts attachments, so here is the demo script appended directly.
++++++++++++++++++++++++++++++++++++++++++++++++++

#!/usr/local/bin/perl
# Tickle a bug in Net::SFTP when listing large remote directories.

use strict;
use warnings;
use Net::SFTP;

my $ftpref = {
    '127.0.0.1' => "1\tXXXXXXXX\tZZZZZZZZZZ\t/diskb2/data1/goes/sst/gridfiles",
};

{   # Begin SCP of files.
    foreach (keys %{$ftpref}) {
        # print $_, ' ', ${$ftpref}{$_}, "\n";

        # Need to turn off utf8 flags for SFTP.
        my $ip = pack('C*', unpack('C*', $_));
        my $ff = pack('C*', unpack('C*', ${$ftpref}{$_}));

        # First element is 0 or 1 for inactive or active.
        my ($ll, $user, $pass, $path_there) = split(/\t/, $ff);
        next unless $ll;

        my $xfertime = "999999999";
        my $on       = "1";

        my $ftp = Net::SFTP->new($ip, user => $user, password => $pass);
        die 'Connect failed' unless defined $ftp;

        # Create an empty local file to push repeatedly.
        {
            my $i = system("touch dum.$$");
            die "touch returns $i creating dum.$$" if $i;
        }

        # Make a year's worth of files, 64 per day, on the remote side....
        my $t = time();
        for (my $i = 0; $i < 365; $i++) {
            my ($s1, $n1, $h1, $d1, $m1, $y1, $w1, $j1, $dst) = gmtime($t - 86400 * $i);
            for (my $j = 0; $j < 8; $j++) {
                foreach my $letter ('A', 'B', 'C', 'D', 'E', 'F', 'G', 'H') {
                    my $nomen = sprintf("%4.4d_%3.3d_3%d%s.dat",
                                        $y1 + 1900, $j1 + 1, $j, $letter);
                    $ftp->put("dum.$$", "${path_there}/$nomen");
                    print "Pushed new $nomen\n";
                }
            }
            if ($i % 7 == 6) {
                # Once per week, try to get the directory listing.
                my @list = $ftp->ls("${path_there}");
                print "\nWeek ", int($i / 7), " size of list = ", scalar(@list), "\n\n";
            }
        }
    }
}

++++++++++++++++++++++++++++++++++++++++++++++++++
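Based on the [hint] line in the error text, one thing that may be worth trying is pre-allocating a larger PARI stack before the SFTP session does any work. The sketch below is untested; it assumes the overflow originates in Math::Pari somewhere underneath Net::SFTP/Net::SSH::Perl, and that Math::Pari::allocatemem() accepts a new stack size in bytes.

#!/usr/local/bin/perl
# Untested workaround sketch: enlarge the PARI stack before opening the
# connection, following the allocatemem() hint in the error message.
# Assumptions: the overflow comes from Math::Pari underneath
# Net::SFTP/Net::SSH::Perl, and Math::Pari::allocatemem() takes bytes.
use strict;
use warnings;

use Math::Pari ();   # loaded explicitly so we can resize its stack
use Net::SFTP;

# Ask for roughly 16 MB instead of the 4 MB default reported in the error.
Math::Pari::allocatemem(16_000_000);

my $ftp = Net::SFTP->new(
    '127.0.0.1',
    user     => 'XXXXXXXX',
    password => 'ZZZZZZZZZZ',
);
die 'Connect failed' unless defined $ftp;

# List the same large directory that triggers the overflow in the demo script.
my @list = $ftp->ls('/diskb2/data1/goes/sst/gridfiles');
print scalar(@list), " entries listed\n";

Even if this works, it only enlarges the stack; it does not explain why a single ls() on a large directory exhausts the 4 MB default in the first place.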