On Mon, Jul 19, 1999 at 02:21:22PM -0800, Ben Brewer wrote:
} Hi!
}
} I've been using the code below to successfully upload all the files in a
} folder on my mac to our web server. Files over a certain length, however
} (it seems to be about 20k), are truncated when I check the new copies on
} the server. Any thoughts on why this happens and how to circumvent it
} would be most appreciated!

That's a known issue (the web server runs Solaris, right?). It's a
server/OS-specific problem, and as far as I know, no one understands why
it happens or what to do about it. All I can suggest is trying Anarchie
(Chris Nandor has a Perl front end for it, or maybe that's been rolled
into Mac::Glue) or Fetch, which seem to handle whatever is happening more
robustly, although not perfectly. I see this on MacOS more often than on
other OSes, but in my experience it's not exclusively a MacOS problem,
and I have seen others report similar symptoms on the newsgroups.

}
} Thanks!
}
} use Net::FTP;
}
} # local source for files to be uploaded
} chdir($dbDirectory)
}     or die "Could not chdir to $dbDirectory: $!";
} opendir(D, $dbDirectory)
}     or die "Could not open $dbDirectory: $!";
}
} my $ftp = Net::FTP->new($remoteHost)
}     or die "Could not connect to $remoteHost: $@";
} $ftp->login($usr, $pswd)
}     or die "Could not log in: ", $ftp->message;
} $ftp->cwd($directory)
}     or die "Could not cwd to $directory: ", $ftp->message;
}
} foreach my $fileToSend (readdir(D)) {
}     next unless -f $fileToSend;    # skip ".", "..", and subdirectories
}     $ftp->put($fileToSend, $fileToSend)
}         or die "Could not put $fileToSend: ", $ftp->message;
} }
}
} closedir(D);
} $ftp->quit();
}
} ===== Want to unsubscribe from this list?
} ===== Send mail with body "unsubscribe" to macperl-request@macperl.org

--
Paul Schinder
schinder@pobox.com
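Since nobody knows the root cause, one defensive workaround is to force binary (image) mode and then verify each stored file's size against the local size, retrying on a mismatch. The sketch below is only a suggestion, not a known fix: it reuses the variable names from Ben's script ($remoteHost, $usr, $pswd, $directory, $dbDirectory), the retry count of 3 is arbitrary, and it assumes the server answers the SIZE command (Net::FTP's size() returns undef when it doesn't).

```perl
#!/usr/bin/perl -w
# Sketch: upload in binary mode, verify the stored size, retry on mismatch.
# Variable names follow Ben's script; the retry count is an arbitrary choice.
use strict;
use Net::FTP;

my ($remoteHost, $usr, $pswd, $directory, $dbDirectory) = @ARGV;

chdir($dbDirectory) or die "Could not chdir to $dbDirectory: $!";
opendir(my $dh, ".") or die "Could not open $dbDirectory: $!";

my $ftp = Net::FTP->new($remoteHost, Timeout => 60)
    or die "Could not connect to $remoteHost: $@";
$ftp->login($usr, $pswd) or die "Login failed: ", $ftp->message;
$ftp->cwd($directory)    or die "cwd failed: ",   $ftp->message;
$ftp->binary;    # image mode; ASCII mode can alter or shorten files

foreach my $file (grep { -f $_ } readdir($dh)) {
    my $local_size = -s $file;
    my $ok = 0;
    for my $attempt (1 .. 3) {
        next unless $ftp->put($file, $file);
        # SIZE is not supported by every 1999-era server; undef means
        # "unknown", and we treat that as a failed verification.
        my $remote_size = $ftp->size($file);
        if (defined $remote_size && $remote_size == $local_size) {
            $ok = 1;
            last;
        }
        warn "$file short on attempt $attempt (local $local_size, remote ",
             (defined $remote_size ? $remote_size : "unknown"), ")\n";
        $ftp->delete($file);    # drop the partial copy before retrying
    }
    die "Gave up on $file after 3 attempts\n" unless $ok;
}

closedir($dh);
$ftp->quit();
```

Even if the retries never fully succeed, the size check at least turns a silent truncation into a loud failure you can act on.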