At 7:18 PM -0500 3/5/97, Paul J. Schinder wrote:
>Kent Cowgill <kentc@intersites.com> writes:
>}
>}Actually, I found that the problem was with large files. The files I
>}thought were slipping through -T were .pdf files, which I didn't realize
>}until yesterday were actually TEXT files. Some of the .pdf files in the
>}directories were > 1.3MB, which I suppose was causing MacPerl to run out
>}of memory.
>
>No, PDF isn't text, but it has a large plain text header that frequently
>fools Perl (both Mac and Unix). I suppose a fix to the base Perl code may
>be in order, maybe reading in a larger chunk of the beginning of the file
>to do its checking.

I'm fully aware that .pdf files aren't text files by any stretch of the
imagination; their _file type code_, however, is actually 'TEXT' (I suppose
I wasn't clear on that point in my previous message). I assume MacPerl uses
the file's type code for the -T test, not (as someone suggested) the first
few bytes of the file.

>}Yeah, the way I worked around using "$ls = `ls $file`;" made a few other
>}things in a few of the other routines break; as a result, I'm going to
>}have to do twice the work to convert paths into URLs. Heck, though. It
>}makes it twice as fun to learn :)
>
>You don't need to if you don't want to. In my MacPerl port of
>libwww-perl-5 (http://mors.gsfc.nasa.gov/MacPerl.html), there are routines
>that convert back and forth. They deal with all sorts of esoteric stuff
>(at least they're supposed to) that you may never need, but that means
>they work for the simple stuff as well. Look in particular at
>URI::URL::newlocal.

Great; I'll check it out. (Although doing it myself is still fun, since
I'm still a newbie to Perl, and especially to MacPerl :)

Thanks.

-Kent

Kent Cowgill              .---'''''---...            1 West State Street
Intersites, Inc.        'i n t e r s i t e s.        Geneva, IL 60134
                         .-'-.-'-.-'-.-'-.-'-.
                           ''''---.....---'
                        .-'-.-'-.-'-.-'-.-'-.-'-.
kentc@intersites.com                         http://www.intersites.com/
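
(A minimal sketch of the type-code check under discussion, filtering on
the HFS type code via MacPerl's toolbox call MacPerl::GetFileInfo, which
returns (creator, type) in list context. The file names, the .pdf name
test, and the 1 MB size cutoff are made up for illustration:)

    #!perl
    # Walk the files given on the command line and pick out the ones
    # whose HFS type code is 'TEXT', skipping PDFs and very large
    # files that a type-code-based -T might otherwise sweep in.
    foreach my $file (@ARGV) {
        my ($creator, $type) = MacPerl::GetFileInfo($file);
        next unless defined $type && $type eq 'TEXT';
        next if $file =~ /\.pdf$/i;       # PDFs often carry type 'TEXT'
        next if -s $file > 1_000_000;     # arbitrary size cutoff
        print "treating $file as text\n";
    }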
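
(And a short sketch of the path/URL round trip Paul mentions, using the
URI::URL interface that ships with libwww-perl-5. The sample path is made
up, the printed URL form is approximate, and local_path() as the reverse
step is an assumption based on the old URI::URL documentation:)

    #!perl
    use URI::URL;

    # Build a file: URL from a path in local (Mac) syntax; Mac paths
    # use ':' as the separator, with no leading separator on absolute
    # paths.
    my $url = URI::URL->newlocal('HD:WWW Folder:index.html');
    print $url->as_string, "\n";  # e.g. file:/HD/WWW%20Folder/index.html

    # Going the other way: local_path() should hand the path back in
    # local filesystem syntax.
    print $url->local_path, "\n";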