
[MacPerl] MacPerl File Size Limits?



I've just discovered that something that I thought was working is broken in
an odd way and I hope maybe someone here can give me a hint as to what is
going on. The short version is that I am trying to process a file that I
exported from Quark including the Xpress tags (618k) and am doing a lot of
substitution stuff in order to prepare the file for use in a hash. There are
lots of strange gotchas in this endeavor, but the one I had not counted on
was MacPerl failing to read (or at least write) most of the file. It happens
to be the A section of a dictionary, and even if I just run:

local $/;                                  # undef $/ to slurp the whole file
open(FILE1, "a.tag") or die "can't open a.tag: $!";
$my_file = <FILE1>;                        # read everything in one go
print($my_file);
close(FILE1);

The only portion that prints starts at atxxxx instead of at aaxxxx as
expected. With another file that is tagged with HTML (423k), the results are
even stranger: it begins by printing lines 8070-8190 (eof), then starts
again with lines 7706-8190 (eof). If I use normal line-by-line processing
instead, this behavior is avoided, but it takes forever to run. I am using
MacPerl 5.2.0r4 on an 8100/100, OS8.5.1 and did not see any changes when I
reinstalled MacPerl or raised the memory allocation to 5000k from the
default with vm on. I have not tried to work with these files under Perl in
mkLinux or Solaris yet.
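One thing I have not tried yet is a middle ground between slurping and
line-by-line: reading the file in fixed-size chunks with read(). This is only
a sketch (the file name, the helper name, and the 32k chunk size are
arbitrary choices, not anything MacPerl requires), and I don't know whether
it dodges whatever is truncating the slurp:

```perl
#!/usr/bin/perl
# Sketch: accumulate a file in fixed-size chunks instead of slurping it
# whole or looping line by line. Two-arg open and a bareword handle so it
# should also run on an older perl like MacPerl's.
sub read_chunked {
    my ($path) = @_;
    my ($buf, $contents) = ('', '');
    open(FH, $path) or die "can't open $path: $!";
    while (read(FH, $buf, 32768)) {    # append up to 32k per call
        $contents .= $buf;
    }
    close(FH);
    return $contents;
}
```

Then `$my_file = read_chunked("a.tag");` would replace the slurp, and the
substitutions could run on $my_file as before.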

So, is this a memory thing or what? If it is memory, is there a formula
based on file size that I might use to estimate how much to raise MacPerl's
usual allocation? If it's something else, will moving my work to mkLinux get
around it? Thanks & Merry Xmas.

Richard Gordon
Gordon Consulting & Design
Voice: 770-565-8267  Fax: 770-971-6887



***** Want to unsubscribe from this list?
***** Send mail with body "unsubscribe" to mac-perl-request@iis.ee.ethz.ch