>local $/;
>open(FILE1, "a.tag");
>$my_file = <FILE1>;
>print($my_file);
>close(FILE1);
>
>The only portion that prints is from atxxxx instead of aaxxxx as expected.
>With another file that is tagged with html (423k), the results are even
>crazier as it begins by printing from lines 8070-8190 (eof) then starts
>again with lines 7706-8190 (eof). If I use normal line by line processing
>instead, this behavior is avoided, but it takes forever to run. I am using
>MacPerl 5.2.0r4 on an 8100/100, OS 8.5.1 and did not see any changes when I
>reinstalled MacPerl or raised the memory allocation to 5000k from the
>default with vm on. I have not tried to work with these files under Perl in
>mkLinux or Solaris yet.

Do you get the same effect whether you output to a file or a window? MacPerl
can only display 32k of data in a window, so AFAIK output in excess of 32k is
handled by "popping off" the oldest output. That could explain why you are
"losing" the earliest data in the dictionary example.

Another thing to check is that the lines are delimited with a carriage return
(ASCII 13). Programs such as BBEdit can "mask" the actual line endings of a
file.

I have not had any problems working with exceptionally large files (>30MB),
although I have never worked with lines longer than about 1k.

Hope this helps,

Alex
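
P.S. A rough, untested sketch of what I mean: read the file record by record
with $/ set explicitly to a carriage return, and send the output to a file
rather than a window (the output filename below is just a placeholder):

# Untested sketch: read a.tag one record at a time, with the input
# record separator set explicitly to a carriage return (ASCII 13),
# and send the output to a file instead of a window, since the
# MacPerl window only shows the last 32k or so of output.
local $/ = "\015";                  # CR -- classic Mac OS line ending

open(FILE1, "a.tag")    or die "Can't open a.tag: $!";
open(OUT, ">a.out.txt") or die "Can't write a.out.txt: $!";  # placeholder name

while (defined($line = <FILE1>)) {
    print OUT $line;                # or do whatever per-record work you need
}

close(FILE1);
close(OUT);

If the file actually came from a Unix or DOS system, "\012" or "\015\012"
would be the record separators to try instead.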