This seems like a newbie question, but I can't find any documentation that helps me out, so at the risk of ridicule, here goes. I've got a bit of code that looks something like this:

    while (<IN>) {       # IN is an arbitrary text file
        chomp;
        print "$_\n";    # STDOUT has been redirected
    }

I know it's lame, but it's just to illustrate the point. My (possibly naive) understanding is that Perl should read one line at a time into $_ on each pass through the loop. I'm pretty sure I don't have a line-ending problem, BTW; that was the first thing I checked.

My problem seems to be with input buffering. The loop works fine for reasonably small files (I've run it successfully on a 2 MB file), but on sufficiently large files (a 45 MB file in particular) MacPerl runs out of memory and terminates the script, and the output file ends up with only a few hundred lines. It looks as though MacPerl is trying to read the whole file into a buffer asynchronously, even though the script is processing only one line at a time.

I know how to turn off output buffering, but what about input buffering? Enlarging MacPerl's memory partition would probably work, but that doesn't seem like the right solution, and it wouldn't scale to arbitrarily large files. I'm pretty sure I wouldn't have this problem with sysread (right?), but that would add quite a bit of complexity to my code, I think -- I've put a rough sketch of what I mean below my signature. Is there a simple answer that I've just overlooked?

I've tried to see whether perl on one of our unix boxes has the same problem, but it's hard to reproduce on a machine with a gigabyte of RAM and virtual memory to boot!

jay
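P.S. Here's roughly what I think the sysread version would look like, just so you can see the extra bookkeeping I'm worried about. This is an untested sketch: the 4K chunk size and the "\n" line ending are my own assumptions (not anything I've verified under MacPerl), and the filename is just a placeholder.

    # Read fixed-size chunks with sysread and split complete lines
    # out of a buffer by hand, instead of relying on <IN>.
    my $buf = '';

    open(IN, 'input.txt') or die "can't open input.txt: $!";
    # sysread(FH, SCALAR, LENGTH, OFFSET): appending at offset
    # length($buf) tacks each new chunk onto the leftover bytes.
    while (sysread(IN, $buf, 4096, length($buf))) {
        # Peel complete lines off the front of the buffer.
        while ($buf =~ s/^([^\n]*)\n//) {
            print "$1\n";    # $1 is what the chomp'd $_ used to hold
        }
    }
    print "$buf\n" if length($buf);   # final line lacking a newline
    close(IN);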