
[MacPerl] Out of memory! error...doing standard file I/O



I'm just trying to parse a log file and MacPerl is giving me "Out of
memory!" errors. I have a 12MB log file from which I want to extract the
entries that match a particular domain name into another file, so I wrote a
simple little script that reads the original file line by line (instead of
reading the whole file into an array and then going line by line, since
that would take too much RAM), checks whether the domain name exists in the
line, and writes it to another file if it does. This script works fine on
my SunOS box and my YellowDogLinux box, but not under MacPerl 5.2.0r4 with
96MB of RAM dedicated to it.  :(

(BTW - the script works fine if I only give it a 1MB log file, but
somewhere above that size it just can't seem to handle it properly...)

-- begin script --
open(INFILE,"<WebTen.log") || die "can't open file!";
open(OUTFILE,">www.kev.mb.ca.log") || die "can't open NEW file!";

# Read the log one line at a time to keep memory use low.
while ($item = <INFILE>)
{
    # Copy the line to the output file if it mentions the domain.
    if (index($item,"www.kev.mb.ca") != -1) { print OUTFILE $item; }
}

close(INFILE);
close(OUTFILE);
-- end script --

When I use the other method I know for file I/O, pre-loading the whole data
file into memory and then outputting the parsed data, I don't get any out
of memory errors (provided I give MacPerl more than 60MB of RAM), but the
data it outputs is incorrect: it just echoes the entire data file. Again,
this version of the script works fine under SunOS or YellowDogLinux, just
not under MacPerl.
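
For reference, here is a minimal sketch of that whole-file version (a
reconstruction along the lines of the script above, not my exact second
script):

-- begin sketch --
open(INFILE,"<WebTen.log") || die "can't open file!";
open(OUTFILE,">www.kev.mb.ca.log") || die "can't open NEW file!";

# Assumed approach: slurp the entire log into memory at once,
# then scan the in-memory lines for the domain.
@lines = <INFILE>;

foreach $item (@lines)
{
    if (index($item,"www.kev.mb.ca") != -1) { print OUTFILE $item; }
}

close(INFILE);
close(OUTFILE);
-- end sketch --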

HELP! :)

->  Brock

