I'm having a spot of bother with a script I'm using to build a set of HTML pages from a text file of data. The script runs fine until the input file is over half a megabyte; then Perl starts crashing, I think because it's running out of memory. On my Mac I can just increase MacPerl's partition size, but I also have to run it on my service provider's Solaris box -- and at the moment it's crashing that too.

The script builds a big data structure at the start, but it seems to get past this OK -- it's when it's building the pages in the subroutines bigAlphaSub, alphaSub and makePageSub that it seems to fall over. I use a lot of references to build the data structures, and I have to admit that I'm winging it a bit: it does the job, but I'm a bit hazy on the how!

It seemed to me that since the script reads in the whole file and gets through most of its work (it writes most of the pages before crashing), it might be possible to release the memory taken by variables that have become redundant. You can see some of my failed attempts to do this in the attached code (e.g. undef @{$entry};). Can anyone tell me if this is possible and, if so, how to go about doing it?

Thanks very much,

Alex

[Attachment: MAILservUpdateXtra.pl]

-------------------------------------------------------------
LX Solutions: Web System Development and Programming
HTML, JavaScript, Perl, Java, Visual Design
** Email: oops@cryogen.com **
** Web: http://www.wldcup.com/ **
-------------------------------------------------------------
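P.S. In case it helps anyone reading without the attachment, here's a cut-down sketch of the shape of thing I'm attempting -- not the actual script, and the names (%entries, makePageSub, DATAFILE, data.txt) are made up for illustration:

    #!/usr/bin/perl -w
    use strict;

    # Build a hash of array refs from the input file (simplified;
    # my real structure is more involved than this).
    my %entries;
    open DATAFILE, "data.txt" or die "can't open data.txt: $!";
    while (my $line = <DATAFILE>) {
        chomp $line;
        my ($key, @fields) = split /\t/, $line;
        push @{ $entries{$key} }, @fields;   # autovivifies the array ref
    }
    close DATAFILE;

    # Stub -- stands in for the real page-writing subroutine.
    sub makePageSub {
        my ($key, $fields) = @_;
        # ...open "$key.html" and print @$fields into it...
    }

    # Write one page per entry, then try to free that entry's data.
    foreach my $key (sort keys %entries) {
        makePageSub($key, $entries{$key});   # writes one HTML page
        delete $entries{$key};   # drop the hash slot and its array ref;
                                 # with no references left, Perl can
                                 # reuse the memory internally
    }

As I understand it (and I may well be wrong), delete removes the last reference so Perl can recycle the space, whereas my undef @{$entry} only empties the array and keeps the entry itself alive; and either way Perl tends to keep the freed memory for its own reuse rather than handing it back to the OS.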