On Sun, 08 Aug 1999, gene.ray@excite.com wrote:

> For those of you who didn't read my first post... I can successfully
> open a text file and display the contents on the web using a perl
> script, however when the text file becomes too large my script hangs.
> The log file is less than 8k when it does this. How can I allow a
> larger text file to be displayed? Does anyone have a suggestion?
>
> Thanks already for all the input! Here is the code... please don't
> laugh too loudly...

Your code slurps the entire file into memory at once. I don't know why
it would choke at only 8k, but there are definitely more memory-friendly
ways to do it.

> #opens rental.txt
> open (LOG, "<rental.txt") || &ErrorMessage;
> @logmessages = <LOG>;
> close (LOG);
>
> #splits log file into separate entries
> foreach $event (@logmessages) {
>     @event = split (/=/, $event);

Instead, handle your input one line at a time. I've generated tables
big enough to crash web browsers this way.

Is your input delimited by linebreaks, "=", or both? If it's only "=",
the code below will still slurp the entire file the first time around
the while loop. You can set $/ = "=", and the while loop will split on
that (and you could then dispense with the inner loop).

Cameron Ashby
+----------+

#opens rental.txt
open (LOG, "<rental.txt") || &ErrorMessage;

print "\n<html><head><title>rentals</title></head><body><table>\n";

#splits log file into separate entries by linebreak
while ($event = <LOG>) {
    @event = split (/=/, $event);
    foreach $pevent (@event) {
        @pevent = split (/,/, $pevent);
        if ( ($pevent[0] =~ /$aptsize/)    &&
             ($pevent[1] =~ /$orent/)      &&
             ($pevent[2] =~ /$zone/)       &&
             ($pevent[3] =~ /$pricerange/) &&
             ($pevent[4] =~ /$month/)      &&
             ($pevent[5] =~ /$pets/)       &&
             ($pevent[6] =~ /$smoke/) ) {
            print "<tr><td>";
            print join("</td><td>", @pevent);
            print "</td></tr>\n";
        }
    }
}
close (LOG);

print "</table></body></html>\n";

=====
Want to unsubscribe from this list?
Send mail with body "unsubscribe" to macperl-request@macperl.org