> From: Clifford Hammerschmidt [mailto:cliff@marketingtips.com]
> Sent: Wednesday, March 15, 2000 16:55
> To: Fergal Daly; merlyn@stonehenge.com; Nathan Torkington
> Cc: fwp@technofile.org
> Subject: Re: [FWP] beauty vs. brains
>
>
> True, you have to use something like (NT command line):
>
> W:\>perl -e "open(FILE,\"test.txt\");
> while(read(FILE,$buf,0xffff)) {$file .= $buf;} print $file;"
>
> because at some point the C code must be using "read" or "fgets", which
> require allocated buffers equal to their length limits.

Certainly not fgets, which looks for a newline.

...

> 'buf_read' => q{
>     open FILE, $fn;
>     while(read(FILE,$buf,0xffff)) {
>         $file .= $buf;
>     }
> }

That is bogus, because you forgot to clear $file for each iteration of the
benchmark.

> Size of test.txt: 210013
> Benchmark: timing 100 iterations of buf_read, do+local, local, read...
>   buf_read: 51 wallclock secs (30.79 usr + 17.34 sys = 48.14 CPU)

So this result is meaningless.  And the others don't have enough iterations
to be meaningful.

>   do+local:  1 wallclock secs ( 0.64 usr +  0.32 sys =  0.96 CPU)
>      local:  1 wallclock secs ( 0.51 usr +  0.25 sys =  0.76 CPU)
>       read:  1 wallclock secs ( 0.41 usr +  0.23 sys =  0.64 CPU)
>
>
> I'd stick with local, or read with a -s.

Not on the basis of what you have shown.

--
Larry Rosler
Hewlett-Packard Laboratories
http://www.hpl.hp.com/personal/Larry_Rosler/
lr@hpl.hp.com
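
P.S.  Spelled out, the idioms being compared are roughly the following.
This is a sketch only, not itself benchmarked; it assumes the file name is
in $fn as in the posted benchmark, and adds the error checks and the $file
reset that the quoted 'buf_read' entry is missing.

    my $fn = 'test.txt';    # assumed test file
    my $file;

    # 'local': localize $/ so <FILE> slurps the whole file in one go.
    {
        open FILE, $fn or die "open $fn: $!";
        local $/;                  # undef record separator
        $file = <FILE>;
        close FILE;
    }

    # 'read with a -s': a single read() sized by the file length.
    {
        open FILE, $fn or die "open $fn: $!";
        read FILE, $file, -s $fn;
        close FILE;
    }

    # 'buf_read', corrected: clear $file first, otherwise every
    # Benchmark iteration appends to the previous run's contents.
    {
        open FILE, $fn or die "open $fn: $!";
        $file = '';
        while (read FILE, $buf, 0xffff) {
            $file .= $buf;
        }
        close FILE;
    }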