At 01:11 +0000 6/17/1999, Richard K. Moore wrote:
>I'm curious as to why people haven't been advising the use of a hash (or
>two). Alan's application seems to be one with only a small amount of data.
>Is the Schwartzian Transform incredibly faster as a solution for large
>amounts of data?

I had essentially the same thought, since hashes are very fast and, until
earlier today, I had never even heard of a Schwartzian Transform.

Efficiency is a splendid thing, but short of making time run backward, I
don't know how my Perl stuff could run much faster than it does now using
extremely ordinary hash and array techniques. On Solaris, I can execute 21
successive web file maintenance routines on about 180 linked files in 3
directories in less than 40 seconds. The text is maybe 5-6 million
characters in the aggregate, and many of the routines are not trivial. When
the CGIs are run later, the return of results appears to be limited by
dialup bandwidth more than anything else.

For several years, I was MIS director of a payphone company that used
FoxPro. One of our programmers made some really nice, elegant stuff that I
admired, but her production work was always overcoded and undertested.
Another, a retired Air Force fighter pilot, took a much more functional
approach and made things that worked, showed up on time, and roared, even
if they were largely based on brute force. Who do you guess the guys who
sign the checks told me to get rid of?

Maybe the Schwartzian Transform is cool and maybe it isn't, but it doesn't
seem likely to put any more money in your pocket in either case.

Richard Gordon
--------------------
Gordon Consulting & Design
Database Design/Scripting Languages
mailto:richard@richardgordon.net
http://www.richardgordon.net
770.565.8267
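
P.S. For anyone else who hadn't run into it before today, here is roughly
what the Schwartzian Transform looks like, as I understand it. This is an
untested sketch, not production code, and it assumes the textbook case of
sorting a list of file names by size; @files and the glob pattern are just
placeholders.

    use strict;
    use warnings;

    my @files = glob('*');   # hypothetical input list

    # Plain sort: calls -s (a stat) on both files at every comparison.
    my @slow = sort { -s $a <=> -s $b } @files;

    # Schwartzian Transform: stat each file exactly once, then sort on
    # the cached size.
    my @fast =
        map  { $_->[0] }                # 3. unwrap the original name
        sort { $a->[1] <=> $b->[1] }    # 2. compare the cached sizes
        map  { [ $_, -s $_ ] }          # 1. pair each name with its size
        @files;

The whole trick is avoiding recomputation of an expensive sort key; for a
small amount of data, the plain sort is probably fine.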