I'm planning to use MacPerl to help format HTML exported from a newspaper front end system (i.e. a big database of text files). I'd like the group's opinion on the best way to set up the conversion system for the sake of reliability and ease of future changes.

Most of the time a person would tell the front end system to export a file to a folder on a 'Transporter' Mac, making a generic HTML conversion along the way. The MacPerl system (living on the 'Transporter' Mac) would then customize the HTML and move the page (via a normal Finder copy, not FTP) to the right place on the web server (also a Mac). About a dozen specific pages on the web site would be updated this way.

Note -- some of the exporting might instead be driven by an automated database query and could involve more than one file at a time.

The decisions I have to make (that I'm aware of) are:

1) Use folder actions to trigger the scripts, or a cron-style utility?

2) Export all the files to one folder -- changing the names to reflect where on our web site they should go -- or export to separate folders based on destination and let MacPerl handle the renaming?

3) Use specialized scripts to handle each individual page, or one master script that makes decisions based on which folder the file came from or what its name is? (There's a rough sketch of the master-script idea in the P.S. below.)

By the way, am I right in assuming that serving CGI-generated pages from a Mac server (8.1, AppleShare 6.0) would be much slower than serving pages generated on a second Mac and then copied over, given that the pages would normally change only once a day?

David Rouse
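P.S. In case it makes question 3) clearer, here is the rough shape of the master script I have in mind -- one dispatch table keyed on the exported file's name decides where each page goes. All the folder paths, page names, and the example substitution below are made-up placeholders, not our real setup:

#!perl -w
use strict;
use File::Copy;

my $drop   = 'Transporter HD:HTML Drop:';   # where the front end exports to
my $server = 'Web Server:Web Pages:';       # server volume mounted over AppleShare

# exported file name => destination relative to the server folder
my %destination = (
    'front.html'   => 'index.html',
    'sports.html'  => 'sports:index.html',
    'weather.html' => 'weather:today.html',
);

opendir(DROP, $drop) or die "Can't open $drop: $!";
foreach my $file (readdir DROP) {
    next unless exists $destination{$file};

    my $source = $drop . $file;
    my $target = $server . $destination{$file};

    # read the generic HTML, customize it, write it back
    unless (open(IN, $source))  { warn "Can't read $source: $!";  next; }
    my $html = join '', <IN>;
    close IN;
    $html =~ s/<BODY>/<BODY BGCOLOR="#FFFFFF">/i;   # stand-in for the real customizing
    unless (open(OUT, ">$source")) { warn "Can't write $source: $!"; next; }
    print OUT $html;
    close OUT;

    # plain file copy to the mounted server volume, then clear the drop folder
    copy($source, $target) or warn "Copy of $file failed: $!";
    unlink $source if -e $target;
}
closedir DROP;

The same script could just as easily be pointed at several drop folders (one per destination) if we end up going with separate folders instead of special file names.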