At 02:03 PM 12/10/97 -0600, Mark Manning/Muniz Eng. wrote:
>According to Jon S. Jaques:
>>
>> Hi All,
>>
>> I had to dredge this message out from a few days
>> ago... hope I don't back-track too much, but I
>> have a MacPerl [semi] newbie question.
>
>Ok, how about this:
>
>File::Find
>way you perform a two-pass file transfer. The first one
>simply scans the directories for subdirectories and places
>them all into a directory array.

I don't think, actually, that File::Find is for me. I have found that if you build your array *very* carefully, inspection will reveal a perfect pattern: folder, file, file, folder, file, etc. In other words, each folder gets listed first, and then come its files. Only one test is required: is it a folder or not? This simplicity also eliminates worries about heterogeneous network environments making funky files, creator types, and other such weirdities.

Walk this array in reverse if you want to delete the files, because then you will have the files first, followed by the folder that contains them, starting from the deepest levels. Avoid 'reverse sort', however, if you can, because it is slow and prone to memory problems. Just use a 'for' loop and iterate from the top down.

>Then you sort the
>directory array. Then you do each of the non-directory
>files for each directory and create the subdirectories. In
>this way you only have to keep a small number of files in
>memory at one time.
>This eliminates the memory problem.

Yes, and no... Doesn't the sort mechanism make temporary duplicates in memory? (As mentioned above.) Granted, the number of folders would be much, much smaller, but sorting on the PB is a huge time-waster.

>The time to transfer should stay about the same though. No

The extra passes via MacPerl are just too expensive, time-wise. I still build the variables, but just for reporting purposes. (While those arrays are being built, the files are actually copying.)
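For what it's worth, here is a minimal sketch of the folder-first idea, with hypothetical names (`scan`, `delete_all`) of my own choosing. It builds the array so each folder appears before its files, then deletes by walking the same array backwards with an index loop rather than 'reverse sort'. (Path separator is '/' here; on classic Mac OS it would be ':'.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Push each folder onto the list *before* its files, then recurse,
# so the array comes out as: folder, file, file, folder, file, ...
sub scan {
    my ($dir, $entries) = @_;
    push @$entries, $dir;                  # the folder itself, first
    opendir(my $dh, $dir) or return;
    my @names = sort readdir($dh);
    closedir($dh);
    my @subdirs;
    for my $name (@names) {
        next if $name eq '.' or $name eq '..';
        my $path = "$dir/$name";
        if (-d $path) {
            push @subdirs, $path;          # descend after the files
        } else {
            push @$entries, $path;         # files right after their folder
        }
    }
    scan($_, $entries) for @subdirs;
}

# Delete by iterating from the end with an index loop -- no reversed
# copy of the array in memory. Files come before their enclosing
# folder, deepest levels first, so rmdir always sees an empty folder.
# Only one test is needed: folder or not.
sub delete_all {
    my ($entries) = @_;
    for (my $i = $#{$entries}; $i >= 0; $i--) {
        -d $entries->[$i] ? rmdir $entries->[$i]
                          : unlink $entries->[$i];
    }
}
```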
>way around that unless you can equip your PBs with an
>Ethernet device (which IS available for PBs which just use
>the Appletalk connection).

We do have PCMCIA Ethernet devices on all PBs but one, a 540c, which uses an external device... Do you think that the external devices are faster? I can test that idea on the next update of the websites...

>We have such a set-up here at
>work. We go through a Gator box which pumps information
>across the line at about 7-10K per second.

My thought was to use an HTTP server on an NT or Linux box, which can serve up the entire list of 9-10,000 files in a minute or two, maybe less, given the available clock cycles. Using print statements in Perl, the output would begin immediately, and thus would be pushed down the Mac's little thro... ahem(!) pipeline at a much greater rate (theoretically) than if *it* were handling all of the I/O. Maybe even achieving the full 10Mb Ethernet rate? Not likely, I suppose. <g>

Off topic, I know, but what "is" the Gator box? A router? Or serial?

>Fastest I've
>seen (no other network traffic in the lab) was around 36K
>per second. Gator boxes are available from Cayman, Inc.

Probably serial...

More reporting... Good stuff! Using MacPerl::Ask, Answer, and Pick, I've created a single entry point into the actual workings of the script, so that every time the script is run, the work is timed and then reported to the user.

>"++" to some counter. Like so:
>
>  $numFiles = 0;
>  while( <we have files> ){
>    $numFiles++;
>    .
>    .
>    .
>  }

Hope somebody finds this useful! When the script I'm working on is a little more mature, I want to put it up on my PerlRing site at http://www.grovehillsys.com/scripting, but the fact that it can delete files as well (in one of its 'modes') makes it a little dangerous.

--Jon S. Jaques
--The Grove Hill Pages
--http://www.grovehillsys.com
--jjaques@grovehillsys.com
--jjaques@illuminata.com
--jjaques@almus.com

***** Want to unsubscribe from this list?
***** Send mail with body "unsubscribe" to mac-perl-request@iis.ee.ethz.ch