Funny... I was kinda looking forward to an automated system. There are over 500 pages on our site and over 1,000 images in 42 directories. I would just like a program to read in all the files with .htm, .pl, .cgi, .html, and .shtml extensions; I can parse them for the .gif and .jpg references, and then I could make a list that we could work from to manually delete the unused files. The problem is I don't know how to make a listing of all the files on my site and cycle through each filename in order to parse them using Perl. The parsing is easy!

thanks

jann

>Remove all images from the directory.
>Fire up the pages.
>Replace only used images.
>
>>>> JANN @ SMTP (J. Linder) {jann@JANN.COM} 06/25/97 03:21pm >>>>
>On the same but different line, how about a good utility to check that the
>only files on a site are actually files that are referenced in your site's
>scripts/html files?
>
>ie: lots and lots of images in our images directory... but I want to remove
>the ones that are not in use.
>
>Thanks
>
>jann
>
>>
>>We have a lot of links in our web site (http://www.comsource.net). Is there
>>a GOOD utility that would check for broken links?
>>
>><!-- It is so hot today that nobody is out to cut trees. -->
>>
>>David Phan <dphan@comsource.net>
>>Systems/Internet Programmer
>>ComSource, Inc. http://www.comsource.net/
>>____________________________________________

***** Want to unsubscribe from this list? *****
Send mail with body "unsubscribe" to mac-perl-request@iis.ee.ethz.ch
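[Editor's note: a minimal sketch of the directory walk the poster asks about, using the standard File::Find module. The sub name `find_image_refs`, the regexes, and the site-root argument are illustrative assumptions, not anything from the original mail; the "parsing" is deliberately naive, as the poster says that part is the easy bit.]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Walk $root recursively, slurp every .htm/.html/.shtml/.pl/.cgi file,
# and return a hash ref mapping each referenced .gif/.jpg name to the
# number of times it appears.
sub find_image_refs {
    my ($root) = @_;
    my %used;
    find(sub {
        return unless /\.(?:s?html?|pl|cgi)\z/i;   # page extensions from the mail
        open my $fh, '<', $File::Find::name
            or do { warn "$File::Find::name: $!\n"; return };
        my $text = do { local $/; <$fh> };         # slurp the whole file
        close $fh;
        # Grab anything that looks like a path ending in .gif or .jpg
        $used{lc $1}++ while $text =~ /([\w.\/-]+\.(?:gif|jpe?g))/gi;
    }, $root);
    return \%used;
}

# Print the list of images the pages actually reference; anything sitting
# in the image directories but absent from this list is a candidate for
# manual deletion, as the poster describes.
my $refs = find_image_refs(shift || '.');
print "$_\n" for sort keys %$refs;
```

Run it as `perl used_images.pl /path/to/site/root` and diff the output against a listing of the image directories.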