
Re: [MacPerl] Re: QUESTION: Finding missing links -Reply



thanks...will try the perl 'bot.

I have libwww installed so it should (and i reiterate *should*) be no problem.

Thanks--will let you know later tonite.

jann
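(For anyone following along: the list-and-parse approach described in the quoted mail below doesn't actually need a robot at all if the whole site lives on local disk — File::Find can walk the tree in one pass. Here's a rough, untested sketch; the extension lists and the filename regexp are my guesses, matching is by basename only, and under MacPerl the path separators will differ.)

```perl
#!/usr/bin/perl
# Sketch only: list image files that no page on the site appears to reference.
use strict;
use warnings;
use File::Find;

# Walk $root once: remember every .gif/.jpg file, and every image name
# mentioned inside a .htm/.html/.shtml/.pl/.cgi file.  Matching is by
# basename only, so two same-named images in different dirs will collide.
sub find_unused_images {
    my ($root) = @_;
    my (%images, %referenced);
    find(sub {
        return unless -f $_;
        if ($_ =~ /\.(?:gif|jpe?g)$/i) {
            $images{lc $_} = $File::Find::name;
        }
        elsif ($_ =~ /\.(?:s?html?|pl|cgi)$/i) {
            open my $fh, '<', $_ or return;
            local $/;                              # slurp the whole file
            my $text = <$fh>;
            # grab anything that looks like an image filename
            $referenced{lc $1}++
                while $text =~ /([\w.-]+\.(?:gif|jpe?g))/gi;
        }
    }, $root);
    return map { $images{$_} } grep { !$referenced{$_} } sort keys %images;
}

# Usage: perl unused.pl /path/to/site-root
print "unused: $_\n" for find_unused_images(shift @ARGV) if @ARGV;
```

That only produces a candidate list for manual review — pages built dynamically or images named in variables will show up as false "unused" hits.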


>"J. Linder" <jann@jann.com> writes:
>}funny...I was kinda looking forward to an automated system.  There are over
>}500 pages on our site and over 1000 images in 42 directories.
>}
>}i would just like a program to read in all the .htm, .pl, .cgi, .html,
>}.shtml extensions and i can parse them for the .gif and .jpg's
>}
>}then i could make a list that we could work from to manually delete the
>}unused files.
>
>Sounds like you want a site checking robot.  I know of two possibilities.
>The first is a shareware program, Big Brother,
><http://pauillac.inria.fr/~fpottier/mac-soft.html.en>.  I use it as a
>simple link checker for my bookmark files, but it has the ability to
>recurse, so it should be possible to use it as a site checker.  The second
>is Checkbot, <http://dutifp.twi.tudelft.nl:8000/checkbot/>, which is
>written in Perl and requires libwww-perl-5.  I have no idea whether it
>works under MacPerl or not, but with enough effort you should be able to
>get it working.  At the very least it should show you the way to write your
>own robot.
>
>}
>}The problem is i dont know how to make a listing of all the files on my
>}site and cycle through each filename in order to parse them using perl.
>}The parsing is easy!
>}
>}thanks
>}
>}jann
>}
>}
>
>---
>Paul J. Schinder
>NASA Goddard Space Flight Center
>Code 693, Greenbelt, MD 20771
>schinder@pjstoaster.pg.md.us


**
professional site:  http://www.jann.com/
personal site:       http://www.jann.com/jann/
email:                  mailto:webmaster@jann.com
**


