
Re: [MacPerl] Web crawler in Perl?



On Tue, 22 Feb 2000 15:22:30 -0500, someone going by the name of "Jonathan
W. Daniel" <Daniel.71@osu.edu> chiseled the following on a stone wall:

>I'd love to trade ideas on the approach for building this type of search
>engine, but again can't give you advice on where to find existing robots.

I'm not an expert by any means, but the nifty book _Perl Cookbook_ (from
O'Reilly) has a bunch of web-related items in its Chapter 20, including
some pointers on creating robots.  The main gist is "use the module
LWP::RobotUA".  (This module reads and obeys robots.txt files, which are a
way for a site to control where automatic robots can go.  Of course,
obeying them is voluntary, but recommended.)  I haven't used that
particular facet of LWP, but I have used some of the more basic LWP
functionality in MacPerl.
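
From the module's documentation, though, a minimal robot looks roughly like
the sketch below.  The robot name, contact address, and URL are just
placeholders you'd swap for your own:

  #!/usr/bin/perl -w
  use strict;
  use LWP::RobotUA;
  use HTTP::Request;

  # Identify the robot and give a contact address (both placeholders here).
  my $ua = LWP::RobotUA->new('MyBot/0.1', 'you@example.com');
  $ua->delay(1);   # wait at least 1 minute between requests to the same host

  # request() fetches the page and honors the site's robots.txt for you.
  my $request  = HTTP::Request->new(GET => 'http://www.example.com/');
  my $response = $ua->request($request);

  if ($response->is_success) {
      print $response->content;
  } else {
      print "Couldn't fetch it: ", $response->status_line, "\n";
  }

As I understand it, if robots.txt forbids that URL you just get back a 403
error instead of the page, so a crawler can skip it and move on.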

There are a lot of good recipes (small programs with commentary) in the
Cookbook.  Anyone who wants to surf the web with MacPerl would do well
to get it, although it may not be worth buying for that reason alone.  But
there's a huge variety of stuff in there.


--
 Tim Bailey / tim@moonrise.org |\/  "We can't all be heroes because somebody
 telestro@earthlink.net        |\/  has to sit on the curb and clap as they
 http://www.moonrise.org       |                    go by."
 --(314 days to the millennium!)--               -- Will Rogers




# ===== Want to unsubscribe from this list?
# ===== Send mail with body "unsubscribe" to macperl-request@macperl.org