
Re: [MacPerl] get urls



At 11:54 AM +0900 4/13/99, Peter Hartmann wrote:
>At 8:01 PM -0400 4/12/99, Paul J. Schinder wrote:
>>On Mon, Apr 12, 1999 at 07:28:53PM -0500, Rick wrote:
>>} Is there a way to have Netscape or any web browser grab a web site from
>>} the net and either send it to MacPerl or save it as source?  If I can
>>} send it to MacPerl, how would I read it line by line?  Like this?:
>>
>>Don't use a browser, use MacPerl:
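
A minimal sketch of the LWP route Paul is pointing at, assuming libwww-perl
is installed under MacPerl; the URL is only a placeholder, and the split on
newlines stands in for whatever per-line processing you actually want:

#!perl -w
use strict;
use LWP::Simple;

my $url  = 'http://www.example.com/';   # placeholder URL
my $html = get($url);                   # fetch the whole page into one scalar
die "Couldn't fetch $url\n" unless defined $html;

# To read it "line by line", split the source on newlines:
foreach my $line (split /\n/, $html) {
    print $line, "\n";                  # ...or match/parse each line here
}

LWP::UserAgent gives finer control (timeouts, proxies, headers), but
LWP::Simple's get() is enough to pull the source into a variable.
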
>
>Yes, as long as you want to download only a few URLs. In my
>experience it is much faster (albeit more complicated) to call
>Netscape since it can do multiple simultaneous connections.
>
>I am using a short script to automatically download 80 to 100 pages
>of a German newspaper every morning. It's not very well commented,
>but it has worked very well for about a year now and should be good
>enough as a starting point. Interested parties can mail me off-list.
>
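
Peter's point about simultaneous connections is the main thing calling
Netscape buys you. If you would rather stay inside Perl, the LWP::Parallel
module on CPAN registers several requests and services them together with
non-blocking I/O. I can't vouch for how it behaves under MacPerl, so treat
this as a sketch only; the URLs are placeholders:

#!perl -w
use strict;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my @urls = ('http://www.example.com/a.html',
            'http://www.example.com/b.html');

my $pua = LWP::Parallel::UserAgent->new();

foreach my $url (@urls) {
    # register() queues a request; it returns a response object only on error
    if ($pua->register(HTTP::Request->new(GET => $url))) {
        warn "Couldn't register $url\n";
    }
}

# wait() blocks until all registered requests have been serviced
my $entries = $pua->wait();
foreach my $key (keys %$entries) {
    my $res = $entries->{$key}->response;
    print $res->request->url, ": ", $res->code, "\n";
}
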
Yet another option is WebMiner from Brookline (http://www.brooklinesw.com).
WebMiner will download URLs and extract specific items from a page. I've
used it with AppleScript, and as soon as I get back up to speed with
Chris's "Glue.pm" (Mac::Glue) I'm going to try it with MacPerl.
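
For anyone curious what the Mac::Glue version might look like, here is a
rough sketch. The glue name and the event name are assumptions on my part:
you would first build a glue for WebMiner (or Netscape) with the gluemac
droplet that ships with Mac::Glue, then use whatever events its dictionary
really defines.

#!perl -w
use strict;
use Mac::Glue;

# ASSUMPTIONS: 'WebMiner' is the name of a glue already built with the
# gluemac droplet, and get_url() is a stand-in for whatever GetURL-style
# event the application's dictionary actually exposes.
my $app = Mac::Glue->new('WebMiner');
my $url = 'http://www.example.com/';    # placeholder URL

$app->get_url($url);                    # hypothetical event name
warn $^E if $^E;                        # Mac::Glue reports Apple Event errors in $^E

Expect to adjust both lines marked hypothetical once the real dictionary is
in front of you.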

Dave




===== Want to unsubscribe from this list?
===== Send mail with body "unsubscribe" to macperl-request@macperl.org