
[MacPerl] Bug in Mac libwww-perl-5.10 LWP::RobotUA?



Hi,

Has anyone noticed the following strange behaviour when using the MacPerl
port of libwww-perl-5.10's LWP::RobotUA?

It seems that any request generates a 503 error (Service Unavailable),
regardless of whether a robots.txt file exists or not (on our own servers
the robots.txt request does show up in the logs). The 503 error, which
appears to be spurious, of course halts the process. I would expect the
original request, and any acceptable requests (after parsing the
robots.txt file, if one exists), to be processed normally. BTW: I tried
fetching files from numerous servers (including the evil empire, which
does have a robots.txt file) with the same result.

It could be a misconfiguration on my end but I think I have everything
configured properly and most of the other modules/classes seem to work OK
after some brief initial testing.

Sample script included below. Any pointers/fixes appreciated.

Thanks,

John


use strict;
use LWP::RobotUA;
use HTTP::Request;

my $url = 'http://www.foobar.com/';

my $ua = LWP::RobotUA->new('My_RobotUA', 'my@email.com');

my $request  = HTTP::Request->new('GET', $url);
my $response = $ua->request($request);

if ($response->is_success) {
    print $response->content;
} else {
    print $response->error_as_HTML;
}
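One thing worth checking: LWP::RobotUA rate-limits itself, and a 503 it
generates locally (rather than one sent by the server) should carry a
Retry-After header saying how long it wanted to wait. This is a hedged
diagnostic sketch, not a confirmed fix -- the delay setting and the
header check are assumptions about how this version of RobotUA behaves:

use strict;
use LWP::RobotUA;
use HTTP::Request;

my $ua = LWP::RobotUA->new('My_RobotUA', 'my@email.com');

# Assumption: delay() is in minutes and 0 disables the wait between
# requests to the same server (the default is reportedly 1 minute).
$ua->delay(0);

my $response = $ua->request(
    HTTP::Request->new('GET', 'http://www.foobar.com/'));

# A RobotUA-generated 503 should have a Retry-After header; a genuine
# server 503 usually will not.
print 'Status: ', $response->code, ' ', $response->message, "\n";
print 'Retry-After: ', ($response->header('Retry-After') || '(none)'), "\n";

If the 503 disappears with delay(0), the bug is probably in how the
MacPerl port handles the sleep/wait between requests rather than in the
robots.txt parsing itself.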



.....................................................................
John Irving
HyperConnect Online Communications
john@hyperconnect.com
http://www.hyperconnect.com/



***** Want to unsubscribe from this list?
***** Send mail with body "unsubscribe" to mac-perl-request@iis.ee.ethz.ch