
[MacPerl-WebCGI] Re: how to prevent time-out with slow outputstream



I said:

>>the .acgi suffix.  I say unfortunately in that this ALSO signals to
>>QPQ that your CGI can handle a second request while the first is
>>still processing, which is very much not true and results in anything
>>from a dropped request to a crashed server.

Ian Cabell replied:

>I've heard this before, and I recall something on the MacPerl site 
>mentioning it too.  However, I haven't seen all that much about it 
>and I was wondering if there was some more info somewhere.

I wrote a column on MacPerl CGI programming for Perl Month and it 
discusses this issue some:

http://www.perlmonth.com/columns/mac_perl/mac_perl.html?issue=11

>Maybe a work-around with a master program directing others when they're free?

This can be necessary, and it does work if you use the .cgi 
extension (in which case ONE CGI on the server may be fairly stable, 
but MORE THAN ONE MacPerl CGI has the same problems you encounter if 
you use the .acgi extension).  In that case, one master CGI that 
calls subroutines is the solution.
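
Here is a minimal sketch of such a master CGI.  It assumes a single 
"action" CGI parameter and hypothetical handlers do_form and 
do_search; the names are illustrative only, not taken from any 
existing script:

#!perl
# master.cgi -- one CGI that dispatches to subroutines, instead of
# relying on several separate MacPerl CGIs running at once.
use strict;
use CGI;

my $q = new CGI;

my %handler = (
    'form'   => \&do_form,
    'search' => \&do_search,
);

my $action = $q->param('action') || 'form';
my $code   = $handler{$action}   || \&do_form;

print $q->header('text/html');
&$code($q);

sub do_form {
    print qq{<form method="get">\n},
          qq{<input type="hidden" name="action" value="search">\n},
          qq{<input name="query"> <input type="submit">\n},
          qq{</form>\n};
}

sub do_search {
    my($q) = @_;
    my $query = $q->param('query');
    $query = '' unless defined $query;
    print "<p>You searched for: $query</p>\n";
}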

If you use the .acgi extension, e.g. to enable server push, there is 
no solution of which I am aware.

>Or a "status" command that would return whether or not another 
>program is running?  Or other ways that I haven't thought of? =)

I haven't thought of them either :-)

>Unix Perl, of course, doesn't have this problem.  Is the problem 
>solely because of the Mac's limited threading?

As I understand it, the MacOS *does* support threading, but Perl 
5.004 (on which MacPerl is based) does not.  The problem is that 
MacOS does not support multiple copies of the same program running at 
the same time.  Under Unix, the second hit to a CGI simply launches a 
second copy of Perl to run the second script, and all is well, EVEN 
if you are using Perl 5.004.  Under MacOS, the second hit tries to 
run the second script with the first copy of MacPerl, and that fails 
(as I believe it would with Unix, should Unix be so foolish as to 
try such a thing).

Two questions I have for the group:

1) It is my understanding that if the threading built into the newer 
versions of Perl is used to handle multiple CGI hits, this would not 
be an automatic fix to the problem, but would require modification of 
the script.  Is this true?  (See the sketch after these two questions.)

2) It is my understanding that Windows is like Unix in that it can 
run multiple copies of the same program.  Is this a non-problem for 
Windows the way it is for Unix; are we Mac users alone in our travail?
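
For what it is worth, here is a minimal sketch of why I doubt 
threading is an automatic fix: with the thread API that later Perls 
ship (the ithreads "threads" module, used here purely as a stand-in), 
the script itself has to create and join the threads, so an existing 
CGI would need rewriting.  The handler and the queued "hits" are 
hypothetical:

use strict;
use threads;

# Hypothetical per-request handler: each incoming CGI hit would have
# to be handed to a thread explicitly -- nothing does this for you.
sub handle_hit {
    my($hit) = @_;
    # ... run the per-request code here ...
    return "finished $hit";
}

my @hits    = ('first hit', 'second hit');   # stand-ins for queued requests
my @workers = map { threads->create(\&handle_hit, $_) } @hits;
print $_->join, "\n" for @workers;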

>Or is this something that could be fixed in a future version of MacPerl?

s/Or/And/

Yes, the next version of MacPerl will support threading AFAIK, though 
see above about the (possible) necessity of rewriting scripts.

Two final comments:

1) I will be posting another message giving a step-by-step 
description of how to do server push with a MacPerl CGI, and in it 
you will note that the different APIs of Unix and MacPerl mean that 
MacPerl CGI scripts need to be written differently to enable this 
(a generic sketch of the Unix-flavored pattern appears after these 
comments).  This is a totally different problem from the inability 
to run multiple simultaneous scripts (CGIs).

2) It is my totally unsubstantiated impression that nobody is overly 
excited about fixing this problem, since the next version of MacOS, 
OS X, *is* Unix (or at least in the Unix family), and therefore this 
problem vanishes.
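
For reference, here is the generic Unix-flavored server push pattern 
(a multipart/x-mixed-replace response with output buffering turned 
off).  This is only a sketch of the general idea, not the MacPerl 
recipe promised above; depending on the server it may also need to 
run as an nph- script so the script controls the whole response, and 
the loop body is made up:

#!/usr/bin/perl
# Generic server push: one long multipart response whose parts
# replace each other in the browser as they arrive.
use strict;

$| = 1;   # unbuffer STDOUT so each part goes out immediately

my $boundary = 'PUSH-BOUNDARY';
print "Content-type: multipart/x-mixed-replace;boundary=$boundary\r\n\r\n";

foreach my $step (1 .. 5) {
    print "--$boundary\r\n";
    print "Content-type: text/html\r\n\r\n";
    print "<html><body><p>Step $step of 5 complete.</p></body></html>\r\n";
    sleep 2;   # stand-in for the slow work that was causing the time-out
}
print "--$boundary--\r\n";   # closing boundary ends the push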

-David-




David Steffen, Ph.D.
President, Biomedical Computing, Inc. <http://www.biomedcomp.com/>
Phone: (713) 610-9770 FAX: (713) 610-9769 E-mail: steffen@biomedcomp.com
