
Re: [MacPerl] Porting Un*x perl scripts to MacPerl for CGI



Kent Cowgill writes:
|Most notable are those using such things I take for granted in
|Linux, such as `ls`

Yes, any time a script calls external programs, whether through
backticks or the system function, you'll have problems porting it. Even
if you have ToolServer, chances are the output format of whatever you
invoke as ls will differ from what the script expects. About all you
can do is call a replacement sub conditionally:

@x = $^O eq 'MacOS' ? doLs() : `ls`;

and have doLs use opendir/readdir to build the sort of output you get
from ls on Linux.
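
A minimal sketch of such a doLs, assuming the script only needs plain
one-name-per-line ls output (no -l details); the sub name, handle name,
and default-directory handling here are just one way to do it:

sub doLs {
    my $dir = shift;
    # ':' is the current directory under Mac OS, '.' elsewhere
    $dir = ($^O eq 'MacOS') ? ':' : '.' unless defined $dir;
    opendir(DIR, $dir) or die "Can't read $dir: $!";
    my @names = sort grep { $_ ne '.' && $_ ne '..' } readdir(DIR);
    closedir(DIR);
    return map { "$_\n" } @names;   # backticks give newline-ended lines too
}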

|and working with directories

I'm not sure what you want to do, but opendir/readdir/closedir, chdir,
rmdir, mkdir all work fine in MacPerl.
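
For instance, something like this runs unchanged under both MacPerl and
Unix perl (the directory name is just an example):

mkdir("scratch", 0777)  or die "mkdir failed: $!";
opendir(DIR, "scratch") or die "opendir failed: $!";
my @entries = readdir(DIR);
closedir(DIR);
rmdir("scratch")        or die "rmdir failed: $!";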

|foreach $FILE (@FILES)
|{
|    $count = 0;
|    open(FILE,"$FILE");
|    @LINES = <FILE>;
|    close(FILE);
|[- irrelevant code snipped -]
|}
|... something seems to break somewhere.

About the only things that could go wrong with the above are that $FILE
can't be opened, or that the file is too large to slurp into memory in
one go. You can catch the first problem by checking open's return value
(which you should always be doing, whether things appear to be working
or not). You can avoid the second by processing the file a line at a
time, or check for it by wrapping the read in an eval and checking $@.
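
One way to do both, sketched with the same names as the snippet above:

foreach $FILE (@FILES)
{
    $count = 0;
    unless (open(FILE, $FILE))    # check the return value
    {
        warn "Can't open $FILE: $!\n";
        next;
    }
    while (<FILE>)                # one line at a time, no big @LINES array
    {
        $count++;
        # ... per-line work here ...
    }
    close(FILE);
}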

This probably won't help much, but I have a similar sort of script that
processes about 4,000 files, 2/3 of them binary, the rest text, and it
works just fine under both MacPerl and Unix perl. The only consideration
I give to operating system is to use ':' for the path separator on the
Mac and '/' on Unix.
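
That boils down to something like this (the variable names and the
process() call are just illustrative):

$sep = ($^O eq 'MacOS') ? ':' : '/';
foreach $name (@names)
{
    process("$dir$sep$name");
}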

Brian