I've written a script that uses this module to fork up to (but no more
than) ten processes at a time to download files with WWW::Mechanize.
While this works when I run the source with ActiveState Perl, it doesn't
work once I package the script with pp.
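For reference, I package it with something like the following (the file
names here are placeholders, not the real ones):

pp -o downloader.exe downloader.pl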
I've attached a screenshot of the GPF that is thrown, along with the error
signature and the appcompat.txt file that Windows generates.
The packaged EXE forks the first set of 10 downloads and then GPFs as soon
as that batch finishes, whereas the Perl source script forks 10 downloads
and keeps forking until the entire set of URLs in PROGIDS has been
exhausted.
Is there an inherent problem with using PAR-packaged EXEs with
Parallel::ForkManager?
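A stripped-down test along these lines (just a sketch, nothing from my
actual script) might help narrow down whether the problem is the fork
emulation inside the packaged EXE rather than anything WWW::Mechanize does:

use strict;
use warnings;
use Parallel::ForkManager;

# Sketch: fork 25 children, at most 10 at a time, and just sleep in each.
# If a pp-packaged EXE of this also GPFs after the first batch, the
# problem is on the fork/PAR side rather than WWW::Mechanize.
my $pm = Parallel::ForkManager->new(10);
for my $n ( 1 .. 25 ) {
    $pm->start and next;    # parent queues the next child
    print "child $n starting\n";
    sleep 1;
    print "child $n done\n";
    $pm->finish;            # child exits
}
$pm->wait_all_children;
print "all children finished\n";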
The script does something similar to the following:
use strict;
use warnings;
use WWW::Mechanize;
use Parallel::ForkManager;

$| = 1;

my $source = shift @ARGV;
my $output = shift @ARGV;
unlink $output;

my $mech = WWW::Mechanize->new( timeout => 600, autocheck => 0 );

# Count the URLs first so the fork limit never exceeds the number of lines.
open PROGIDS, '<', $source or die "Cannot open $source: $!";
my $count = 0;
$count++ while <PROGIDS>;
close PROGIDS;

my $pm = Parallel::ForkManager->new( $count > 10 ? 10 : $count );

open PROGIDS, '<', $source or die "Cannot open $source: $!";
print "Downloading:\n";
while ( my $url = <PROGIDS> ) {
    chomp $url;
    my $pid = $pm->start and next;    # parent: move on to the next URL

    # Child process from here down.
    ( my $file = $url ) =~ s{.*/}{};  # placeholder name derived from the URL
    my $filename = "$file.xls";
    my $downloadattempts = 0;

    STARTDOWNLOAD:
    $mech->get( $url, ':content_file' => $filename );
    unless ( $mech->status == 200 ) {
        $downloadattempts++;
        goto STARTDOWNLOAD if $downloadattempts < 3;    # retry up to three times
    }
    $pm->finish;                      # every child ends with finish()
}
close PROGIDS;
$pm->wait_all_children;
print "Done.\n";