Subject: | Memory Leak |
When trying to parse multiple XML files in one run I came across a
memory leak. See the attached example script and data file. There seems
to be no way to stop Perl from holding on to the memory; given enough
data this script will die from memory bloat. I sought the wisdom of the
Perl Monks and they were at a loss; they said to file a bug ticket.
So here it is :)
I may be doing this all wrong, in which case I would love to be corrected
so I can get my program to work as I need it to. I am trying to parse
about 85 files of 30+ MB each in a single run, with no success.
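For reference, here is the low-memory pattern from the XML::Twig
documentation that I understood I should be following (a minimal sketch
only; the filename is a placeholder):
#!/usr/bin/perl
use strict;
use warnings;
use XML::Twig;
# Handle each BIOG element as it is parsed, then purge so the tree
# built so far is released instead of accumulating.
my $t = XML::Twig->new(
    twig_handlers => {
        BIOG => sub {
            my ( $twig, $elt ) = @_;
            print "Saw BIOG ", $elt->field('BIOG_NBR'), "\n";
            $twig->purge;    # drop everything parsed so far
        },
    },
);
$t->parsefile('some_file.xml');    # placeholder filename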
Subject: | test_10000000_10099999.xml |
Message body is not shown because it is too large.
Subject: | memory_example.pl |
#!/usr/bin/perl -w
use strict;
use XML::Twig;
my $inFile = 'test_10000000_10099999.xml';
if ( ! $inFile ) {
    die("No input file specified");
}
if ( ! -f $inFile ) {
    die("file '$inFile' not found");
}
for my $x ( 0 .. 5 ) {
    print "Doing $x\n";
    process($inFile);
}
exit 0;
#
# Process the file
#
sub process {
    $inFile =~ /data_(\d+)_(\d+)/;    # leftover from an earlier version; captures are unused
    my $t = XML::Twig->new( twig_handlers => { BIOG => \&BIOG } );
    $t->parsefile($inFile);
    $t->purge();
    $t->dispose();    # Try to free memory, but it does not work...
}
#
# Handler for the BIOG element we trigger on
#
sub BIOG {
    my ( $t, $BIOG ) = @_;
    if ( ! checkBiog( $BIOG->field('BIOG_NBR') ) ) {
        print "Missing " . $BIOG->field('BIOG_NBR') . "\n";
    }
    $t->purge();      # drop everything parsed so far
    $t->dispose();    # also try freeing inside the handler -- no effect either
    return 1;
}
#
# Check database for ID
#
sub checkBiog {
    my ($biog) = @_;
    return 1;    # stub for this example; always reports the ID as found
}
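To watch the growth per iteration, a throwaway helper like the following
can be dropped into the script and called after each process($inFile)
(a sketch only, assuming Linux and its /proc filesystem; report_rss is a
made-up name, not part of the attached script):
#
# Print the process's resident set size (VmRSS) from /proc/self/status
#
sub report_rss {
    open my $fh, '<', '/proc/self/status' or return;
    while ( my $line = <$fh> ) {
        print $line if $line =~ /^VmRSS:/;
    }
    close $fh;
}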