Subject: Segmentation fault parsing fairly large XML file
Hi, I found that when processing an XML file a few megabytes in size, perl
would segfault every time. I have reduced it to a minimal test case for
XML::Parser and one for XML::Parser::Expat. However, the segfault
depends on the size of the XML document, so you may need to tweak the
$count variable in the test cases to reproduce it on your system. My
system is Fedora 15 x86_64 and I am using the distribution-supplied
packages for perl and XML::Parser.
First, here is a program that uses XML::Parser and segfaults every time:
use strict;
use warnings;
use XML::Parser;

# Start handler: chain a fresh hashref onto the one created for the
# previous start tag
sub f {
    my $p = shift;
    my $x = $p->{x};
    my $e = {};
    $p->{x} = $e;
    $x->{y} = $e if $x;
    $p->{z} = $e if not $x;
}

my $count = 18400; # sufficient to give a segfault on my machine; if not, try increasing
my $c = '<Y>' x $count;
my $n = XML::Parser->new;
$n->setHandlers(Start => \&f);
$n->parse($c);
This test case can be reduced further to some code that calls
XML::Parser::Expat directly:
use strict;
use warnings;
require XML::Parser::Expat;

# Same Start handler as in the first test case
sub f {
    my $p = shift;
    my $x = $p->{x};
    my $e = {};
    $p->{x} = $e;
    $x->{y} = $e if $x;
    $p->{z} = $e if not $x;
}

my $count = 18400; # sufficient to give a segfault on my machine; if not, try increasing
my $c = '<Y>' x $count;
my $expat = XML::Parser::Expat->new;
$expat->setHandlers(Start => \&f);
$expat->parse($c);
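One way to confirm from the shell that the process died from a segfault rather than an ordinary parse error is to check the exit status; a process killed by SIGSEGV (signal 11 on Linux) is reported with status 128 plus the signal number. This is a minimal sketch, and the filename crash.pl is just a placeholder for wherever you save the test case:

```shell
# Hypothetical: save either test case as crash.pl, then run:
#   perl crash.pl; echo $?
# A segmentation fault shows up as exit status 128 + 11:
echo $((128 + 11))   # prints 139
```

An uncaught parse error from XML::Parser, by contrast, would die with a message and a small nonzero exit status rather than 139.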
Please try out these test cases and let me know whether you are able to
reproduce the crash.