One more note, from the IO::File class:
# There is no need for DESTROY to do anything, because when the
# last reference to an IO object is gone, Perl automatically
# closes its associated files (if any).
So, if I get this right, as long as the associated IO handle is not yet out
of scope, the file isn't closed. As a matter of fact, this seems to be
the problem:
I tried with this code:
> for (my $i = 0; $i < 50000; $i++)
> {
> my $in = IO::File->new('query.xml', "r");
> my $out = IO::File->new('/dev/null', "w");
>
> # my $proc = Midcom::Plucene::RequestProcessor->new($in, $out);
> # $proc->Process();
> }
With those two lines left commented out, everything works fine. Once they
are uncommented, references get leaked somehow.
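One way to rule out destructor timing would be to close the handles
explicitly at the end of every iteration instead of relying on DESTROY.
A minimal sketch of the loop above (the processor call is elided, as in
my test):

```perl
use strict;
use warnings;
use IO::File;

for my $i (0 .. 49_999) {
    my $in  = IO::File->new('query.xml', 'r') or die "open query.xml: $!";
    my $out = IO::File->new('/dev/null', 'w') or die "open /dev/null: $!";

    # ... hand $in/$out to the processor here ...

    # Close explicitly rather than waiting for DESTROY. If descriptors
    # still accumulate, something else must be holding a reference to
    # the handles (or to objects that contain them).
    $in->close;
    $out->close;
}
```

If the descriptor count stays flat with explicit close() but grows
without it, that would point at a lingering reference keeping the
handles alive.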
The RequestProcessor class releases all Plucene handles upon completion,
but when I look into /proc after the "Too many open files" error, I see
that not only query.xml and /dev/null are open multiple times, but also
the files from the Plucene index.
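For reference, this is roughly how I count the open descriptors of the
running process on Linux (a small sketch reading /proc/self/fd; the
limit you hit is whatever ulimit -n reports):

```perl
use strict;
use warnings;

# Each entry in /proc/self/fd is one open descriptor of this process.
opendir my $dh, '/proc/self/fd' or die "opendir /proc/self/fd: $!";
my $nfds = grep { !/^\.\.?$/ } readdir $dh;
closedir $dh;
print "open descriptors: $nfds\n";
```

Running that inside the loop every few thousand iterations makes the
leak visible long before the error occurs.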
I admit that I might have errors in my own code, so I'd appreciate any
suggestions. The tarball attached to this comment holds the code I'm
currently using to trigger the error.
Try executing ./bench.pl; you'll need LibXML and XML::Writer installed.