Subject: | Archive::Tar memory usage during extraction |
Date: | Thu, 19 May 2011 20:02:53 +0200 (CEST) |
To: | bug-archive-tar [...] rt.cpan.org |
From: | "Misi CPAN Mladoniczky" <miz_cpan [...] rrr.se> |
Hi,
I am having a problem getting Archive::Tar to extract an archive without
consuming too much memory.
The starting file is called orig.tar.gz and is 847M in size.
The first entry in the tar is another file, called firstfile.tar.gz,
which is 585M in size.
I have used both of these syntaxes:
- Archive::Tar->extract_archive($file, 1)
- Archive::Tar->iter($file, 1)
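For completeness, the iterator version follows the usage pattern from the
Archive::Tar documentation (file name as in the archive described above):

```perl
use Archive::Tar;

# Stream through the archive one entry at a time; the second
# argument (1) tells Archive::Tar the input is gzip-compressed.
my $next = Archive::Tar->iter("orig.tar.gz", 1);
while (my $entry = $next->()) {
    $entry->extract() or warn "Failed to extract " . $entry->full_path;
}
```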
In both cases, the memory consumption goes up to 4G during the extraction
of the first large file (585M), and then drops back to around 2.5G.
This just seems to be too much. I would think that it should never really
need to exceed 585M by that much, as long as no random access is done.
What do you think about this?
I can live with Archive::Tar being a little slow, but there must be a way
to improve its memory consumption.
On my 64-bit system, the perl process actually dies immediately after the
extraction.
On a 32-bit system I expect the problem would be even worse.
Best Regards - Misi