
This queue is for tickets about the Archive-Tar CPAN distribution.

Report information
The Basics
Id: 43609
Status: resolved
Priority: 0
Queue: Archive-Tar

People
Owner: Nobody in particular
Requestors: S.M.Rispens [...] sron.nl
Cc:
AdminCc:

Bug Information
Severity: (no value)
Broken in: (no value)
Fixed in: (no value)



Subject: Memory problem with Archive::Tar
Date: Wed, 25 Feb 2009 15:16:41 +0100
To: <bug-Archive-Tar [...] rt.cpan.org>
From: "Sietse Rispens" <S.M.Rispens [...] sron.nl>
Dear developers,

In the perl scripts that I use I see a problem with memory management in Archive::Tar. It seems that memory is not returned after usage. See also http://www.perlmonks.org/?node_id=739495.

Please find attached a script with which I can generate the error "Out of memory!" on my PC. The script creates an archive with a text file. After that it iterates to create, extract and clear an Archive::Tar object for the archive. The script takes two arguments: the first is the size of the text file to be created in megabytes, the second the maximum number of iterations. If the size of the text file is chosen correctly, it generates the "Out of memory!" error on my PC after a few iterations (80 MB: no error; 90 MB: in the 4th iteration; 100 MB: in the 2nd iteration; 150 MB and more: in the first iteration, so the choice is rather precise).

About my system: it is Debian lenny, with 1 GB memory.
>> perl -v
This is perl, v5.10.0 built for i486-linux-gnu-thread-multi

>> uname -a
Linux horizon 2.6.22.6 #1 SMP Wed Oct 10 15:10:49 CEST 2007 i686 GNU/Linux

>> perl -mArchive::Tar -e 'print $Archive::Tar::VERSION."\n"'
1.38

On a different machine with Archive::Tar version 1.44 I saw the same problem.

I hope this information is enough to identify the problem.

Best regards,
Sietse Rispens

----
Sietse Rispens
SRON Netherlands Institute for Space Research
Earth Oriented Science Division
s.m.rispens@sron.nl
+31 (0)30 253 5646
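The attached script itself is not inlined in this ticket, so the following is only a sketch of the reproduction described above — create a text file, archive it once, then repeatedly create, extract and clear an Archive::Tar object. File names and default argument values are invented for illustration:

```perl
#!/usr/bin/perl
# Sketch of the reported reproduction; the original attachment is not
# shown, so names and defaults here are assumptions.
use strict;
use warnings;
use Archive::Tar;

my ( $size_mb, $max_iter ) = @ARGV;
$size_mb  ||= 1;    # size of the text file in megabytes
$max_iter ||= 3;    # maximum number of iterations

# Create a text file of roughly $size_mb megabytes.
my $file = 'payload.txt';
open my $fh, '>', $file or die "open $file: $!";
print {$fh} ( 'x' x ( 1024 * 1024 ) ) for 1 .. $size_mb;
close $fh;

# Pack it into a gzip-compressed archive once.
my $archive = 'payload.tar.gz';
Archive::Tar->create_archive( $archive, 1, $file )
    or die "create_archive failed: " . Archive::Tar->error;

# Repeatedly create, extract and clear an Archive::Tar object.
# With a leaky IO::Compress stack, memory grows each time around
# and eventually triggers "Out of memory!".
for my $i ( 1 .. $max_iter ) {
    my $tar = Archive::Tar->new($archive)
        or die "new failed: " . Archive::Tar->error;
    $tar->extract;
    $tar->clear;
    print "iteration $i done\n";
}
```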

Message body is not shown because sender requested not to inline it.

Yep, I too am running into this issue, though going down a rev to 1.42 seemed to allow it to run. However, memory would only tend to grow when accessing bzip2 archives greater than 5MB.
Hi,

On Wed Feb 25 09:17:52 2009, S.M.Rispens@sron.nl wrote:
> Dear developers,
>
> In the perl scripts that I use I see a problem with memory management
> in Archive::Tar. It seems that memory is not returned after usage.
> See also http://www.perlmonks.org/?node_id=739495.
>
> Please find attached a script with which I can generate the error
> "Out of memory!" on my PC. The script creates an archive with a
> text file. After that it iterates to create, extract and clear an
> Archive::Tar object for the archive. The script takes two
> arguments, the first is the size of the textfile to be created in
> megabytes, the second the maximum number of iterations. If the size
> of the text file is chosen correctly, it generates on my PC the
> "Out of memory!" error after a few iterations (80MB no error, 90 MB
> in the 4th iteration, 100MB in the 2nd iteration, 150MB and more in
> the first iteration, so the choice is rather precise).
>
> About my system, it is debian lenny, with 1 GB memory.
>
> >> perl -v
> This is perl, v5.10.0 built for i486-linux-gnu-thread-multi
> >> uname -a
> Linux horizon 2.6.22.6 #1 SMP Wed Oct 10 15:10:49 CEST 2007 i686
> GNU/Linux
> >> perl -mArchive::Tar -e 'print $Archive::Tar::VERSION."\n"'
> 1.38
>
> On a different machine with Archive::Tar version 1.44 I saw the same
> problem.
>
> I hope this information is enough to identify the problem.
First of all, thanks for reporting. Unfortunately, I can't reproduce this issue, so we'll have to dig a bit further.

I have tried the script in the following configurations:

* 10 MB file, 50 iterations
* 100 MB file, 20 iterations

I have also tried a tar file in 3 different configurations:

* -czf (gzip)
* -cjf (bzip2)
* -cf (tar)

All showed consistent (stable) memory usage after the first iteration, which is to be expected.

I have added 2 diagnostics, to be sure of my findings. Before the first iteration I added:

    print "Before iteration: " . `ps -p $$ -o %mem`;

At the end of the last for loop, after an iteration is completed, I added:

    print "Memory consumption after iteration: $i\n" . `ps -p $$ -o %mem`;

My box is an OS X box, so the flags to ps may differ for you; the goal is to get the memory usage of the current process.

HOWEVER, both you and the other poster tested with gzip/bzip2 compressed archives, and that may just depend on which versions of the IO::Compress::* modules you have installed. (Note also that all the IO::Compress modules should be of the same version.) The version I am using is 2.015. Perhaps you are using a different version, or can reproduce the memory leak using just the IO::Compress modules?

If you have any more feedback, I'm happy to look into this further.

Cheers,
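For anyone wanting to replicate this measurement, the two inline print statements above can be wrapped into a small helper. This is only a sketch, not the maintainer's actual test script; the ps flags shown are the form used above and may need adjusting on other platforms:

```perl
# Sketch of the memory diagnostic described in the reply above.
use strict;
use warnings;

# Print the current process's memory usage as reported by ps.
# $$ is the PID of the running perl process; %mem is ps's
# memory-usage column (flags may differ between platforms).
sub report_mem {
    my ($label) = @_;
    my $mem = `ps -p $$ -o %mem`;
    print "$label:\n$mem";
}

report_mem('Before iteration');
for my $i ( 1 .. 3 ) {
    # ... create, extract and clear the Archive::Tar object here ...
    report_mem("Memory consumption after iteration $i");
}
```

If the reported percentage climbs with every iteration instead of stabilising after the first, the process is holding on to memory between iterations.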
Subject: Re: [rt.cpan.org #43609] Memory problem with Archive::Tar
Date: Fri, 06 Mar 2009 13:43:17 +0100
To: <bug-Archive-Tar [...] rt.cpan.org>
From: "Sietse Rispens" <S.M.Rispens [...] sron.nl>
Hi Jos,

You are right about the IO::Compress modules. I upgraded the IO::Compress::Gzip module from 2.012 to 2.015 and the memory problem has now been solved.

Thanks a lot!

Sietse
Subject: Re: [rt.cpan.org #43609] Memory problem with Archive::Tar
Date: Fri, 6 Mar 2009 13:48:03 +0100
To: bug-Archive-Tar [...] rt.cpan.org
From: "Jos I. Boumans" <jos [...] dwim.org>
On Mar 6, 2009, at 1:43 PM, Sietse Rispens via RT wrote:
> You are right about the IO::Compress modules. I upgraded the
> IO::Compress::Gzip module from 2.012 to 2.015 and the memory
> problem has now been solved.
Glad to help. I can't find a hint in the CHANGES file that mentions this, but I'll update the required IO::Compress::* modules to 2.015 just in case.

Cheers,

--
Jos Boumans

'Real programmers use "cat > a.out"'