
This queue is for tickets about the Future-AsyncAwait CPAN distribution.

Report information
The Basics
Id: 128222
Status: resolved
Priority: 0/
Queue: Future-AsyncAwait

People
Owner: Nobody in particular
Requestors: leonerd-cpan [...] leonerd.org.uk
Cc:
AdminCc:

Bug Information
Severity: (no value)
Broken in: 0.19
Fixed in: 0.21



Subject: Fix some memory leaks
I know it leaks a lot right now, because if nothing else I never free() the SuspendedState structure. I'm sure there's lots of others. -- Paul Evans
$ debugperl5.26.2 Build && PERL_MEM_LOG=3sc debugperl5.26.2 -Mblib t/31destroy.t 3>perlmem.log
Building Future-AsyncAwait
PP_AWAIT resuming
SvREFCNT(awaiting) == 2
PP_AWAIT resuming
SvREFCNT(returning) == 1
SvREFCNT(SvRV(f)=0x55cba46df708) == 2 before PP_LEAVEASYNC
SvREFCNT(SvRV(f)=0x55cba46df708) == 2 after PP_LEAVEASYNC
ok 1 - $f1 should have one ref
ok 2 - Not destroyed before $fret->get
ok 3 - Not destroyed after $fret->get
not ok 4 - $fret should have one ref
# Failed test '$fret should have one ref'
# at t/31destroy.t line 40.
# expected 1 references, found 2
# SV address is 0x55cba46df708
# Writing heap dump to t/31destroy-4.pmat
not ok 5 - Destroyed by dropping $fret
# Failed test 'Destroyed by dropping $fret'
# at t/31destroy.t line 44.
1..5
# Looks like you failed 2 tests of 5.

$ pmat t/31destroy-4.pmat
Perl memory dumpfile from perl 5.26.2 non-threaded
Heap contains 21507 objects
pmat> show 0x55cba46df708
HASH(2)=Future at 0x55cba46df708 with refcount 2
  size 168 bytes
  blessed as Future
  2 values (use 'values' command to show)
pmat> identify 0x55cba46df708
HASH(2)=Future at 0x55cba46df708 is:
├─(via RV) the lexical $fret at depth 1 of CODE() at 0x55cba41155c0=main_cv, which is:
│ └─the main code
└─the referrant of REF() at 0x55cba4250158, which is:
  └─not found

$ grep -A5 55cba4250158 perlmem.log
...
new_SV: sv.c:5680:Perl_newSV: 55cba4250158
 frame[0]: 0x55cba30a1e11 Perl_newSV at /home/leo/perl5/perlbrew/perls/perl-5.26.2/bin/perl5.26.2 +0x153e11
 frame[1]: 0x55cba3086447 Perl_leave_adjust_stacks at /home/leo/perl5/perlbrew/perls/perl-5.26.2/bin/perl5.26.2 +0x138447
 frame[2]: 0x55cba3086b4a Perl_pp_leavesub at /home/leo/perl5/perlbrew/perls/perl-5.26.2/bin/perl5.26.2 +0x138b4a
...

which isn't a lot of help.

-- Paul Evans
Various improvements and fixes in attached patch. Not 100% sure this fixes all the bugs, but certainly some of them are done now. -- Paul Evans
Subject: rt128222.patch
=== modified file 'lib/Future/AsyncAwait.xs'
--- lib/Future/AsyncAwait.xs	2019-01-14 22:49:51 +0000
+++ lib/Future/AsyncAwait.xs	2019-01-15 18:49:30 +0000
@@ -239,12 +239,14 @@
  * them
  */
 
+static int magic_free(pTHX_ SV *sv, MAGIC *mg);
+
 static MGVTBL vtbl = {
   NULL, /* get */
   NULL, /* set */
   NULL, /* len */
   NULL, /* clear */
-  NULL, /* free - TODO?? */
+  magic_free,
 };
 
 #ifdef HAVE_DMD_HELPER
@@ -287,6 +289,27 @@
   return ret;
 }
 
+static int magic_free(pTHX_ SV *sv, MAGIC *mg)
+{
+  SuspendedState *state = mg->mg_ptr;
+
+  if(state->awaiting_future) {
+    fprintf(stderr, "TODO: free ->awaiting_future\n");
+  }
+
+  if(state->returning_future) {
+    fprintf(stderr, "TODO: free ->returning_future\n");
+  }
+
+  if(state->frames) {
+    fprintf(stderr, "TODO: free ->frames\n");
+  }
+
+  Safefree(state);
+
+  return 1;
+}
+
 #define suspend_block(frame, cx)  MY_suspend_block(aTHX_ frame, cx)
 static void MY_suspend_block(pTHX_ SuspendedFrame *frame, PERL_CONTEXT *cx)
 {
@@ -1083,6 +1106,7 @@
 
     Safefree(frame);
   }
+  state->frames = NULL;
 }
 
 /*
@@ -1109,7 +1133,7 @@
       *(svp+1) = *svp;
     }
     if(f)
-      *bottom = SvREFCNT_inc(f);
+      *bottom = f;
     else
       *bottom = sv_2mortal(newSVpvn("Future", 6));
     SP++;
@@ -1137,7 +1161,7 @@
 
     PUSHMARK(SP);
     if(f)
-      PUSHs(SvREFCNT_inc(f));
+      PUSHs(f);
     else
       mPUSHp("Future", 6);
     mPUSHs(newSVsv(failure));
@@ -1258,8 +1282,10 @@
   SV **oldsp = PL_stack_base + cx->blk_oldsp;
   SuspendedState *state = suspendedstate_get(find_runcv(0));
 
-  if(state && state->returning_future)
+  if(state && state->returning_future) {
     f = state->returning_future;
+    state->returning_future = NULL;
+  }
 
   if(SvTRUE(ERRSV)) {
     ret = future_fail(f, ERRSV);
@@ -1272,9 +1298,12 @@
   while(SP > oldsp)
     POPs;
 
-  PUSHs(ret);
+  mPUSHs(ret);
   PUTBACK;
 
+  if(f)
+    SvREFCNT_dec(f);
+
   return PL_op->op_next;
 }
 
@@ -1379,6 +1408,7 @@
     state = suspendedstate_new(curcv);
 
     TRACEPRINT("  SUSPEND cloned CV->%p\n", curcv);
+    sv_2mortal(curcv);
   }
   else {
     TRACEPRINT("  SUSPEND reuse CV\n");
On Tue Jan 15 13:55:02 2019, PEVANS wrote:
> Not 100% sure this fixes all the bugs, but certainly some of them are
> done now.
Yeah; not totally fixed but rather much improved:

At v0.19:

t/32memory-growth.t .... 1/?
# Failed test 'async/await does not grow memory'
# at t/32memory-growth.t line 27.
# Lost 12529664 bytes of memory over 10000 calls, average of 1252.97 per call

At v0.19+bzr202:

t/32memory-growth.t .... 1/?
# Failed test 'async/await does not grow memory'
# at t/32memory-growth.t line 27.
# Lost 3514368 bytes of memory over 10000 calls, average of 351.44 per call

t/32memory-growth-1-after.pmat
  Kind    Count      (blessed)  Bytes                  (blessed)
  ARRAY   11346(+1)  2          742.7 KiB(+64 bytes)   136 bytes
  CODE    1428                  178.5 KiB
  GLOB    2106                  312.6 KiB
  HASH    10276(+1)  10018(+1)  1.8 MiB(+192 bytes)    1.6 MiB(+168 bytes)
  INVLIST 49                    41.1 KiB
  IO      12         12         1.9 KiB                1.9 KiB
  PAD     939                   197.0 KiB
  REF     20605(+2)             482.9 KiB(+48 bytes)
  REGEXP  190        3          41.6 KiB               672 bytes
  SCALAR  24780(+3)             872.1 KiB(+231 bytes)
  STASH   152                   127.4 KiB
  ------- ---------  ---------  ---------------------  -------------------
  (total) 71883(+7)  10035(+1)  4.8 MiB(+535 bytes)    1.6 MiB(+168 bytes)

I'd be willing to bet that single extra blessed HASH is a Future instance, and therefore from there the entire tree of leaked SVs ought to be fairly easy to find.

-- Paul Evans
On Tue Jan 15 14:06:56 2019, PEVANS wrote:
> I'd be willing to bet that single extra blessed HASH is a Future
> instance, and therefore from there the entire tree of leaked SVs ought
> to be fairly easy to find.
$ pmat-leakreport -1 t/32memory-growth-1{,-after,-after}.pmat
LEAK[2] SCALAR(UV) at 0x562bcb3e7630
LEAK[2] HASH(0) at 0x562bcb3e7b10
LEAK[2] UNDEF() at 0x562bcb3e7798
LEAK[2] HASH(2) at 0x562bcb3e7528
LEAK[2] ARRAY(0) at 0x562bcb3e7438
LEAK[2] SCALAR(UV) at 0x562bcb3e7750
LEAK[2] REF() at 0x562bcb3e76f0
LEAK[2] UNDEF() at 0x562bcb3e7738
LEAK[2] SCALAR(UV,PV) at 0x562bcaf40210

Of these:
pmat> show 0x562bcb3e7528
HASH(2)=Future at 0x562bcb3e7528 with refcount 1
  size 168 bytes
  blessed as Future
  2 values (use 'values' command to show)
pmat> values 0x562bcb3e7528
  {ready}  SCALAR(UV) at 0x562bcb3e7750 = 1
  {result} REF() at 0x562bcb3e76f0 => ARRAY(0) at 0x562bcb3e7438
pmat> identify 0x562bcb3e7528
HASH(2)=Future at 0x562bcb3e7528 is:
└─the referrant of REF() at 0x562bcb3e74f8, which is:
  └─not found
pmat> identify 0x562bcb3e7630
SCALAR(UV) at 0x562bcb3e7630 is:
└─value {Future} of HASH(8) at 0x562bca866098=stashcache, which is:
  └─the stash cache
pmat> identify 0x562bcb3e7b10
HASH(0) at 0x562bcb3e7b10 is:
└─the lexical %params at depth 1 of CODE(PP) at 0x562bcaa3fb00, which is:
  └─the symbol '&Test2::Event::Bail::new'
pmat> identify 0x562bcb3e7798
UNDEF() at 0x562bcb3e7798 is:
└─the lexical $f1 at depth 1 of CODE(PP) at 0x562bca8b0140, which is:
  ├─(via RV) the lexical $code at depth 1 of CODE(PP) at 0x562bcaebc4d8, which is:
  │ └─the symbol '&Test::MemoryGrowth::no_growth'
  ├─the lexical & at depth 1 of CODE() at 0x562bca866530=main_cv, which is:
  │ ├─the main code
  │ └─the scope of CODE(PP) at 0x562bcaf70df0, which is:
  │   └─the symbol '&main::identity'
  └─the referrant of REF() at 0x562bca889fb0, which is:
    └─not found
pmat> identify 0x562bcb3e7738
UNDEF() at 0x562bcb3e7738 is:
└─the lexical $aborted at depth 1 of CODE(PP) at 0x562bcabb40f8, which is:
  └─the symbol '&Test::Builder::context'
pmat> show 0x562bcaf40210
SCALAR(UV,PV) at 0x562bcaf40210 with refcount 1
  size 65 bytes
  UV=139665280621664    # 0x7f065b705c60
  PV="139665280621664"
  PVLEN 15
pmat> identify 0x562bcaf40210
SCALAR(UV,PV) at 0x562bcaf40210 is:
└─not found

0x7f065b705c60 is plausibly the &vtbl address and -might- be leakage from my new DMD_helper.h

Thus the structure is:

  HASH(2) at 0x562bcb3e7528        -- the leaking Future itself
    SCALAR(UV) at 0x562bcb3e7750
    REF() at 0x562bcb3e76f0
      ARRAY(0) at 0x562bcb3e7438

  SCALAR(UV) at 0x562bcb3e7630     -- stash cache; probably internals?
  HASH(0) at 0x562bcb3e7b10        -- %params lexical; probably test noise
  UNDEF() at 0x562bcb3e7798        -- $f1; might be test noise?
  UNDEF() at 0x562bcb3e7738        -- $aborted; more test noise
  SCALAR(UV,PV) at 0x562bcaf40210  -- DMD_helper.h leakage?

But very much the focus here is in finding that leaking REF() at 0x562bcb3e74f8

-- Paul Evans
On Tue Jan 15 14:06:56 2019, PEVANS wrote:
> At v0.19+bzr202:
>
> t/32memory-growth.t .... 1/?
> # Failed test 'async/await does not grow memory'
> # at t/32memory-growth.t line 27.
> # Lost 3514368 bytes of memory over 10000 calls, average of 351.44 per
> call
v0.19+bzr203:

# Failed test 'async/await does not grow memory'
# at t/32memory-growth.t line 28.
# Lost 270336 bytes of memory over 10000 calls, average of 27.03 per call
# Writing heap dump to t/32memory-growth-1.pmat

Seems to be a single SV now:

$ pmat-diff t/32memory-growth-1{,-after}.pmat
  A                                B
  UNDEF() at 0x55a216fab828        SCALAR(UV,PV) at 0x55a21756c780
                                   SCALAR(UV,PV) at 0x55a2175b2b08
  HASH(0) at 0x55a217622d08        SCALAR(UV,PV) at 0x55a217654238
  HASH(0) at 0x55a21765a478        UNDEF() at 0x55a217790490
---
3 unique to A
4 unique to B
21941 common

leo@shy:~/src/perl/Future-AsyncAwait [bzr]
$ pmat t/32memory-growth-1-after.pmat
Perl memory dumpfile from perl 5.28.1 threaded
Heap contains 21945 objects
pmat> show 0x55a21756c780
SCALAR(UV,PV) at 0x55a21756c780 with refcount 1
  size 65 bytes
  UV=140521109671008
  PV="140521109671008"
  PVLEN 15
pmat> show 0x55a2175b2b08
SCALAR(UV,PV) at 0x55a2175b2b08 with refcount 1
  size 65 bytes
  UV=140521109676256
  PV="140521109676256"
  PVLEN 15

A UV+PV that appears to contain an address of some sort. More PERL_MEM_LOG required...

-- Paul Evans
On Tue Jan 15 20:16:40 2019, PEVANS wrote:
> A UV+PV that appears to contain an address of some sort. More
> PERL_MEM_LOG required...
Huh.

$ pmat-diff A1.pmat A2.pmat
  A                                B
  SCALAR(UV,PV) at 0x555555886ac8  SCALAR(UV,PV) at 0x555555886b88
  SCALAR(UV,PV) at 0x5555558929d0  SCALAR(UV,PV) at 0x555555e7eb18
pmat> show 0x5555558929d0
SCALAR(UV,PV) at 0x5555558929d0 with refcount 1
  size 65 bytes
  UV=140737346427792
  PV="140737346427792"
  PVLEN 15
pmat> show 0x555555e7eb18
SCALAR(UV,PV) at 0x555555e7eb18 with refcount 1
  size 65 bytes
  UV=140737346421136
  PV="140737346421136"
  PVLEN 15

(gdb) info symbol 140737346427792
pp_await in section .text of /home/leo/src/perl/Future-AsyncAwait/blib/arch/auto/Future/AsyncAwait/AsyncAwait.so
(gdb) info symbol 140737346421136
pp_leaveasync in section .text of /home/leo/src/perl/Future-AsyncAwait/blib/arch/auto/Future/AsyncAwait/AsyncAwait.so

I am highly puzzled. Why do we have an SvUV with added PV containing the addresses of either pp_await or pp_leaveasync being leaked?

-- Paul Evans
SV origin isn't a lot of help:

$ PERL_MEM_LOG=3sc bleadperl -Mblib t/32memory-growth.t 3>perl-mem-log.txt
...
$ pmat-diff t/32memory-growth-1{,-after}.pmat
  A                                B
  UNDEF() at 0x55797915bd70        HASH(0) at 0x55797934b4e0
  UNDEF() at 0x5579793977b8        SCALAR(UV,PV) at 0x5579793f08a0
  SCALAR(UV,PV) at 0x5579793fec00  HASH(0) at 0x5579793ff4a0

$ grep -A5 5579793fec00 perl-mem-log.txt | tail -5
...
new_SV: sv.c:5649:Perl_newSV: 5579793fec00
 frame[0]: 0x55797822a361 Perl_newSV at /home/leo/perl5/perlbrew/perls/bleadperl/bin/perl5.29.8 +0x167361
 frame[1]: 0x55797820db47 Perl_leave_adjust_stacks at /home/leo/perl5/perlbrew/perls/bleadperl/bin/perl5.29.8 +0x14ab47
 frame[2]: 0x55797820e26a Perl_pp_leavesub at /home/leo/perl5/perlbrew/perls/bleadperl/bin/perl5.29.8 +0x14b26a

I didn't create them directly; they were made by leave_adjust_stacks() as part of pp_leavesub().

-- Paul Evans
OK - progress.

A lot of the above confusion about UVs pointing at pp_await or pp_leaveasync comes down to a bug in Devel::MAT::Dumper, which gets confused by the new mortal SVs that the OP_CLASS() macro allocates. When it walks the optree for SVs, it allocates these mortal SVs as a side-effect of the way OP_CLASS() is implemented on custom ops (such as the await or leaveasync nodes in the optree). This caused it to dump those extra SVs, when they weren't actually part of the heap. With the appropriate SAVETMPS/FREETMPS pair inserted, these no longer appear.

Instead, a somewhat more likely leak candidate turns up:

$ pmat-diff t/32memory-growth-1{,-after}.pmat
  A                            B
  HASH(0) at 0x5555559d10f0    UNDEF() at 0x55555640a5f0
  HASH(0) at 0x555556426598    UNDEF() at 0x5555564d4280
                               SCALAR(UV) at 0x55555654eb68
---
2 unique to A
3 unique to B
30190 common

That SCALAR(UV).

$ pmat t/32memory-growth-1-after.pmat
Perl memory dumpfile from perl 5.29.8 non-threaded
Heap contains 30193 objects
pmat> show 0x55555654eb68
SCALAR(UV) at 0x55555654eb68 with refcount 1
  size 56 bytes
  debug serial 461304
  created at t/32memory-growth.t:25
  UV=93825008986928
pmat> identify 0x55555654eb68
SCALAR(UV) at 0x55555654eb68 is:
└─value {Future} of HASH(9) at 0x5555559a3cb8=stashcache, which is:
  └─the stash cache

Now this gets weirder: in the -after file:
pmat> show stashcache
HASH(9) at 0x5555559a3cb8=stashcache with refcount 1
...

but in the -before file:
pmat> show stashcache
HASH(8) at 0x5555559a3cb8=stashcache with refcount 1
...

Comparing the values of them shows a lack of "Future" in the before file.

But that "before" was simply before the final iteration of the code block for pmat-diff purposes. The code had been run many many times before that, and apparently the value was missing. Something, it seems, is removing values from the stash cache.

-- Paul Evans
> Comparing the values of them shows a lack of "Future" in the before
> file.
>
> But that "before" was simply before the final iteration of the code
> block for pmat-diff purposes. The code had been run many many times
> before that, and apparently the value was missing. Something, it
> seems, is removing values from the stash cache.
Turns out -that- is likely due to behaviour of Test2 which is running Test::MemoryGrowth. By running one code iteration just before we take the "before" snapshot and again one more before the "after", we remove that difference as well.

The difference now becomes:

t/32memory-growth-1-after.pmat
  Kind               Count     (blessed)  Bytes                  (blessed)
  ARRAY              1752      2          239.1 KiB              224 bytes
  CODE               1941                 303.3 KiB
  GLOB               2961                 532.1 KiB
  HASH               334       8          384.8 KiB              3.5 KiB
  INVLIST            66                   56.2 KiB
  IO                 12        12         2.2 KiB                2.2 KiB
  PAD                1226                 222.0 KiB
  REF                1253                 68.5 KiB
  REGEXP             206       3          51.5 KiB               768 bytes
  SCALAR             20269                1.4 MiB(+116 bytes)
  SCALAR(IV)         3                    168 bytes
  SCALAR(NV)         63                   4.1 KiB
  SCALAR(NV,PV)      3                    294 bytes
  SCALAR(PV)         7671(+1)             777.1 KiB(+158 bytes)
  SCALAR(UV)         2941                 161.4 KiB
  SCALAR(UV,NV)      4                    352 bytes
  SCALAR(UV,NV,PV)   8(+1)                814 bytes(+128 bytes)
  SCALAR(UV,PV)      11(-1)               1.1 KiB(-114 bytes)
  UNDEF()            9565(-1)             523.1 KiB(-56 bytes)
  STASH              188                  206.4 KiB
  ------------------ --------- ---------  ---------------------  ---------
  (total)            30208     25         3.5 MiB(+116 bytes)    6.7 KiB

-- Paul Evans
Further adjustment, capturing the heap before running the main count of iterations, and upping the count to 100k iterations, I now get:

not ok 1 - async/await does not grow memory
# Failed test 'async/await does not grow memory'
# at t/32memory-growth.t line 31.
# Lost 3108864 bytes of memory over 100000 calls, average of 31.09 per call
# Writing heap dump to t/32memory-growth-1.pmat
# Writing heap dump after one more iteration to t/32memory-growth-1-after.pmat
# Tests were run but no plan was declared and done_testing() was not seen.

It's very definitely leaking - I suspect the number to be around 32 bytes per call, give or take block-sized rounding.

$ pmat-counts t/32memory-growth-1{-prior,-after}.pmat
t/32memory-growth-1-prior.pmat
  Kind    Count  (blessed)  Bytes      (blessed)
  ARRAY   1751   2          238.2 KiB  224 bytes
  CODE    1940              303.1 KiB
  GLOB    2939              528.1 KiB
  HASH    333    8          385.7 KiB  3.4 KiB
  INVLIST 66                56.2 KiB
  IO      12     12         2.2 KiB    2.2 KiB
  PAD     1225              221.9 KiB
  REF     1249              68.3 KiB
  REGEXP  207    3          51.8 KiB   768 bytes
  SCALAR  20322             1.4 MiB
  STASH   188               205.7 KiB
  ------- -----  ---------  ---------  ---------
  (total) 30232  25         3.4 MiB    6.6 KiB

t/32memory-growth-1-after.pmat
  Kind    Count       (blessed)  Bytes                  (blessed)
  ARRAY   1752(+1)    2          239.1 KiB(+920 bytes)  224 bytes
  CODE    1940                   303.1 KiB
  GLOB    2961(+22)              532.1 KiB(+4.0 KiB)
  HASH    334(+1)     8          384.9 KiB(-769 bytes)  3.5 KiB(+48 bytes)
  INVLIST 66                     56.2 KiB
  IO      12          12         2.2 KiB                2.2 KiB
  PAD     1225                   221.9 KiB
  REF     1252(+3)               68.5 KiB(+168 bytes)
  REGEXP  207         3          51.8 KiB               768 bytes
  SCALAR  20278(-44)             1.4 MiB(-886 bytes)
  STASH   188                    206.4 KiB(+760 bytes)
  ------- ----------  ---------  ---------------------  --------------------
  (total) 30215(-17)  25         3.5 MiB(+4.1 KiB)      6.7 KiB(+48 bytes)

Using a mere 4.1 KiB more memory in 17 fewer SVs after 100k iterations. I suspect this remaining leak is not at the perl SV level, but at the C malloc level.

-- Paul Evans
On Sat Feb 02 11:26:42 2019, PEVANS wrote:
> Using a mere 4.1 KiB more memory in 17 fewer SVs after 100k iterations.
> I suspect this remaining leak is not at the perl SV level, but at the
> C malloc level.
Turned out to be an easy one; a lack of

  Safefree(state->padslots);

during restore. With that fixed, it now passes the test :) No detectable leak even after 100k iterations.

-- Paul Evans
This seems to be resolved now. Any newly discovered leaks can get their own bug tickets. -- Paul Evans