This repository has been archived by the owner on Apr 22, 2023. It is now read-only.

leak in native crypto code on latest git master? #8416

Closed
mman opened this issue Sep 20, 2014 · 25 comments

Comments

@mman

mman commented Sep 20, 2014

Please see #8348 for background info. Running on the latest node master appears to leak in the native crypto extension: RSS grows until the process is killed, while the heap remains stable. Attached is a chart showing memory usage before and after I pulled to the latest master three days ago (the X axis is days in September).

I have today quickly reverted to the latest stable 0.10.x node and it works nicely again.

I will not be able to run master under valgrind for the next couple of days; I'm just bringing the issue to your attention as soon as possible.

screenshot 2014-09-20 21 25 01
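The "RSS growing, heap stable" symptom can also be watched from inside the process. A minimal monitoring sketch (not from the report; it assumes any Node.js version with `process.memoryUsage()`, and the 10-second interval is an arbitrary choice):

```javascript
// A native-side leak shows up as rss climbing while heapUsed stays flat;
// a JS-level leak would make heapUsed climb too.
function memorySnapshot() {
  const mem = process.memoryUsage();
  return {
    rssMB: Math.round(mem.rss / (1024 * 1024)),
    heapUsedMB: Math.round(mem.heapUsed / (1024 * 1024)),
  };
}

// Log a snapshot every 10 seconds; unref() so the timer does not
// keep an otherwise-idle process alive.
setInterval(() => {
  const s = memorySnapshot();
  console.log(`rss=${s.rssMB}MB heapUsed=${s.heapUsedMB}MB`);
}, 10000).unref();
```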

@indutny
Member

indutny commented Sep 20, 2014

Sorry, is it about crypto or TLS?

@mman
Author

mman commented Sep 20, 2014

Sorry, I meant TLS; it shows up via a simple HTTPS GET handler…

On 20. 9. 2014, at 21:43, Fedor Indutny notifications@github.com wrote:

Sorry, is it about crypto or TLS?



@indutny
Member

indutny commented Sep 22, 2014

Will look into it tomorrow

indutny added a commit to indutny/node that referenced this issue Sep 23, 2014
TLSCallbacks should be destroyed at the same time with StreamWrap, but
only if both objects are not referenced. Otherwise memory leaks or
crashes will happen.

fix nodejs#8416
indutny added a commit to indutny/node that referenced this issue Sep 23, 2014
Call `MakeWeak()` to destruct TLSCallbacks when the js-object dies.

fix nodejs#8416
@indutny
Member

indutny commented Sep 23, 2014

Should be fixed by #8424, cc @trevnorris

@mman
Author

mman commented Sep 23, 2014

Pulled this fix into my master and deployed; will let you know after a couple of hours how it goes...

thanks
Martin

@mman
Author

mman commented Sep 23, 2014

Looks like this has not fixed the problem... it's doing the same thing...

screenshot 2014-09-23 15 08 45

@indutny
Member

indutny commented Sep 23, 2014

@mman Does it leak memory when you just create a server like this:

var https = require('https');
var fs = require('fs');

https.createServer({
  key: fs.readFileSync('./test/fixtures/keys/agent1-key.pem'),
  cert: fs.readFileSync('./test/fixtures/keys/agent1-cert.pem')
}, function(req, res) {
  res.end('hello world');
}).listen(1443, function() {
  console.log('Listening...');
});

And run ab or siege against it?

@mman
Author

mman commented Sep 23, 2014

I can't verify this easily right now, but this is pretty much what my code looks like. I have a couple more instances where GET is called less often, and they leak as well, just proportionally more slowly, matching how often their GET handler gets called...

@indutny
Member

indutny commented Sep 23, 2014

This is really surprising... I can confirm that the leak was happening prior to that patch, and I do see that it is gone in this basic example (at least in my tests).

I'll see if there is anything else that could be leaking. Meanwhile, could you please make sure that you have rebuilt the binary and that the server is running the freshly built one?

@mman
Author

mman commented Sep 23, 2014

Hi again,

I have left the app up and running and it seems to be stable; the RSS stopped at 400 MB, which is way more than on 0.10, and the heap is stable (see attached screenshot). I have also extracted the code and run it for a couple of hours under valgrind... and no leaks. So I guess the only question that remains is why the RSS grows so much and whether that's expected...

screenshot 2014-09-23 23 00 25

@mman
Author

mman commented Sep 24, 2014

It indeed looks good after a couple more hours; I have many processes with your fix, all now running flat. The only thing that changed is the increased RSS size, which I can't explain... but stable again...

thanks a lot,
Martin

@indutny
Member

indutny commented Sep 24, 2014

@mman could you please try this patch #8438 ? I wonder if it'll help you decrease RSS.

@indutny
Member

indutny commented Sep 24, 2014

Oops, nvm. It is leaking...

@indutny
Member

indutny commented Sep 24, 2014

Nvm again! :) It is because I forgot to land #8424

indutny added a commit that referenced this issue Sep 24, 2014
Call `MakeWeak()` to destruct TLSCallbacks when the js-object dies.

fix #8416

Reviewed-By: Fedor Indutny <fedor@indutny.com>
indutny added a commit to indutny/node that referenced this issue Sep 24, 2014
@mman
Author

mman commented Sep 24, 2014

Hi Fedor, could you summarize for me which patches to apply? I'm a little confused now...

indutny added a commit to indutny/node that referenced this issue Sep 24, 2014
@indutny
Member

indutny commented Sep 24, 2014

@mman just this one: indutny@ddd1132, on top of the latest v0.12

@indutny
Member

indutny commented Sep 24, 2014

I meant v0.12 branch

@indutny
Member

indutny commented Sep 24, 2014

@mman any news?

@mman
Author

mman commented Sep 25, 2014

I had some trouble running my code on the v0.12 branch; it would break npm in some strange way such that it fails to npm install my dependencies, blaming untar errors... Reverting to master fixed this. I will do some more experiments today. Can I apply your changeset against master?

@indutny
Member

indutny commented Sep 25, 2014

Yeah, sure! But don't forget to apply indutny@4845377 too.

@mman
Author

mman commented Sep 25, 2014

Applied to master and restarted; it looks much better now. The RSS is way down to where it used to be on 0.10. Good job... anything else you need me to test?

screenshot 2014-09-25 17 26 47

@indutny
Member

indutny commented Sep 25, 2014

Thank you very much!

Do you have request-per-second stats? :) Other than this - nothing!

@mman
Author

mman commented Sep 25, 2014

It's on 1.5 req/sec. The RSS has so far grown to 80MB.

@indutny
Member

indutny commented Sep 25, 2014

@mman I mean, how has it changed?

indutny added a commit to indutny/node that referenced this issue Sep 25, 2014
Don't allocate any BIO buffers initially, do this on a first read from
the TCP connection. Allocate different amount of data for initial read
and for consequent reads: small buffer for hello+certificate, big buffer
for better throughput.

see nodejs#8416
indutny added a commit that referenced this issue Sep 26, 2014
Don't allocate any BIO buffers initially, do this on a first read from
the TCP connection. Allocate different amount of data for initial read
and for consequent reads: small buffer for hello+certificate, big buffer
for better throughput.

see #8416
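The commit above describes an allocation strategy in node's C++ TLS code. As an illustration only, the same idea can be sketched in JavaScript (the buffer sizes here are made-up placeholders, not node's actual values): allocate nothing up front, a small buffer for the first read (enough for the hello and certificate), and larger buffers afterwards for throughput.

```javascript
// Illustrative sketch of lazy, size-tiered read-buffer allocation.
// SMALL/LARGE are hypothetical placeholders, not node's real sizes.
const SMALL = 4 * 1024;   // first read: hello + certificate fit here
const LARGE = 64 * 1024;  // subsequent reads: sized for throughput

function makeReadBufferAllocator() {
  let firstRead = true;
  // No buffer is allocated until the first call, i.e. until the
  // first read from the TCP connection actually happens.
  return function nextReadBuffer() {
    if (firstRead) {
      firstRead = false;
      return Buffer.alloc(SMALL);
    }
    return Buffer.alloc(LARGE);
  };
}
```

The payoff of this pattern is that idle connections pin no buffer memory at all, and short-lived handshake-only connections never pay for a full-size buffer.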
@trevnorris

Think 2122a77 has resolved this issue.
