leak in native crypto code on latest git master? #8416
Sorry, is it about crypto or TLS?
Sorry, I meant TLS; it's exhibiting via the simple HTTPS GET handler…
Will look into it tomorrow
TLSCallbacks should be destroyed at the same time as StreamWrap, but only when neither object is referenced. Otherwise memory leaks or crashes will happen. fix nodejs#8416
Call `MakeWeak()` to destruct TLSCallbacks when the js-object dies. fix nodejs#8416
Should be fixed by #8424, cc @trevnorris
pulled this fix into my master and deployed, will let you know after a couple of hours how it goes... thanks
@mman Does it leak memory when you just create a server like this:

```js
var https = require('https');
var fs = require('fs');

https.createServer({
  key: fs.readFileSync('./test/fixtures/keys/agent1-key.pem'),
  cert: fs.readFileSync('./test/fixtures/keys/agent1-cert.pem')
}, function(req, res) {
  res.end('hello world');
}).listen(1443, function() {
  console.log('Listening...');
});
```

And run
I can't verify this easily right now, but this is pretty much what my code looks like. I have a couple more instances where the GET handler is called less often, and they leak as well, just proportionally slower according to how often their handler gets called...
This is really surprising... I can confirm that the leak was happening prior to that patch, and I do see that it is gone in this basic example (at least in my tests). I'll see if there is anything else that could be leaking. Meanwhile, could you please ensure that you have rebuilt the binary and are using the just-built one for that server?
Hi again, I have left the app up and running and it seems to be stable; the RSS stopped at 400MB, which is way more than on 0.10, heap stable (see attached screenshot)... I have also extracted the code and run it for a couple of hours under valgrind... and no leaks... so I guess the only question that remains is why the RSS is growing so much and whether that's expected...
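The gap being observed here — RSS far above a stable V8 heap — can also be watched from inside the process. A hedged sketch (the field arithmetic and the derived "native" figure are my rough estimate, not something from this thread):

```javascript
// Compare RSS against the V8 heap: steadily growing RSS with a flat
// heap points at native-side memory (buffers, TLS state, allocator
// fragmentation) rather than leaked JS objects.
// Hedged sketch: `nativeMB` is only an approximation.
function memorySnapshot() {
  const m = process.memoryUsage();
  const mb = function(n) { return Math.round(n / 1024 / 1024); };
  return {
    rssMB: mb(m.rss),                  // total resident set size
    heapUsedMB: mb(m.heapUsed),        // live JS objects
    heapTotalMB: mb(m.heapTotal),      // memory reserved by V8
    nativeMB: mb(m.rss - m.heapTotal)  // everything outside the V8 heap
  };
}

console.log(memorySnapshot());
```

Logging a snapshot like this every few minutes alongside request counts makes it easy to tell a true leak from RSS that merely plateaus high.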
Looks indeed good after a couple more hours; I have many processes with your fix all running flat now. The only thing that changed is the increased RSS size, which I can't explain... but stable again... thanks a lot,
Ooops, nvm. It is leaking... |
Nvm again! :) It is because I forgot to land #8424 |
Call `MakeWeak()` to destruct TLSCallbacks when the js-object dies. fix #8416 Reviewed-By: Fedor Indutny <fedor@indutny.com>
Hi Fedor, could you summarize for me what patches to apply... I'm a little confused now...
@mman just this one indutny@ddd1132 on top of the latest v0.12
I meant v0.12 branch |
@mman any news?
I had some trouble running my code on the v0.12 branch; it would break npm in some strange way, making it fail to `npm install` my dependencies with untar errors... Reverting to master fixed this. Will do some more experiments today. Can I apply your changeset against master?
Yeah, sure! But don't forget to apply indutny@4845377 too. |
Thank you very much! Do you have requests-per-second stats? :) Other than this - nothing!
It's at 1.5 req/sec. The RSS has so far grown to 80MB.
@mman I mean, how has it changed?
Don't allocate any BIO buffers initially; do this on the first read from the TCP connection. Allocate a different amount of data for the initial read than for subsequent reads: a small buffer for the hello+certificate, a big buffer for better throughput. see nodejs#8416
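The allocation policy that commit describes can be sketched in JavaScript. The sizes below are illustrative only, not taken from node's C++ source: nothing is allocated up front, the first read gets a small buffer (enough for the TLS hello plus certificate), and later reads get a large one for throughput.

```javascript
// Hedged sketch of a lazy, two-tier read-buffer policy: no allocation
// until the first read; a small buffer first, a big buffer afterwards.
// 8KB/64KB are illustrative sizes, not node's actual values.
function makeReadBufferAllocator(smallSize, largeSize) {
  smallSize = smallSize || 8 * 1024;
  largeSize = largeSize || 64 * 1024;
  let firstRead = true;
  return function alloc() {
    const size = firstRead ? smallSize : largeSize;
    firstRead = false;
    return Buffer.alloc(size); // zero-filled buffer of the chosen size
  };
}

const alloc = makeReadBufferAllocator();
console.log(alloc().length); // first read: small (8192)
console.log(alloc().length); // subsequent reads: large (65536)
```

The point of the design is that idle connections cost nothing, short handshake reads don't pin a large buffer, and busy connections still get a throughput-friendly size.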
I think 2122a77 has resolved this issue.
Please see #8348 for background info... running on the latest node master appears to be leaking in the native crypto code... RSS keeps growing until the process is killed, while the heap remains stable... attached is a chart showing memory usage before and after I git pulled to the latest master three days ago (the X axis is days in September)...
Today I quickly reverted to the latest stable 0.10.x node and it works nicely again...
I will not be able to run master under valgrind for the next couple of days... just bringing the issue to your attention as soon as possible...