Memory issue when using composite function #2759
Comments
Hi, I've been unable to reproduce this locally. Please can you provide sample images that trigger this problem? Are you able to reproduce this when using non-SVG images? It's possible you might have run into libvips/libvips#2293
Thank you for the update. I'm still unable to reproduce this based on the above image and code. Here's what I see locally using the above image:

```js
const fs = require('fs');
const path = require('path');
const sharp = require('sharp');

setInterval(() => {
  console.log(process.memoryUsage().rss);
}, 10000);

async function test() {
  const background = fs.readFileSync(path.resolve('in.jpg'));
  sharp.cache(false);
  for (let i = 0; i <= 2000; i++) {
    await sharp(background)
      .composite([{ input: background }])
      .toBuffer();
  }
}

test();
```
Given you're using Docker, please can you create a separate repo with a …
Thanks for the reply, it helped me track down what I can do to get around this issue for my use case! This seems to be an issue on Node version 12 (I only checked 12, 14, and 16 just now; 14 and 16 did not have this issue). So there does seem to be some sort of memory leak on Node 12, but for my purposes, I can easily switch to either 14 or 16.

Here is a Dockerfile that should reproduce the issue for you:

```dockerfile
FROM node:12
COPY . /
RUN yarn
CMD node index.js
```

index.js is the same as your last comment:

```js
const fs = require('fs');
const path = require('path');
const sharp = require('sharp');

setInterval(() => {
  console.log(process.memoryUsage().rss);
}, 10000);

async function test() {
  const background = fs.readFileSync(path.resolve('in.jpg'));
  sharp.cache(false);
  for (let i = 0; i <= 2000; i++) {
    await sharp(background)
      .composite([{ input: background }])
      .toBuffer();
  }
}

test();
```

package.json is just the sharp package:

```json
{
  "dependencies": {
    "sharp": "^0.28.3"
  }
}
```

And the in.jpg is the same we have been using. Then I run it using …

Thanks for helping me track down how I can fix this in my project. Would you like me to keep the issue open for Node 12?
Thanks, there was a bug in Node.js 12 and earlier that prevented garbage collection from running when inside a cgroup-constrained container, which was fixed via nodejs/node#27508
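(Side note for anyone stuck on Node 12: because the bug kept garbage collection from running under cgroup limits, a commonly suggested mitigation was to cap V8's old-generation heap explicitly so collection kicks in well below the container limit. This is a sketch only; the 256 MB figure is an arbitrary example, not something from this thread:)

```dockerfile
FROM node:12
COPY . /
RUN yarn
# Hypothetical mitigation: an explicit old-space cap makes V8 run GC
# before the container's cgroup memory limit is reached. Tune the value
# to sit below your container's limit; 256 is an arbitrary example.
CMD node --max-old-space-size=256 index.js
```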
Are you using the latest version? Is the version currently in use as reported by `npm ls sharp` the same as the latest version as reported by `npm view sharp dist-tags.latest`?

`npm ls sharp` -> 0.28.3
`npm view sharp dist-tags.latest` -> 0.28.3

What are the steps to reproduce?
When compositing multiple images, some of the memory does not seem to be fully released when garbage collection runs. This isn't a problem when only doing a few images, but it is a problem when running a web server that can conceivably compose thousands of images before it is shut down.

I have read all of the other threads on memory usage with this library and have tried the fixes listed in them. I have tried setting the LD_PRELOAD environment variable to use jemalloc, I have tried using Alpine instead of Debian-based images, and I have even tried running it on ARM processors; it still happens across the board.
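For readers wanting to retry the jemalloc attempt mentioned above, it is typically wired up like this on a Debian-based image (a sketch; the package name and library path below are for Debian on x86_64 and vary by distribution and architecture):

```dockerfile
FROM node:12
# Preload jemalloc in place of glibc malloc; this is a common suggestion
# for fragmentation-related RSS growth, not a guaranteed fix.
RUN apt-get update && apt-get install -y libjemalloc2 \
    && rm -rf /var/lib/apt/lists/*
# Debian/x86_64-specific path; adjust for your distro and architecture.
ENV LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2
COPY . /
RUN yarn
CMD node index.js
```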
What is the expected behaviour?
When sharp is finished composing two images, the memory is released back to the system instead of building up.
Are you able to provide a minimal, standalone code sample, without other dependencies, that demonstrates this problem?
I realize that this is printing out RSS numbers, which might not be totally accurate when it comes to showing how much memory is freed versus how much is collected, but I also monitor the RAM usage with htop and `docker stats` (since I'm running this in a container) and it climbs consistently across the board. Also, when I do any other operation to the image besides compositing, the RSS (and the htop and `docker stats` figures) will immediately fall back down after this loop finishes.

This problem is much more pronounced when larger (and different) files are loaded, as in the web server application I am using it for. In that case, I usually can't get past 300-400 requests before my RAM usage is between 500 MB and 1 GB.
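To separate "memory not yet collected" from "memory actually leaked", it can help to force a full garbage collection before sampling RSS. A sketch (the helper names are mine; run Node with `--expose-gc` so `global.gc` exists, otherwise it degrades to a plain sample):

```javascript
// Sketch: sample RSS after forcing a full GC, so lazily collected memory
// is not mistaken for a leak. `rssMiB` and `probe` are hypothetical names.
function rssMiB() {
  return process.memoryUsage().rss / (1024 * 1024);
}

function probe(label) {
  // global.gc is only defined when Node is started with --expose-gc;
  // without that flag this is just a plain RSS sample.
  if (typeof global.gc === 'function') {
    global.gc();
  }
  const mib = rssMiB();
  console.log(`${label}: ${mib.toFixed(1)} MiB RSS`);
  return mib;
}
```

Calling `probe('before')` and `probe('after')` around the composite loop shows whether RSS returns to its baseline once collection is forced, which points at either delayed GC or a genuine leak.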
Are you able to provide a sample image that helps explain the problem?
Any two images composed together should cause this issue
What is the output of running `npx envinfo --binaries --system`?

Debian

Alpine