
High memory usage on save operations with ~23000 bookmarks #2032

Closed
dsalo opened this issue Nov 3, 2023 · 9 comments · Fixed by #2083

dsalo commented Nov 3, 2023

I can't save a bookmark to my Shaarli at linkspam.dsalo.info because it blitzes the 128M memory limit designated in php.ini. I'm working on raising that (I need my webhost to restart Apache; I don't have the privs to do it myself), but in the meantime -- could saving be made a less memory-intensive operation?

Possible complicating factor: I have almost 23,000 bookmarks.

Error message on save: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 21128048 bytes) in /home/dsaloinf/linkspam/application/bookmark/BookmarkIO.php on line 124

Please let me know if there's more information I can usefully give you.

virtadpt commented Nov 3, 2023

I have almost 23,000 bookmarks.

I think that's your problem right there. How big is your datastore.php file?

@nodiscc nodiscc changed the title Saving a bookmark devours memory Saving a bookmark devours memory (~23000 bookmarks) Nov 3, 2023
dsalo (Author) commented Nov 3, 2023

6.42 MB. I've been bookmarking stuff for a long, long time.

If I read the BookmarkIO.php code correctly, the entire datastore gets read into memory each time a link is added? IDK how best to solve this, but maybe splitting the datastore once it reaches a problematic size would help?

I appreciate your attention, thanks very much.

nodiscc (Member) commented Nov 4, 2023

the entire datastore gets read into memory each time a link is added

Correct.
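
For context: the whole bookmark list lives in a single data/datastore.php file as one serialized, deflated, base64-encoded blob, so every save is a full read-modify-write cycle over the entire collection. A simplified sketch of that cycle (illustrative names, not the exact BookmarkIO.php code):

<?php
// Simplified sketch of the save path; the real code is in
// application/bookmark/BookmarkIO.php. Names here are illustrative.
class DatastoreSketch
{
    private const PREFIX = '<?php /* ';
    private const SUFFIX = ' */ ?>';

    public function __construct(private string $file) {}

    public function read(): array
    {
        // The ENTIRE datastore is loaded, decoded and unserialized in memory.
        $raw  = file_get_contents($this->file);
        $blob = substr($raw, strlen(self::PREFIX), -strlen(self::SUFFIX));
        return unserialize(gzinflate(base64_decode($blob)));
    }

    public function write(array $bookmarks): void
    {
        // ...and the whole list is re-serialized and re-encoded on every
        // save, which is where the memory spike on save comes from.
        $blob = base64_encode(gzdeflate(serialize($bookmarks)));
        file_put_contents($this->file, self::PREFIX . $blob . self::SUFFIX);
    }

    public function add(array $bookmark): void
    {
        $all   = $this->read();  // full datastore in memory
        $all[] = $bookmark;
        $this->write($all);      // full datastore rewritten
    }
}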

maybe splitting the datastore once it reaches a problematic size would help

This would (probably?) not be an easy fix, and would add complexity to a critical part of the program (saving data), when a simple configuration fix exists (increase PHP max allowed memory size).

134217728 + 21128048 = 155345776 bytes = ~148MiB. That doesn't seem excessive at all for 23k bookmarks :) Especially considering this amount is only used on save operations.

I'm in favor of simply documenting the fix for this error (Fatal error: Allowed memory size of 134217728 bytes exhausted) in https://shaarli.readthedocs.io/en/master/Troubleshooting.html.

With PHP-FPM this is controlled by the directive php_admin_value[memory_limit] = xxxM in /etc/php/PHPVERSION/fpm/pool.d/www.conf on a stock Debian installation.
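
For example (hypothetical value; adjust the PHP version in the path and service name to match your installation):

; /etc/php/8.2/fpm/pool.d/www.conf
php_admin_value[memory_limit] = 256M

then reload the pool with systemctl restart php8.2-fpm.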

dsalo (Author) commented Nov 10, 2023

So... I increased memory_limit all the way to 1024M, and I'm still getting the same error. Either something in the PHP stack is lying to me, or something's gone wrong with Shaarli's memory use. Any suggestions on how I might tell the difference?

nodiscc (Member) commented Nov 11, 2023

  • Please post the full error message (Fatal error: Allowed memory size of...)
  • Please create a file zzz-info.php in your Shaarli directory with the following contents:

    <?php
    phpinfo();

  • Please access https://YOUR_SHAARLI_DOMAIN/zzz-info.php from a web browser and paste the value of the memory_limit row
    (screenshot: the memory_limit row in phpinfo() output)
  • You can delete zzz-info.php afterwards
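
A more targeted variant (a hypothetical zzz-memcheck.php, same idea as the phpinfo() page) prints only the relevant values; it must be requested through the web server, since the CLI and FPM/Apache SAPIs can read different php.ini files:

<?php
// zzz-memcheck.php -- hypothetical minimal alternative to zzz-info.php.
// Request it over HTTP: the web SAPI (FPM/mod_php) may load a different
// php.ini than the command-line php binary does.
header('Content-Type: text/plain');
echo 'memory_limit   = ', ini_get('memory_limit'), PHP_EOL;
echo 'loaded php.ini = ', php_ini_loaded_file(), PHP_EOL;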

nodiscc (Member) commented Nov 20, 2023

@dsalo hi, any updates?

dsalo (Author) commented Nov 20, 2023

Ope, sorry! Not yet -- I finagled my way around the error by deleting a few hundred of my oldest/404est bookmarks, so it may be a few weeks/months before I trigger it again. What I remember, though, is that the error message I got was identical to the one I quoted in my initial report.

(Future-dev suggestion: a 404-hunting tool would be awfully nice, especially if it played nicely with the "look it up on Wayback" function, which is awesome by the way. There are a couple of add-ons that supposedly do it, but they're pretty old and I'm not sure they work with current Shaarli.)

I can get the phpinfo() for you, though:
phpinfo.zip

nodiscc (Member) commented Nov 21, 2023

Thanks; memory_limit is indeed set to 1024M now, so if the error recurs you should see a different number in the Allowed memory size of 134217728 bytes exhausted message. Otherwise, the limit may be set somewhere else; if this is shared hosting, it may be tricky to debug without the assistance of your hosting provider.
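
Two common places to check (hypothetical values; which one applies depends on how PHP is run):

; .user.ini in the Shaarli document root -- read by PHP-FPM/FastCGI
memory_limit = 1024M

# .htaccess -- only honored when PHP runs as mod_php under Apache
php_value memory_limit 1024M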

it may be a few weeks/months before I trigger it again

We could create a script that populates a test instance with hundreds of thousands of bookmarks using the REST API, and check memory usage as the datastore grows; I think it would provide a useful benchmark.
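
A rough sketch of such a script, assuming Shaarli's documented REST API (JWT bearer tokens signed with the instance's API secret via HS512, bookmarks created with POST /api/v1/links); names and values here are illustrative:

<?php
// populate-test-instance.php -- hypothetical benchmark helper.

const BASE_URL = 'https://shaarli.example.com'; // assumption: a test instance
const SECRET   = 'your-api-secret';             // the instance's API secret

function b64url(string $data): string
{
    return rtrim(strtr(base64_encode($data), '+/', '-_'), '=');
}

// Build a short-lived JWT as described in the Shaarli API docs.
function jwt(string $secret): string
{
    $header  = b64url(json_encode(['typ' => 'JWT', 'alg' => 'HS512']));
    $payload = b64url(json_encode(['iat' => time()]));
    $sig     = b64url(hash_hmac('sha512', "$header.$payload", $secret, true));
    return "$header.$payload.$sig";
}

for ($i = 1; $i <= 100000; $i++) {
    $ch = curl_init(BASE_URL . '/api/v1/links');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            'Authorization: Bearer ' . jwt(SECRET),
            'Content-Type: application/json',
        ],
        CURLOPT_POSTFIELDS => json_encode([
            'url'   => "https://example.com/page/$i",
            'title' => "Benchmark bookmark #$i",
        ]),
    ]);
    curl_exec($ch);
    curl_close($ch);

    if ($i % 1000 === 0) {
        echo "$i bookmarks created\n"; // track progress while watching memory
    }
}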

Anyway, raising the PHP memory limit should work; I will make a note to document this in https://shaarli.readthedocs.io/en/master/Troubleshooting.html

a 404-hunting tool would be awfully nice

#1816 has a solution that uses a third-party tool I wrote. Let me know in that issue if you find any bugs or need assistance.

@nodiscc nodiscc changed the title Saving a bookmark devours memory (~23000 bookmarks) High memory usage on save operations with ~23000 bookmarks Nov 21, 2023
@nodiscc nodiscc self-assigned this Nov 21, 2023
@nodiscc nodiscc removed the question label Nov 21, 2023
@nodiscc nodiscc added this to the 0.13.0 milestone Nov 21, 2023
dsalo (Author) commented Nov 21, 2023

Terrific, thanks much!
