High load and internal server error on garbage in logs #3275
Closed
Can confirm I’m having this issue as well. When it last occurred I did not think it would have been the escaped characters, as the dashboard that caught it stopped working until I removed that bad entry.
Just watched this error happen first-hand: CPU usage appears low at first occurrence but increases over about 20 seconds until the CPU is pinned.
We found the issue. A fix is on its way.
cyriltovena added a commit to cyriltovena/loki that referenced this issue on Feb 17, 2021:
"I've also added a test to prove the issue was happening and now is fixed. Fixes grafana#3275 Fixes grafana#3264 Fixes grafana#3020 Signed-off-by: Cyril Tovena <cyril.tovena@gmail.com>"
owen-d pushed a commit that referenced this issue on Feb 18, 2021.
Describe the bug
I'm using the "Web Analytics Dashboard for NGINX" in Grafana, with Loki as the data source. Requests are displayed in the recent requests panel. When a garbage request with Unicode characters is present in the logs, Loki reports an error and consumes multiple CPU cores.
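For illustration only (the actual garbage log line is not quoted in this report), here is a sketch of how a request containing control and non-ASCII characters could end up in a JSON-formatted access-log line. The `raw_request` value and the field names are assumptions; Python's `json.dumps` escapes such characters roughly the way nginx's `escape=json` does, as `\uXXXX` sequences inside the string:

```python
import json

# Hypothetical garbage probe request with control / non-ASCII characters.
raw_request = "GET /\x03\xe0\x16 HTTP/1.1"

# Serialize it the way a JSON access log would; control characters are
# escaped as \u0003, \u0016, etc. inside the JSON string.
log_line = json.dumps({"request_uri": raw_request, "status": 400})
print(log_line)
```

A log line like this is still valid JSON, which is why the `| json` stage accepts it; the problem described in this issue is what happens downstream when such escaped garbage is processed.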
The query is:
{filename="/var/log/nginx/json_access.log", host="$host"} | json | line_format "request for {{.request_uri}} with HTTP status: {{.status}} "
When the request contains garbage, I get an error.
The log line in this instance is:
The regular log is:
To Reproduce
Steps to reproduce the behavior: run the above query against logs that contain a garbage request.
Expected behavior
Some kind of fallback: either the Unicode string properly displayed in the browser, or just a blank string. No unusual CPU load.
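As a sketch of the kind of fallback described here (assuming nothing about Loki's internals), undecodable bytes can be replaced with the Unicode replacement character instead of failing:

```python
# Minimal fallback sketch: decode a raw log line, substituting any bytes
# that are not valid UTF-8 with U+FFFD instead of raising an error.
raw = b"GET /\x80\xe0 HTTP/1.1"  # hypothetical garbage bytes
safe = raw.decode("utf-8", errors="replace")
print(safe)  # invalid bytes appear as the replacement character
```

This is the "blank or sanitized string" behavior the report asks for: the rest of the line stays readable and nothing downstream has to spin on the bad bytes.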
Environment:
Note:
This only happens if log_format is defined with escape=json in nginx. If I use escape=default, then I get JSONParserErr in the __error__ label and most fields don't get parsed.
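To illustrate the note above (a sketch, not Loki's actual parser): nginx's escape=default writes non-printable bytes as `\xHH`, which is not a legal JSON string escape, so any strict JSON parser rejects the line. That is consistent with seeing JSONParserErr in the __error__ label:

```python
import json

# An escape=default style line: \xE0 is not a valid JSON escape sequence,
# so parsing fails, roughly mirroring the JSONParserErr behavior.
default_style_line = '{"request_uri": "GET /\\xE0 HTTP/1.1", "status": "400"}'
try:
    json.loads(default_style_line)
except json.JSONDecodeError as err:
    print("parse failed:", err)
```

With escape=json the line parses but carries escaped garbage (triggering this bug); with escape=default the line fails to parse at all, so the fields are never extracted.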