
feature request: structured (json) logs to stdout, disable writing to a file system #548

Open
helmut72 opened this issue Mar 26, 2024 · 8 comments
Labels
feature request New feature request

Comments

@helmut72

helmut72 commented Mar 26, 2024

Would be cool to have structured logs, maybe (optional) as json. Instead of this one (docker logs -f dagu):

[screenshot: unstructured output of docker logs -f dagu]

Would be also cool to disable writing log files with a configuration option. I send and store logs to my log server.

Thank you

@helmut72 helmut72 changed the title feature request structured (json) logs to stdout, disable writing to a file system feature request: structured (json) logs to stdout, disable writing to a file system Mar 26, 2024
@yohamta yohamta added feature request New feature request good first issue Good for newcomers labels Mar 26, 2024
@yohamta yohamta removed accepted good first issue Good for newcomers labels May 30, 2024
@halalala222
Contributor

Hi @yohamta! I'd like to know: is the log server an HTTP server? If so, we would need to report logs to that HTTP server instead of writing log files. What do you think?

@yohamta
Collaborator

yohamta commented Aug 21, 2024

Hi @halalala222, it sounds like a useful feature. Would you mind sharing how you want to use the log data sent to the HTTP server? I'd like to research what other workflow engines/orchestrators (Airflow, Temporal, etc.) do in that regard, so we can make the feature as useful as possible.

@halalala222
Contributor

halalala222 commented Aug 21, 2024

hi @yohamta !
Airflow provides log writing for the following services:
https://airflow.apache.org/docs/apache-airflow-providers/core-extensions/logging.html
[screenshot: list of Airflow logging providers]
Example:
[screenshot: example logging provider]
But I haven't looked closely at how it is implemented yet. Airflow also provides a custom processing option:
https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/logging-monitoring/logging-tasks.html#
[screenshot: advanced logging configuration]

@helmut72
Author

helmut72 commented Aug 21, 2024

Just output to stdout, and in a structured way, like nearly all services that run in containers do. Common log shippers like fluent-bit, telegraf, ... can pick up the logs and do whatever is required.

https://12factor.net/logs

Dealing with log files, the required volume, and management like logrotate/deletion is unusual in a container world:
https://overcast.blog/managing-container-stdout-stderr-logs-like-a-pro-e7d42ab0035e

@halalala222
Contributor

@yohamta, so should I try to just output to stdout in a structured way (like JSON)? I think the structured format of the log could be configurable? What do you think?

@yohamta
Collaborator

yohamta commented Aug 22, 2024

Hi @helmut72 @halalala222, thank you very much for the additional inputs. Very helpful.

Regarding the log format, it's already implemented except for the environment variable and documentation:
https://github.com/daguflow/dagu/blob/5422d0bc62aa281bb0d56797076446de8cff3b8b/internal/config/config.go#L56
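For anyone curious what that option amounts to, here is a minimal, hypothetical sketch of selecting a JSON vs. text logger with Go's standard log/slog package; it illustrates the idea only and is not Dagu's actual implementation (the function name and wiring are assumptions):

```go
package main

import (
	"log/slog"
	"os"
)

// buildLogger returns a logger writing to stdout in the requested format.
// "json" emits machine-readable JSON lines; anything else falls back to
// human-readable text. Hypothetical sketch, not Dagu's actual code.
func buildLogger(format string) *slog.Logger {
	var handler slog.Handler
	if format == "json" {
		handler = slog.NewJSONHandler(os.Stdout, nil)
	} else {
		handler = slog.NewTextHandler(os.Stdout, nil)
	}
	return slog.New(handler)
}

func main() {
	logger := buildLogger("json")
	logger.Info("scheduler started", "dag", "example", "pid", os.Getpid())
	// => {"time":"...","level":"INFO","msg":"scheduler started","dag":"example","pid":1234}
}
```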

I believe that Dagu outputs the logs to stdout by default when you run the dagu start command. It would be great to add a feature that allows sending logs as an event stream to AWS S3, Alibaba OSS, or any specified HTTP endpoint.

New configuration keys:

  • DAGU_LOG_FORMAT
  • DAGU_LOG_DESTINATION (e.g., https://xxx, oss://xxx)

I'm a bit unsure about what kind of API request it should send; maybe just a POST with the body?
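As a concrete starting point for that question, a minimal sketch of the "just a POST with the body" variant could look like the following; the endpoint URL, the ndjson payload shape, and the shipLogs function name are assumptions for discussion, not an existing Dagu API:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

// shipLogs POSTs a batch of JSON-encoded log lines to the destination that
// would come from DAGU_LOG_DESTINATION. Payload shape is an assumption.
func shipLogs(endpoint string, payload []byte) error {
	resp, err := http.Post(endpoint, "application/json", bytes.NewReader(payload))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("log endpoint returned %s", resp.Status)
	}
	return nil
}

func main() {
	// One JSON object per line (ndjson) keeps the body streamable.
	body := []byte(`{"level":"INFO","msg":"step finished","dag":"example"}` + "\n")
	if err := shipLogs("https://logs.example.com/ingest", body); err != nil {
		fmt.Println("ship failed:", err)
	}
}
```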

@halalala222
Contributor

Hi @yohamta, I have only used Alibaba OSS; it provides a Golang SDK that can directly upload OSS objects, and it requires sensitive credentials such as the AK, SK, etc.
I believe that AWS S3 also has a corresponding SDK that facilitates file uploads.
For an arbitrary HTTP endpoint, should we provide a unified file upload, or should we upload the log data directly?
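For reference, uploading a log object with the Alibaba OSS Go SDK looks roughly like this; the endpoint, bucket name, and object key layout are placeholders, and this is a hedged sketch based on the SDK's documented PutObject call, not a proposed Dagu design:

```go
package main

import (
	"bytes"
	"log"
	"os"

	"github.com/aliyun/aliyun-oss-go-sdk/oss"
)

func main() {
	// The AK/SK credentials are the sensitive data mentioned above; read
	// them from the environment rather than hard-coding them.
	client, err := oss.New("https://oss-cn-hangzhou.aliyuncs.com",
		os.Getenv("OSS_ACCESS_KEY_ID"), os.Getenv("OSS_ACCESS_KEY_SECRET"))
	if err != nil {
		log.Fatal(err)
	}
	bucket, err := client.Bucket("my-log-bucket") // placeholder bucket name
	if err != nil {
		log.Fatal(err)
	}
	// Upload one run's logs as a single object; the key layout is hypothetical.
	data := []byte(`{"level":"INFO","msg":"DAG finished"}` + "\n")
	if err := bucket.PutObject("dagu/logs/run-1.ndjson", bytes.NewReader(data)); err != nil {
		log.Fatal(err)
	}
}
```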

@helmut72
Author

> I believe that Dagu outputs the logs to stdout by default when you run the dagu start command.

Yes, as I mentioned in my first post (docker logs -f dagu, with screenshot).

But this feature request:

  1. is about structured output in a machine-readable format (key=value or JSON) instead of these formattings/"graphs" with "=========..." (see the hypothetical example after this list)

  2. is about an option to output to stdout only instead of also writing to a file:
    "Would be also cool to disable writing log files with a configuration option."
    This reduces writes to local SSDs/SD cards. Also, if a service runs as a Docker container, output to stdout is the real Docker way:
    https://overcast.blog/managing-container-stdout-stderr-logs-like-a-pro-e7d42ab0035e
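For illustration only (field names invented), point 1 would mean a line like one of these instead of the banner output:

```
{"time":"2024-08-21T12:00:00Z","level":"INFO","msg":"DAG finished","dag":"example","status":"success"}
time=2024-08-21T12:00:00Z level=INFO msg="DAG finished" dag=example status=success
```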

> I'm a bit unsure about what kind of API request it should send; maybe just a POST with the body?

The feature request from halalala222 is a completely different one, and there are more than enough existing tools to send logs to the millions of log ingesters out there. That is nothing Dagu should have to care about. There are so many nuances in configuration and log formats that Dagu can never handle them all. Dagu is a scheduler, not a log shipper like Fluentd, fluent-bit, telegraf, and so on:
https://betterstack.com/community/guides/logging/log-shippers-explained/

Dagu should stay true to its roots and focus on being a great scheduler.
