
Prometheus scrape compatibility #9984

Open
mhausenblas opened this issue May 12, 2022 · 28 comments
Labels
comp:prometheus (Prometheus related issues) · discussion needed (Community discussion needed) · never stale (Issues marked with this label will be never staled and automatically removed) · priority:p1 (High)

Comments

@mhausenblas
Member

Is your feature request related to a problem? Please describe.
Currently, when writing relabel_configs in the collector's prometheusreceiver config, one cannot use $ but has to escape it as $$.

While I understand why this is the case today, folks coming from Prometheus who want to use the collector as a drop-in replacement run into this UX issue; see, most recently, aws-observability/aws-otel-collector#1201.
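
A minimal illustration of the mismatch (the job name, metric names, and labels here are made up):

```yaml
# Prometheus server: $1 refers to the first regex capture group.
relabel_configs:
  - source_labels: [__name__]
    regex: "container_(.*)"
    target_label: short_name
    replacement: $1

# OpenTelemetry Collector: the collector's own ${...} expansion consumes $1,
# so the same rule must escape it as $$1.
receivers:
  prometheus:
    config:
      scrape_configs:
        - job_name: example
          relabel_configs:
            - source_labels: [__name__]
              regex: "container_(.*)"
              target_label: short_name
              replacement: $$1
```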

Describe the solution you'd like
Make scrape configs 100% Prometheus compatible.

@Aneurysm9 Aneurysm9 added the comp:prometheus label May 12, 2022
@codeboten
Contributor

@dashpole @Aneurysm9 any thoughts on priority for this one?

@dashpole
Contributor

I think this is important. See #4980. We had said at a SIG meeting (#5080 (comment)) that it would be possible to do this using config sources. Is that possible now, and is this just a documentation issue? Or do config sources not solve the problem of having to escape $?

@dashpole dashpole added the priority:p1 label May 13, 2022
@bogdandrutu
Member

bogdandrutu commented May 13, 2022

@dashpole what are all the places in a Prometheus config where $ is used?

> Is that possible now, and is this just a documentation issue?

Because the expand PR is not approved, and for backwards-compatibility reasons, ${1} will still be interpreted as expanding the environment variable 1, unless we are willing to break users and require that they always write ${env:1} to expand an environment variable.
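
To illustrate the collision (a sketch; OTLP_ENDPOINT is an arbitrary variable name):

```yaml
# The collector expands ${VAR} (and, historically, $VAR) from the environment:
exporters:
  otlp:
    endpoint: ${OTLP_ENDPOINT}  # replaced with the value of OTLP_ENDPOINT

# So inside a scrape config, ${1} is read as "expand the environment
# variable named 1", not as a regex capture reference, unless escaped:
replacement: $$1   # escaped: passed through to Prometheus relabeling as $1
```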

@dashpole
Contributor

The examples are almost entirely inside the replacement section of relabel_configs, where it is very commonly used. It might be used in the regex field as well.

My original thinking in #4980 was that being able to reference a separate file would do the trick. It would also enable other uses of the collector, such as using it with the Prometheus operator, where the operator expects to manage the Prometheus config file itself. I think we would lose the config "watching" functionality if we did that, though.

@bogdandrutu
Member

I would suggest a different path: implement a "MapConverter" (currently named config.MapConverterFunc) that we can use to replace "${DIGIT}" with "$${DIGIT}" in any receivers::prometheus[/a-z]::config::scrape_configs[x]::replacement field.
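
For illustration, a rough sketch of such a converter against the current confmap.Converter interface (the package name, the New constructor, and the substring-based key matching are invented here; a real implementation would target the exact replacement and regex fields):

```go
package escapeconverter

import (
	"context"
	"regexp"
	"strings"

	"go.opentelemetry.io/collector/confmap"
)

// capRef matches regex capture references such as $1 or ${1}.
var capRef = regexp.MustCompile(`\$\{?\d+\}?`)

type converter struct{}

// New returns a confmap.Converter that escapes $<digit> references under the
// prometheus receiver so the collector's ${...} expansion leaves them alone.
func New() confmap.Converter { return converter{} }

func (converter) Convert(_ context.Context, conf *confmap.Conf) error {
	out := make(map[string]any)
	for _, k := range conf.AllKeys() {
		v := conf.Get(k)
		// Rough heuristic: only rewrite string values whose flattened key
		// lives under the prometheus receiver's config.
		if s, ok := v.(string); ok && strings.Contains(k, "receivers::prometheus") {
			v = capRef.ReplaceAllStringFunc(s, func(m string) string { return "$" + m })
		}
		out[k] = v
	}
	return conf.Merge(confmap.NewFromStringMap(out))
}
```

In practice the converter would also need to be registered with the collector's confmap resolver and scoped tightly enough not to rewrite legitimate ${ENV} references elsewhere in the config.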

@dashpole
Contributor

Ooh, fancy. That works for me

@bogdandrutu
Member

@dashpole if we are fine with this path, we should start by setting up the directory where we will keep map converters in contrib (keep in mind that we will also have providers in the future). See https://github.com/open-telemetry/opentelemetry-collector/tree/main/config

@bogdandrutu
Member

@dashpole another idea is to exempt ${[0-9]+} from expansion.

@github-actions
Contributor

github-actions bot commented Nov 9, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions github-actions bot added the Stale label Nov 9, 2022
@mhausenblas
Member Author

mhausenblas commented Nov 9, 2022

Any updates here?

@otterley

How widespread is the actual impact on users who rely on $1 working as an environment variable? That seems extremely rare at best.

@dashpole
Contributor

It's relatively common when using relabel rules. https://github.com/prometheus/prometheus/blob/main/documentation/examples/prometheus-kubernetes.yml is the recommended config for Prometheus on Kubernetes, for example, and contains several $1 references.
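
For instance, the kubernetes-cadvisor job in that file routes scrapes through the API server proxy with a rule roughly like this:

```yaml
relabel_configs:
  - source_labels: [__meta_kubernetes_node_name]
    regex: (.+)
    target_label: __metrics_path__
    # $1 is the node name captured by the regex above.
    replacement: /api/v1/nodes/$1/proxy/metrics/cadvisor
```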

@otterley

@dashpole I think there's a misunderstanding. I am talking about $1 the environment variable, as opposed to $1 the regex substitution token. See @bogdandrutu's comment above.

@dashpole
Contributor

Ah, yes. I did misunderstand your question. I've never seen anyone use $1 as an environment variable.

@otterley

otterley commented Mar 7, 2023

I found another incompatibility as well. In Prometheus server, the relabel configuration applies an implicit "any" match (.*) for each rule. With the OTel Collector you have to supply it explicitly (regex: (.*)) or the rule won't match.
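
A sketch of the difference (the label names are made up):

```yaml
# Prometheus server: regex defaults to (.*), so this rule matches and copies
# the label unconditionally.
relabel_configs:
  - source_labels: [__meta_kubernetes_pod_label_app]
    target_label: app

# Collector prometheusreceiver (per the report above): the default must be
# spelled out explicitly or the rule will not match.
relabel_configs:
  - source_labels: [__meta_kubernetes_pod_label_app]
    regex: (.*)          # explicit "any" match
    target_label: app
```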

@mhausenblas
Member Author

Thank you for reporting this, @otterley.

@mhausenblas
Member Author

@bogdandrutu can we remove the stale label and have a convo on how we plan to address this please?

@atoulme atoulme added the discussion needed label and removed the Stale label Mar 8, 2023
@atoulme
Contributor

atoulme commented Mar 8, 2023

Please bring this issue up for discussion at the next SIG meeting.

@github-actions github-actions bot added the Stale label May 8, 2023
@Aneurysm9 Aneurysm9 removed the Stale label May 17, 2023
@github-actions github-actions bot added the Stale label Jul 17, 2023
@mhausenblas
Member Author

not stale

@dashpole dashpole removed the Stale label Jul 17, 2023
@github-actions github-actions bot added the Stale label Sep 18, 2023
@mhausenblas
Member Author

not stale

@dashpole dashpole removed the Stale label Sep 18, 2023
@github-actions github-actions bot added the Stale label Nov 20, 2023
@dashpole dashpole removed the Stale label Nov 20, 2023
@github-actions github-actions bot added the Stale label Jan 22, 2024
@ickymettle

Thankful to find this GH issue after wasting a solid half day trying to figure out why my relabel configs weren't working. For an experienced Prometheus user starting to integrate the OTel Collector into our infrastructure, this is a significantly confusing footgun.

@dashpole dashpole removed the Stale label Feb 22, 2024
@ericmustin
Contributor

This issue stumped me for a solid week.

Not having YAML config interoperability for relabel configs is the mind-killer.

I am tired.

@crobert-1 crobert-1 added the never stale label Mar 13, 2024
@mx-psi
Member

mx-psi commented Mar 13, 2024

I think open-telemetry/opentelemetry-collector/issues/9531 would help here.
