Breakdown usage by managed application #7866

Closed · pebrc opened this issue Jun 3, 2024 · 4 comments · Fixed by #7966
Labels
>enhancement (Enhancement of existing functionality), v2.14.0

Comments

pebrc (Collaborator) commented Jun 3, 2024

Currently, usage data reporting is summary-only: https://www.elastic.co/guide/en/cloud-on-k8s/current/k8s-licensing.html#k8s-get-usage-data

We should create a breakdown by managed application, as license agreements differ between users and can influence whether, for example, Logstash memory should be included in the ERU summary. The intent behind this change is to make it transparent to users which applications are included in the calculation, and to allow users on older license terms to subtract, for example, Logstash from the displayed total.
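As a hypothetical worked example (the figures are assumed, not taken from a real report): if the summary shows 64GiB of total managed memory and a breakdown attributes 8GiB of that to Logstash, a user whose older agreement excludes Logstash could count 64GiB - 8GiB = 56GiB towards their ERUs.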

pebrc added the >enhancement (Enhancement of existing functionality) and v2.14.0 labels on Jun 3, 2024
thbkrkr (Contributor) commented Jul 24, 2024

Have you started to think more deeply about how we can implement this?

Where should we add this information?
In the existing ConfigMap or in a new resource?

> kubectl -n elastic-system get configmap elastic-licensing -o json | jq .data
{
  "eck_license_level": "enterprise",
  "eck_license_expiry_date": "2022-01-01T00:59:59+01:00",
  "enterprise_resource_units": "1",
  "max_enterprise_resource_units": "10",
  "timestamp": "2020-01-03T23:38:20Z",
  "total_managed_memory": "64GiB",
  "total_managed_memory_bytes": "68719476736",
  
  "details" {
  	...
  }

}

Format / level of detail?

 "details" {
	"elasticsearch": "32GB",
	"kibana": "8GB",
	"apm-server": "8GB"
	"enterprise-search": "8GB"
	"logstash": "8GB"
}

Should we go further into detail?

 "details" {
	"elasticsearch": [{"ns1/name1": "16GB"}, {"ns2/name2": "16GB"}],
	"kibana": [{"ns1/name1": "4GB"}, {"ns2/name2": "2GB"}, {"ns2/name3": "2GB"}],
	"apm-server": [{"ns1/name1": "4GB"}, {"ns2/name2": "4GB"}],
	"enterprise-search": [{"ns1/name1": "4GB"}, {"ns/2name2": "4GB"}],
	"logstash": [{"ns1/name1": "8GB"}],
}

pebrc (Collaborator, Author) commented Jul 25, 2024

I was thinking:

  1. Just add the sum for each application (that's what's relevant for licensing).
  2. Add it to the same ConfigMap; I am not sure about the nested structure:
{
  "eck_license_level": "enterprise",
  "eck_license_expiry_date": "2022-01-01T00:59:59+01:00",
  "enterprise_resource_units": "1",
  "max_enterprise_resource_units": "10",
  "managed_elasticsearch_memory": "32GB",
  "managed_kibana_memory": "8GB",
  "managed_apm_server_memory": "8GB",
  "managed_enterprise_search_memory": "8GB",
  "managed_logstash_memory": "8GB",
  "timestamp": "2020-01-03T23:38:20Z",
  "total_managed_memory": "64GiB",
  "total_managed_memory_bytes": "68719476736",
}

For consistency we should probably report both GiB and bytes, wdyt?
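As a rough sketch, pairing one of the proposed keys with a hypothetical bytes counterpart (key names and values are illustrative only):

  "managed_logstash_memory": "8GiB",
  "managed_logstash_memory_bytes": "8589934592",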

thbkrkr (Contributor) commented Jul 25, 2024

I like your proposal without the nested structure. Regarding GiB and bytes, I would only add bytes; in the end, the only field in GiB would be total_managed_memory.
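If only bytes are reported, users can still derive GiB themselves; a minimal jq sketch, assuming a hypothetical managed_logstash_memory_bytes key ends up in the ConfigMap:

# prints the Logstash figure converted from bytes to GiB (8589934592 -> 8)
kubectl -n elastic-system get configmap elastic-licensing -o json \
  | jq '.data.managed_logstash_memory_bytes | tonumber / pow(1024; 3)'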

barkbay (Contributor) commented Jul 25, 2024

> Regarding GiB and bytes, I would only add bytes; in the end, the only field in GiB would be total_managed_memory.

Isn't #5465 also relevant to this "per application" use case? I wonder whether, beyond the consistency aspect, having both values could actually be useful.
