Give names to configs in e2e test framework #12649
Conversation
Force-pushed from d3b7f8c to 21a0481
Let's just make sure that these aren't being used as keys. We can change the names without modifying the server, correct?
Three resolved review comments (outdated) on build_tools/python/e2e_test_framework/definitions/common_definitions.py
Abbreviated Benchmark Summary @ commit dd9191c6f8fff98337b7e441496592ce480b706b (vs. base 713f9851eda694abe664103bcafafcf846390546)

No improved or regressed benchmarks 🏖️
No improved or regressed compilation metrics 🏖️
That's correct.
…itions.py Co-authored-by: Geoffrey Martin-Noble <gcmn@google.com>
…itions.py Co-authored-by: Geoffrey Martin-Noble <gcmn@google.com>
This change gives names to the config objects in the e2e test framework (especially the ones with unique IDs).
This is required to generate benchmark names in the new benchmark suites; later the names will also be shared across the benchmark tools and uploaded to the perf dashboard. Currently we generate the benchmark names in the benchmark tools, which is not an ideal place for that logic.
They can also be used in the CMake comments/names of the module generation rules to give friendly information about where the rules come from (see "Traceability in E2E Test Artifacts" in #12215 (comment)).
Right now the names are generated from config fields such as architectures and tags. Ideally I'd like users to manually assign a more concise name to each config (we already do that to name the unique ID constants, so we might be able to reuse those). But currently the perf dashboard relies heavily on the benchmark names to filter benchmarks, so the tags need to be part of the name. We need the dashboard to support filtering on metadata before we can move to a more concise naming schema.
Let me know what you think about the naming schema.
Name format:
Execution benchmark names:
Compilation benchmark names:
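As a rough sketch of the idea above, a name can be derived deterministically from config fields such as the architecture and tags. All class and field names here are hypothetical illustrations, not the actual framework definitions:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass(frozen=True)
class DeviceSpec:
    """Hypothetical config object with a unique ID and descriptive fields."""

    id: str
    architecture: str
    tags: List[str] = field(default_factory=list)

    @property
    def name(self) -> str:
        # Join the architecture with sorted tags so the generated name is
        # deterministic regardless of the order tags were declared in.
        parts = [self.architecture] + sorted(self.tags)
        return "-".join(parts)


spec = DeviceSpec(
    id="abc123",
    architecture="x86_64-cascadelake",
    tags=["gpu", "experimental"],
)
print(spec.name)  # x86_64-cascadelake-experimental-gpu
```

Keeping the tags inside the name (rather than only the concise manual name) preserves the dashboard's ability to filter benchmarks by substring until metadata-based filtering exists.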