Move benchmark config generation to build_e2e_test_artifacts #13291

Merged
pzread merged 2 commits into iree-org:main on Apr 26, 2023

Conversation

@pzread pzread (Contributor) commented Apr 25, 2023

When working on #13273, I noticed that benchmark configs and modules are uploaded by 3 different jobs. These artifacts can end up in different GCS dirs when some of those jobs are re-run due to failures (because re-run jobs create [`GCS_DIR` with new run attempts](https://github.com/openxla/iree/blob/main/.github/workflows/benchmark_compilation.yml#L44-L50)). As a result, users might not be able to download all benchmark artifacts from a single GCS dir URL, which can be confusing.
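
As a minimal illustration of that behavior (the bucket and variable names here are made up, not the actual workflow values), a GCS destination that embeds the run attempt changes whenever an individual job is re-run:

```yaml
# Illustrative sketch only: because the destination includes the run attempt,
# artifacts uploaded by a re-run job land in a different GCS directory than
# artifacts uploaded by jobs from the original attempt.
env:
  GCS_DIR: gs://example-artifacts-bucket/${{ github.run_id }}-${{ github.run_attempt }}
```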

This PR changes the workflow to generate all benchmark modules and configs in `build_e2e_test_artifacts` to avoid this issue. All files are uploaded to `${GCS_URL}/e2e-test-artifacts`, a single path from which all benchmark artifacts can be downloaded.
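
With everything under one prefix, fetching the full set of artifacts becomes a single copy. A sketch of such a step, assuming the `gsutil` CLI is available and `GCS_URL` is exported by the workflow:

```yaml
# Sketch only: download every benchmark module and config from the single
# e2e-test-artifacts prefix produced by build_e2e_test_artifacts.
- name: "Downloading e2e test artifacts"
  run: gsutil cp -r "${GCS_URL}/e2e-test-artifacts" .
```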

Besides the reason above, I think the CMake build should generate the benchmark configs under `${IREE_BUILD_DIR}/e2e_test_artifacts` instead of calling `export_benchmark_config.py` separately. There are still open questions about how to pass the benchmark presets/filters through the CMake configuration, so I decided to defer that to a follow-up step.
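
A rough sketch of what that follow-up might look like (the CMake option names and the target name below are hypothetical, not existing IREE flags), passing the presets at configure time instead of to `export_benchmark_config.py`:

```yaml
# Hypothetical sketch: let the CMake build generate the benchmark configs under
# ${IREE_BUILD_DIR}/e2e_test_artifacts, with the presets/filters supplied as
# cache variables. The -D names and the build target are illustrative only.
- name: "Building e2e test artifacts"
  run: |
    cmake -G Ninja -B "${IREE_BUILD_DIR}" \
      -DIREE_BUILD_E2E_TEST_ARTIFACTS=ON \
      -DIREE_BENCHMARK_PRESETS="${BENCHMARK_PRESETS}"
    cmake --build "${IREE_BUILD_DIR}" --target e2e-test-artifacts
```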

@pzread pzread added the benchmarks:x86_64 Run default x86_64 benchmarks label Apr 25, 2023
@pzread pzread changed the title [WIP] Generate and upload benchmark configs in build_e2e_test_artifacts [WIP] Generat benchmark configs in build_e2e_test_artifacts Apr 25, 2023
@pzread pzread changed the title [WIP] Generat benchmark configs in build_e2e_test_artifacts [WIP] Generate benchmark configs in build_e2e_test_artifacts Apr 25, 2023
github-actions bot commented Apr 25, 2023

Abbreviated Benchmark Summary

@ commit cafbb97d844c7ecbdf31b516ec97c70272291270 (vs. base 34e07706941838b442fbf21d1b615b54590d7f05)

No improved or regressed benchmarks 🏖️

No improved or regressed compilation metrics 🏖️

For more information:

Source Workflow Run

@pzread pzread changed the title [WIP] Generate benchmark configs in build_e2e_test_artifacts Generate all benchmark artifacts in build_e2e_test_artifacts Apr 25, 2023
@pzread pzread marked this pull request as ready for review April 25, 2023 02:29
```yaml
- name: "Downloading assets"
  id: "download-assets"
  env:
    COMPILATION_CONFIG: ${{ steps.export.outputs.compilation-config }}
    BENCHMARK_CONFIG: ${{ env.E2E_TEST_ARTIFACTS_DIR }}/compilation-benchmark-config.json
    BENCHMARK_CONFIG_GCS_ARTIFACT: ${{ env.E2E_TEST_ARTIFACTS_GCS_ARTIFACT_DIR }}/compilation-benchmark-config.json
```
@pzread pzread (Contributor, Author) commented Apr 25, 2023

It looks tedious to export each artifact path from `build_e2e_test_artifacts`, so I chose to use hard-coded names for the files under `${E2E_TEST_ARTIFACTS_DIR}`.

@pzread pzread added the infrastructure/benchmark (Relating to benchmarking infrastructure) and infrastructure (Relating to build systems, CI, or testing) labels Apr 25, 2023
@pzread pzread changed the title Generate all benchmark artifacts in build_e2e_test_artifacts Move benchmark config generation to in build_e2e_test_artifacts Apr 25, 2023
@pzread pzread changed the title Move benchmark config generation to in build_e2e_test_artifacts Move benchmark config generation to build_e2e_test_artifacts Apr 25, 2023
@pzread pzread merged commit f0fac24 into iree-org:main Apr 26, 2023
jpienaar pushed a commit that referenced this pull request May 1, 2023
NatashaKnk pushed a commit to NatashaKnk/iree that referenced this pull request Jul 6, 2023