Don't require AWS access keys for S3 pytests #6556
Conversation
Also allow `AWS_PROFILE` to be passed; one of the two methods is still required.
Sadly it doesn't work yet: the test fails, and the pageserver log says it's an authentication issue.
Apparently the key does get picked up by the pageserver, but for some reason it still doesn't pick up the auth information...
Found the issue: the `HOME` env variable also had to be passed through. See the new commit. It works now locally, although I had to increase the iteration count, probably due to higher latencies from my box to S3:

```diff
diff --git a/test_runner/regress/test_tenant_delete.py b/test_runner/regress/test_tenant_delete.py
index b4e5a550f..392b8add6 100644
--- a/test_runner/regress/test_tenant_delete.py
+++ b/test_runner/regress/test_tenant_delete.py
@@ -89,7 +89,7 @@ def test_tenant_delete_smoke(
     iterations = poll_for_remote_storage_iterations(remote_storage_kind)
     assert ps_http.get_metric_value("pageserver_tenant_manager_slots") == 2
-    tenant_delete_wait_completed(ps_http, tenant_id, iterations)
+    tenant_delete_wait_completed(ps_http, tenant_id, iterations * 10)
     assert ps_http.get_metric_value("pageserver_tenant_manager_slots") == 1
     tenant_path = env.pageserver.tenant_dir(tenant_id)
```
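The fix described above, forwarding `HOME` alongside the AWS variables into the pageserver's environment, can be sketched roughly like this. The helper name and the exact variable list are illustrative assumptions, not the actual `neon_fixtures.py` code:

```python
import os

# Env vars the pageserver subprocess needs for S3 authentication.
# AWS_PROFILE-based auth makes the AWS SDK read ~/.aws/config and
# ~/.aws/credentials, so HOME must be forwarded as well -- omitting
# it is what caused the authentication failures above.
AWS_AUTH_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_PROFILE", "HOME")

def build_pageserver_env(base_env=None):
    """Copy only the AWS-related variables from the current process
    into the environment handed to the pageserver subprocess."""
    env = dict(base_env or {})
    for var in AWS_AUTH_VARS:
        value = os.environ.get(var)
        if value is not None:
            env[var] = value
    return env
```

Variables that are unset in the parent process are simply skipped, so the same helper works for both the access-key and the profile-based auth methods.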
2376 tests run: 2263 passed, 0 failed, 113 skipped (full report)
Flaky tests (8): Postgres 16, Postgres 15, Postgres 14
Code coverage (full report)
The comment gets automatically updated with the latest test results.
d2e1d44 at 2024-02-01T18:35:15.038Z :recycle:
Nice! It's going to be much more convenient to use it locally.
Co-authored-by: Alexander Bayandin <alexander@neon.tech>
Noticed this while debugging a test failure in #8673, which only occurs with real S3 instead of mock S3: if you authenticate to S3 via `AWS_PROFILE`, the `HOME` env var must be set so that the SDK can read inside the `~/.aws` directory. The scrubber abstraction `StorageScrubber::scrubber_cli` in `neon_fixtures.py` would otherwise not work. My earlier PR #6556 did similar things for the `neon_local` wrapper. You can try:

```
aws sso login --profile dev
export ENABLE_REAL_S3_REMOTE_STORAGE=y REMOTE_STORAGE_S3_BUCKET=neon-github-ci-tests REMOTE_STORAGE_S3_REGION=eu-central-1 AWS_PROFILE=dev
RUST_BACKTRACE=1 BUILD_TYPE=debug DEFAULT_PG_VERSION=16 ./scripts/pytest -vv --tb=short -k test_scrubber_tenant_snapshot
```

before and after this patch: this patch fixes it.
Don't require AWS access keys (`AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`) for S3 usage in the pytests, and also allow `AWS_PROFILE` to be passed.
One of the two methods is still required, however.
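The "one of the two methods" requirement could be checked with a fail-fast helper like the following; the function name and error message are a sketch, not the actual fixture code:

```python
import os

def check_s3_credentials():
    """Fail fast unless either static access keys or an AWS profile
    is available for the S3-backed tests (one of the two is required)."""
    have_keys = (
        "AWS_ACCESS_KEY_ID" in os.environ
        and "AWS_SECRET_ACCESS_KEY" in os.environ
    )
    have_profile = "AWS_PROFILE" in os.environ
    if not (have_keys or have_profile):
        raise RuntimeError(
            "Set AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY or AWS_PROFILE "
            "to run tests against real S3"
        )
    return "keys" if have_keys else "profile"
```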
This allows local development like:
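For example, a local run could look like the following. The bucket, region, and profile names mirror the command quoted earlier in this thread and are specific to Neon's CI setup; substitute your own. The login and pytest steps are commented out so the fragment can be sourced standalone:

```shell
# Authenticate via SSO once, then point the pytests at real S3.
# aws sso login --profile dev
export ENABLE_REAL_S3_REMOTE_STORAGE=y
export REMOTE_STORAGE_S3_BUCKET=neon-github-ci-tests
export REMOTE_STORAGE_S3_REGION=eu-central-1
export AWS_PROFILE=dev
# BUILD_TYPE=debug DEFAULT_PG_VERSION=16 ./scripts/pytest -vv --tb=short -k test_tenant_delete_smoke
echo "using AWS_PROFILE=$AWS_PROFILE"
```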
Related earlier PR for the cargo unit tests of the `remote_storage` crate: #6202