⚠️ CI failed ⚠️ #299

Closed · github-actions bot opened this issue Sep 2, 2022 · 5 comments

Comments

@github-actions

github-actions bot commented Sep 2, 2022

Workflow Run URL

@ncclementi
Contributor

runtime = 'coiled-0.0.4-py3.9', name = 'test_double_diff', category = 'benchmarks', last_three_duration [s] = (426.233430147171, 606.2415187358856, 397.9905481338501), duration_threshold [s] = 381.5963892449 

[Screenshot: test_double_diff duration history, 2022-09-02]

This one is a bit tricky to judge because it is quite noisy. Note that the runs that failed (x in the plot) are not used to compute the mean that we use for the analysis. This is running with runtime 0.0.4 (no packages changed, except maybe coiled), which leads me to think that this regression report is a result of noise.
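For context, here is a minimal sketch of the kind of check that produces these reports, assuming the rule is simply "mean of the last three successful durations exceeds the precomputed threshold"; the function name and exact rule are illustrative, not the actual coiled-runtime detection code:

```python
# Illustrative sketch only (assumed rule), not the real benchmark-alert code.
from statistics import mean

def exceeds_threshold(last_three_durations, duration_threshold):
    """Flag a test when the mean of the last three successful durations
    (failed runs are excluded before this point) exceeds the threshold."""
    return mean(last_three_durations) > duration_threshold

# Values reported above for test_double_diff on coiled-0.0.4-py3.9
last_three = (426.233430147171, 606.2415187358856, 397.9905481338501)
threshold = 381.5963892449
print(exceeds_threshold(last_three, threshold))  # True -> this issue was opened
```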

The only thing that makes me doubt it is that upstream, around the same date, we see an increase in durations, though not bad enough yet to call it a regression.
Upstream:

[Screenshot: upstream test_double_diff durations, 2022-09-02]

@ian-r-rose do you have any thoughts?

@ncclementi
Contributor

On upstream, even though it was not reported as a regression since the signal is very noisy, wall time seems to have improved since this was reported, although average memory seems to be increasing.

[Screenshot: upstream wall time, 2022-09-06]

[Screenshot: upstream average memory, 2022-09-06]

@gjoseph92 any thoughts on this? Trying to assess whether I should close this or keep it open.

@gjoseph92
Contributor

[Screenshot]

I wonder if this is due to

However, the movement is the opposite of what I'd expect. The first PR should have made runtime go down from baseline, then the follow-up should have made it go back up to baseline. Unless the "fix" really didn't work properly.

I also would expect it to be affected by dask/distributed#6975.

It's kind of unbelievable how much noisiness and variance there is on 0.0.4. I would say that this workload is simply unrunnable on those older versions.

@ncclementi
Contributor

> It's kind of unbelievable how much noisiness and variance there is on 0.0.4. I would say that this workload is simply unrunnable on those older versions.

Sure, but the pictures in my latest comment are from upstream (see https://coiled.github.io/coiled-runtime/coiled-upstream-py3.9.html), and I see that wall time went down on 09/02 but average memory went up. Is this expected?

@ncclementi
Contributor

Closing, as current upstream looks like this:
[Screenshot: upstream test_double_diff durations, 2022-11-02]
