
chore(NA): rebalance x-pack cigroups #84099

Merged: 13 commits merged into elastic:master on Nov 24, 2020

Conversation

mistic (Member) commented Nov 23, 2020

The current CI groups are not well balanced: ciGroup1 takes about double the time of ciGroup5.
I first tried rebalancing without creating a new ciGroup, but that turned out not to be possible because either ciGroup1 or ciGroup5 would end up too long. Instead, I've created a new x-pack ciGroup11 and moved two long tasks from ciGroup1 into ciGroup11.
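For readers unfamiliar with how Kibana assigns functional test suites to CI groups: a suite is tagged with its group in the suite's top-level describe block, so moving a long task between groups is essentially a one-line tag change. The sketch below is illustrative only and not taken from this PR's diff (the file path, import path, and suite name are hypothetical), assuming Kibana's usual FTR `this.tags(...)` convention:

```ts
// x-pack/test/example_api_integration/apis/index.ts — hypothetical file for illustration
import { FtrProviderContext } from '../../ftr_provider_context';

export default function ({ loadTestFile }: FtrProviderContext) {
  describe('example long-running API integration tests', function () {
    // Was: this.tags('ciGroup1');
    // Tagging the suite with ciGroup11 moves it into the new x-pack CI group,
    // shortening ciGroup1 without pushing ciGroup5 over its time budget.
    this.tags('ciGroup11');

    loadTestFile(require.resolve('./example'));
  });
}
```

On the Jenkins side, the pipeline also has to schedule eleven x-pack ciGroups instead of ten (see the "correctly create 11 xpack ci groups" commit and the vars/tasks.groovy review thread below) so the new group actually runs.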

mistic self-assigned this Nov 24, 2020
mistic added the Team:Operations (Team label for Operations Team), v7.11.0, v8.0.0, chore, Feature:CI (Continuous integration), and release_note:skip (Skip the PR/issue when compiling release notes) labels Nov 24, 2020
mistic marked this pull request as ready for review November 24, 2020 14:33
mistic requested review from a team as code owners November 24, 2020 14:34
elasticmachine (Contributor) commented:

Pinging @elastic/kibana-operations (Team:Operations)

dmlemeshko (Member) left a comment:

LGTM

brianseeders (Contributor) commented:

There's a chance that this is going to cause stability problems / flaky failures for the ES snapshot and code coverage jobs. They aren't using the "tasks" framework that the tracked-branch CI jobs use, so they will have fewer resources available for running the x-pack ciGroups in parallel. I suppose we could make the machines for those jobs bigger if we need to; they don't run very often compared to normal CI and PRs.

mistic (Member, Author) commented Nov 24, 2020

@brianseeders in that case, is your suggestion to go ahead with it and resize the machines for both the snapshot and coverage jobs later if we need to? Or should we resize those machines right now?

patrykkopycinski (Contributor) left a comment:

SIEM/Endpoint LGTM

spalger (Contributor) left a comment:

LGTM

brianseeders (Contributor) commented:

> @brianseeders in that case, is your suggestion to go ahead with it and resize the machines for both the snapshot and coverage jobs later if we need to? Or should we resize those machines right now?

@mistic That's probably okay. I would kick off some jobs after merging this one and keep an eye out.

(A review comment on vars/tasks.groovy was marked as outdated and resolved.)
mistic (Member, Author) commented Nov 24, 2020

@elasticmachine merge upstream

kibanamachine (Contributor) commented:

💚 Build Succeeded

Metrics [docs]: ✅ unchanged

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

mistic merged commit 9ee1ec7 into elastic:master Nov 24, 2020
mistic (Member, Author) commented Nov 24, 2020

7.x: a357416

mistic added a commit to mistic/kibana that referenced this pull request Nov 24, 2020
* chore(NA): rebalance cigroup1 into cigroup5

* chore(NA): get list api integration into cigropup1 again

* chore(NA): get apm integration basic into cigropup1 again

* chore(NA): move back apm_api_integration trial tests into ciGroup1

* chore(NA): move exception operators data types into ciGroup1 again

* chore(NA): move detection engine api security and spaces back into ciGroup1

* chore(NA): add a new xpack cigroup11

* chore(NA): correctly create 11 xpack ci groups

* chore(NA): try to balance ciGroup2 and 8

* chore(NA): reset number of xpack parallel worker builds to 10

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
# Conflicts:
#	vars/kibanaCoverage.groovy
mistic added a commit that referenced this pull request Nov 24, 2020
* chore(NA): rebalance cigroup1 into cigroup5

* chore(NA): get list api integration into cigropup1 again

* chore(NA): get apm integration basic into cigropup1 again

* chore(NA): move back apm_api_integration trial tests into ciGroup1

* chore(NA): move exception operators data types into ciGroup1 again

* chore(NA): move detection engine api security and spaces back into ciGroup1

* chore(NA): add a new xpack cigroup11

* chore(NA): correctly create 11 xpack ci groups

* chore(NA): try to balance ciGroup2 and 8

* chore(NA): reset number of xpack parallel worker builds to 10

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
# Conflicts:
#	vars/kibanaCoverage.groovy
Labels
chore · Feature:CI (Continuous integration) · release_note:skip (Skip the PR/issue when compiling release notes) · Team:Operations (Team label for Operations Team) · v7.11.0 · v8.0.0
7 participants