
[Logs Explorer] Fix flaky test - Number of installed integrations do not match #188723

Conversation

@awahab07 (Contributor) commented Jul 19, 2024

Fixes #188676
Fixes #187556
Should also fix #182017

Summary

The PR attempts to fix the flaky behavior reported in the parent issue.

@awahab07 added the release_note:skip (Skip the PR/issue when compiling release notes) and Team:obs-ux-logs (Observability Logs User Experience Team) labels on Jul 19, 2024
@awahab07 requested a review from a team as a code owner on July 19, 2024 at 09:00
@elasticmachine (Contributor)

Pinging @elastic/obs-ux-logs-team (Team:obs-ux-logs)

@obltmachine

🤖 GitHub comments


Just comment with:

  • /oblt-deploy : Deploy a Kibana instance using the Observability test environments.
  • run docs-build : Re-trigger the docs validation. (use unformatted text in the comment!)

@yngrdyn (Contributor) commented Jul 19, 2024

Thanks for looking into this, but I don't think that will fix the problem here.
Checking the logs, I found:

...
[00:01:45]           │ proc [kibana] [2024-07-18T15:46:57.373+00:00][ERROR][plugins.fleet] '502 Bad Gateway' error response from package registry at https://epr.elastic.co/epr/aws/aws-1.51.0.zip...
...

Maybe the problem is related to that error instead; something is happening with the aws package?

@@ -188,6 +188,13 @@ export default function ({ getService, getPageObjects }: FtrProviderContext) {
);
cleanupIntegrationsSetup =
await PageObjects.observabilityLogsExplorer.setupInitialIntegrations();

// Ensure that number of installed packages equals the initial packages
await retry.try(async () => {

Review comment from a Contributor:

Note

This change adds a new assertion without modifying the one that makes the test flaky. How does this resolve the test flakiness?
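
For context, a hedged reconstruction of what the truncated hunk above appears to add; the body of the retry and the getInstalledPackages helper are assumptions, not code copied from the PR:

// Assumed reconstruction of the new assertion in the setup hook.
// `getInstalledPackages` is a hypothetical helper standing in for however
// the suite reads the packages Fleet actually installed.
await retry.try(async () => {
  const installedPackages = await getInstalledPackages();
  // Retry until the installed packages match the initial set.
  expect(installedPackages.length).to.be(initialPackagesTexts.length);
});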

@kibanamachine (Contributor)

Flaky Test Runner Stats

🎉 All tests passed! - kibana-flaky-test-suite-runner#6591

[✅] x-pack/test/functional/apps/observability_logs_explorer/config.ts: 25/25 tests passed.
[✅] x-pack/test_serverless/functional/test_suites/observability/config.ts: 25/25 tests passed.

see run history

@elasticmachine (Contributor)

💚 Build Succeeded

Metrics [docs]

✅ unchanged

@awahab07 (Contributor Author)

Thanks for your input @yngrdyn and @tonyghiani.

Any suggestions to make it deterministic, so that we always have the intended packages installed?

Or, if we can't control that, does it make sense to only check against the packages that are actually installed? E.g. instead of:

expect(integrations.length).to.be(3);
expect(integrations).to.eql(initialPackagesTexts);

we use getInstalledPackages and do:

expect(integrations.length).to.be(installedPackagesLength);
expect(integrations).to.eql(installedPackagesTexts);
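
As a rough sketch of that idea (not the PR's code; getInstalledPackages and getIntegrationsShownInUi are hypothetical helper names, and retry is the usual FTR retry service from getService('retry')):

import expect from '@kbn/expect';

it('shows the integrations that were actually installed', async () => {
  // Hypothetical helper: read what Fleet really installed during setup.
  const installedPackagesTexts = await getInstalledPackages();

  await retry.try(async () => {
    // Hypothetical helper: read the integrations listed in the UI.
    const integrations = await getIntegrationsShownInUi();
    // Compare against the real installation state instead of a hard-coded 3.
    expect(integrations.length).to.be(installedPackagesTexts.length);
    expect(integrations).to.eql(installedPackagesTexts);
  });
});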

@kibanamachine (Contributor)

Flaky Test Runner Stats

🎉 All tests passed! - kibana-flaky-test-suite-runner#6600

[✅] x-pack/test/functional/apps/observability_logs_explorer/config.ts: 100/100 tests passed.
[✅] x-pack/test_serverless/functional/test_suites/observability/config.ts: 100/100 tests passed.

see run history

);

// Skip the suite if Apache HTTP Server integration is not available
if (!installedPackagesTexts.includes(initialPackageMap.apache)) {

Review comment from a Contributor:

I don't get this part.
The test says "clicking on Apache HTTP Server integration and moving into the second navigation level".

This means Apache HTTP Server must be present, so why are we skipping the test if it's not? If it's not present, doesn't that mean we have not set up the environment properly?

Reply from the Contributor Author:

The Apache integration is installed as a prerequisite, but once in a while the package installation can fail for unexpected reasons, e.g. as mentioned here.

So either we need a way that guarantees the package will always be installed, or we explicitly check whether the installation is present in the dependent tests (which is what is done here).

I am open to suggestions if there's a better way.
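
For reference, a minimal sketch of such a guard, assuming a Mocha-style before hook and a hypothetical getInstalledPackages helper; only initialPackageMap.apache comes from the diff:

before(async function () {
  // Hypothetical helper returning the names of the packages Fleet installed.
  const installedPackagesTexts = await getInstalledPackages();

  // Skip the dependent tests when the Apache HTTP Server package failed to
  // install (e.g. a 502 from the package registry), instead of failing them.
  if (!installedPackagesTexts.includes(initialPackageMap.apache)) {
    this.skip();
  }
});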

@achyutjhunjhunwala (Contributor) commented Jul 22, 2024:

In that case the error should not be handled; it should fail the entire test suite.

This is a clear case of the Install Package feature not working correctly. We should not handle such failures on our side; in fact, we should escalate them to Fleet if they keep happening.

@achyutjhunjhunwala (Contributor) commented Jul 22, 2024:

When we do package installation, we are actually exercising EPR's install function, and if that function is broken we must report it rather than swallowing the failure on our side.
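
Read that way, the setup would simply let a registry failure propagate rather than guarding against it. A hedged sketch of what that looks like (the hook wiring is assumed; only setupInitialIntegrations appears in the PR):

before(async () => {
  // No try/catch and no skip: if the package registry returns a 502,
  // setupInitialIntegrations() rejects and the whole suite fails loudly,
  // surfacing the Fleet/EPR problem instead of hiding it.
  cleanupIntegrationsSetup =
    await PageObjects.observabilityLogsExplorer.setupInitialIntegrations();
});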

@awahab07 (Contributor Author)

@elasticmachine merge upstream

@kibana-ci (Collaborator)

💛 Build succeeded, but was flaky

Failed CI Steps

Metrics [docs]

✅ unchanged

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@awahab07 (Contributor Author) commented Jul 22, 2024

Closing the PR, as these failures should be handled at the level where the package installation itself fails; see the review discussion above.

@awahab07 awahab07 closed this Jul 22, 2024