This repository has been archived by the owner on Sep 17, 2024. It is now read-only.

chore: rename ELASTIC_AGENT_USE_CI_SNAPSHOTS #566

Merged
merged 3 commits into from
Jan 7, 2021

Conversation

mdelapenya
Contributor

@mdelapenya mdelapenya commented Jan 7, 2021

⚠️THIS IS A BREAKING CHANGE⚠️

What does this PR do?

It renames the ELASTIC_AGENT_USE_CI_SNAPSHOTS variable to USE_CI_SNAPSHOTS, widening its scope and making it easier to reuse across different test suites.
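For anyone invoking the suites, the migration is just renaming the variable in your environment or scripts (a minimal sketch; the `export`/`echo` wrapper below is illustrative, not taken from the repo):

```shell
# Before this change (agent-specific name):
# export ELASTIC_AGENT_USE_CI_SNAPSHOTS=true

# After this change (suite-agnostic name):
export USE_CI_SNAPSHOTS=true
echo "USE_CI_SNAPSHOTS=${USE_CI_SNAPSHOTS}"
```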

It also includes a small refactor of the logic that resolves the Docker namespace: when using CI snapshots we use observability-ci, otherwise we use beats.

Why is it important?

We consume this variable in both the metricbeat and elastic-agent test suites, so a suite-agnostic name makes its purpose clearer.

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have made corresponding changes to the default configuration files
  • I have added tests that prove my fix is effective or that my feature works
  • I have run the Unit tests for the CLI, and they are passing locally
  • I have run the End-2-End tests for the suite I'm working on, and they are passing locally
  • I have noticed new Go dependencies (run make notice in the proper directory)

How to test this?

SUITE="metricbeat" METRICBEAT_VERSION="pr-23142" \
    USE_CI_SNAPSHOTS=true TAGS="metricbeat" DEVELOPER_MODE=true \
    TIMEOUT_FACTOR=3 LOG_LEVEL=TRACE \
    make -C e2e functional-test

Related issues

Follow-ups

We must merge a PR in Beats with the update, as I could not find a simple way to avoid breaking the downstream project (Beats) after this change.

Needs backport to 7.x and 7.10.x

@mdelapenya mdelapenya self-assigned this Jan 7, 2021
@mdelapenya mdelapenya requested a review from a team January 7, 2021 07:29
@elasticmachine
Contributor

💔 Tests Failed


Build stats

  • Build Cause: Pull request #566 opened

  • Start Time: 2021-01-07T07:29:54.589+0000

  • Duration: 24 min 36 sec

Test stats 🧪

Test Results
Failed 2
Passed 82
Skipped 8
Total 92

Test errors 2


Initializing / Pre-Submit / Sanity checks / go vet – pre_commit.lint
  • error (no stacktrace)

Initializing / Pre-Submit / Sanity checks / golangcilint – pre_commit.lint
  • error (no stacktrace)

Log output

Last 100 lines of log output:

[2021-01-07T07:51:56.095Z] time="2021-01-07T07:51:55Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:51:56.095Z] time="2021-01-07T07:51:55Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=40.738337037s hostname=0a40f5a80e23 isAgentInStatus=false retry=11 status=offline
[2021-01-07T07:52:01.382Z] time="2021-01-07T07:52:01Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:01.382Z] time="2021-01-07T07:52:01Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=46.326774713s hostname=0a40f5a80e23 isAgentInStatus=false retry=12 status=offline
[2021-01-07T07:52:03.923Z] time="2021-01-07T07:52:03Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:03.923Z] time="2021-01-07T07:52:03Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=49.002133713s hostname=0a40f5a80e23 isAgentInStatus=false retry=13 status=offline
[2021-01-07T07:52:12.061Z] time="2021-01-07T07:52:10Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:12.061Z] time="2021-01-07T07:52:10Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=55.803158394s hostname=0a40f5a80e23 isAgentInStatus=false retry=14 status=offline
[2021-01-07T07:52:14.603Z] time="2021-01-07T07:52:14Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:14.603Z] time="2021-01-07T07:52:14Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=59.64684584s hostname=0a40f5a80e23 isAgentInStatus=false retry=15 status=offline
[2021-01-07T07:52:21.184Z] time="2021-01-07T07:52:20Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:21.184Z] time="2021-01-07T07:52:20Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m5.4030229s hostname=0a40f5a80e23 isAgentInStatus=false retry=16 status=offline
[2021-01-07T07:52:24.483Z] time="2021-01-07T07:52:23Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:24.483Z] time="2021-01-07T07:52:23Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m9.187001643s hostname=0a40f5a80e23 isAgentInStatus=false retry=17 status=offline
[2021-01-07T07:52:27.779Z] time="2021-01-07T07:52:27Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:27.779Z] time="2021-01-07T07:52:27Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m12.642739251s hostname=0a40f5a80e23 isAgentInStatus=false retry=18 status=offline
[2021-01-07T07:52:33.062Z] time="2021-01-07T07:52:32Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:33.062Z] time="2021-01-07T07:52:32Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m18.151035877s hostname=0a40f5a80e23 isAgentInStatus=false retry=19 status=offline
[2021-01-07T07:52:39.643Z] time="2021-01-07T07:52:39Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:39.643Z] time="2021-01-07T07:52:39Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m24.765731111s hostname=0a40f5a80e23 isAgentInStatus=false retry=20 status=offline
[2021-01-07T07:52:46.225Z] time="2021-01-07T07:52:45Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:46.225Z] time="2021-01-07T07:52:45Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m30.818768435s hostname=0a40f5a80e23 isAgentInStatus=false retry=21 status=offline
[2021-01-07T07:52:48.767Z] time="2021-01-07T07:52:48Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:48.767Z] time="2021-01-07T07:52:48Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m33.533695473s hostname=0a40f5a80e23 isAgentInStatus=false retry=22 status=offline
[2021-01-07T07:52:54.050Z] time="2021-01-07T07:52:53Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:52:54.050Z] time="2021-01-07T07:52:53Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m38.799345392s hostname=0a40f5a80e23 isAgentInStatus=false retry=23 status=offline
[2021-01-07T07:53:02.182Z] time="2021-01-07T07:53:00Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:53:02.183Z] time="2021-01-07T07:53:01Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m46.274667904s hostname=0a40f5a80e23 isAgentInStatus=false retry=24 status=offline
[2021-01-07T07:53:07.471Z] time="2021-01-07T07:53:07Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:53:07.471Z] time="2021-01-07T07:53:07Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m52.579219827s hostname=0a40f5a80e23 isAgentInStatus=false retry=25 status=offline
[2021-01-07T07:53:11.671Z] time="2021-01-07T07:53:11Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:53:11.671Z] time="2021-01-07T07:53:11Z" level=warning msg="The Agent is not in the offline status yet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d elapsedTime=1m56.595117047s hostname=0a40f5a80e23 isAgentInStatus=false retry=26 status=offline
[2021-01-07T07:53:18.255Z] time="2021-01-07T07:53:17Z" level=debug msg="Agent listed in Fleet with online status" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:53:18.255Z] time="2021-01-07T07:53:17Z" level=info msg="The Agent is in the desired status" elapsedTime=2m2.95067562s hostname=0a40f5a80e23 isAgentInStatus=true retries=27 status=offline
[2021-01-07T07:53:18.255Z] time="2021-01-07T07:53:17Z" level=debug msg="Agent build hash found" commitFile=/elastic-agent/.elastic-agent.active.commit containerName=0a40f5a80e23 hash=11c5367b33fb0a92d93c24fb6d382c1795aed5a5 shortHash=11c536
[2021-01-07T07:53:18.829Z] cat: /opt/Elastic/Agent/elastic-agent.log: No such file or directory
[2021-01-07T07:53:18.829Z] time="2021-01-07T07:53:18Z" level=error msg="Could not execute command in container" command="[cat /opt/Elastic/Agent/elastic-agent.log]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd cat /opt/Elastic/Agent/elastic-agent.log]. exit status 1" service=debian-systemd
[2021-01-07T07:53:18.829Z] time="2021-01-07T07:53:18Z" level=error msg="Could not get agent logs in the container" command="[cat /opt/Elastic/Agent/elastic-agent.log]" containerName=0a40f5a80e23 error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd cat /opt/Elastic/Agent/elastic-agent.log]. exit status 1" hash=11c536
[2021-01-07T07:53:19.772Z] OCI runtime exec failed: exec failed: container_linux.go:370: starting container process caused: exec: "elastic-agent": executable file not found in $PATH: unknown
[2021-01-07T07:53:20.032Z] time="2021-01-07T07:53:19Z" level=error msg="Could not execute command in container" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" service=debian-systemd
[2021-01-07T07:53:20.032Z] time="2021-01-07T07:53:19Z" level=error msg="Could not run agent command in the box" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" profile=fleet service=debian-systemd
[2021-01-07T07:53:20.032Z] time="2021-01-07T07:53:19Z" level=error msg="Could not uninstall the agent"
[2021-01-07T07:53:20.032Z] time="2021-01-07T07:53:19Z" level=debug msg="Un-enrolling agent in Fleet" agentID=1c616cc0-50bd-11eb-9765-af232417a94d hostname=0a40f5a80e23
[2021-01-07T07:53:20.972Z] time="2021-01-07T07:53:20Z" level=debug msg="Fleet agent was unenrolled" agentID=1c616cc0-50bd-11eb-9765-af232417a94d
[2021-01-07T07:53:21.912Z] Stopping fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-07T07:53:22.482Z] 
Stopping fleet_debian-systemd_elastic-agent_1 ... done
Removing fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-07T07:53:22.482Z] 
Removing fleet_debian-systemd_elastic-agent_1 ... done
Going to remove fleet_debian-systemd_elastic-agent_1
[2021-01-07T07:53:22.742Z] time="2021-01-07T07:53:22Z" level=debug msg="Docker compose executed." cmd="[rm -fvs debian-systemd]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/services/debian-systemd/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz973679014 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz973679014 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-07T07:53:22.742Z] time="2021-01-07T07:53:22Z" level=debug msg="Service removed from compose" profile=fleet service=debian-systemd
[2021-01-07T07:53:24.124Z] time="2021-01-07T07:53:23Z" level=debug msg="The token was deleted" tokenID=17b63430-50bd-11eb-9765-af232417a94d
[2021-01-07T07:53:24.124Z] time="2021-01-07T07:53:23Z" level=info msg="Integration deleted from the configuration" integration= packageConfigId= policyID=2d4eaf90-50bb-11eb-9765-af232417a94d version=
[2021-01-07T07:53:24.124Z] time="2021-01-07T07:53:23Z" level=debug msg="Destroying Fleet runtime dependencies"
[2021-01-07T07:53:25.065Z] Stopping fleet_kibana_1           ... 
[2021-01-07T07:53:25.065Z] Stopping fleet_package-registry_1 ... 
[2021-01-07T07:53:25.065Z] Stopping fleet_elasticsearch_1    ... 
[2021-01-07T07:53:26.465Z] 
Stopping fleet_kibana_1           ... done

Stopping fleet_package-registry_1 ... done

Stopping fleet_elasticsearch_1    ... done
Removing fleet_kibana_1           ... 
[2021-01-07T07:53:26.465Z] Removing fleet_package-registry_1 ... 
[2021-01-07T07:53:26.465Z] Removing fleet_elasticsearch_1    ... 
[2021-01-07T07:53:26.465Z] 
Removing fleet_package-registry_1 ... done

Removing fleet_kibana_1           ... done

Removing fleet_elasticsearch_1    ... done
Removing network fleet_default
[2021-01-07T07:53:26.725Z] time="2021-01-07T07:53:26Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/.op/compose/profiles/fleet/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz973679014 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz973679014 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-07T07:53:26.725Z] time="2021-01-07T07:53:26Z" level=debug msg="Elastic Agent binary was removed." installer=centos-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-x86_64.rpm993083555
[2021-01-07T07:53:26.725Z] time="2021-01-07T07:53:26Z" level=debug msg="Elastic Agent binary was removed." installer=centos-tar path=/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz973679014
[2021-01-07T07:53:26.725Z] time="2021-01-07T07:53:26Z" level=debug msg="Elastic Agent binary was removed." installer=debian-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-amd64.deb410102989
[2021-01-07T07:53:26.725Z] <?xml version="1.0" encoding="UTF-8"?>
[2021-01-07T07:53:26.725Z] <testsuites name="main" tests="18" skipped="0" failures="0" errors="0" time="1075.506611204">
[2021-01-07T07:53:26.725Z]   <testsuite name="Fleet Mode Agent" tests="18" skipped="0" failures="0" errors="0" time="936.805851618">
[2021-01-07T07:53:26.725Z]     <testcase name="Deploying the centos agent" status="passed" time="37.564991602"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Deploying the debian agent" status="passed" time="20.544805552"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Deploying the centos agent with enroll and then run on rpm and deb" status="passed" time="25.283743393"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Deploying the debian agent with enroll and then run on rpm and deb" status="passed" time="37.47957397"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Stopping the centos agent stops backend processes" status="passed" time="11.54279978"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Stopping the debian agent stops backend processes" status="passed" time="11.58351258"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Restarting the installed centos agent" status="passed" time="35.989727804"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Restarting the installed debian agent" status="passed" time="20.708617409"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Restarting the centos host with persistent agent restarts backend processes" status="passed" time="24.453419892"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Restarting the debian host with persistent agent restarts backend processes" status="passed" time="22.391730814"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Un-enrolling the centos agent" status="passed" time="13.343060391"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Un-enrolling the debian agent" status="passed" time="11.248195621"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Re-enrolling the centos agent" status="passed" time="48.097663722"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Re-enrolling the debian agent" status="passed" time="42.487721019"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Revoking the enrollment token for the centos agent" status="passed" time="29.861087471"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Revoking the enrollment token for the debian agent" status="passed" time="18.755083513"></testcase>
[2021-01-07T07:53:26.725Z]     <testcase name="Un-installing the installed centos agent" status="passed" time="159.166744586"></testcase>
[2021-01-07T07:53:26.726Z]     <testcase name="Un-installing the installed debian agent" status="passed" time="133.825968478"></testcase>
[2021-01-07T07:53:26.726Z]   </testsuite>
[2021-01-07T07:53:26.726Z] </testsuites>+ sed -e 's/^[ \t]*//; s#>.*failed$#>#g' outputs/TEST-fleet
[2021-01-07T07:53:26.726Z] + grep -E '^<.*>$'
[2021-01-07T07:53:26.726Z] + exit 0
[2021-01-07T07:53:26.839Z] Recording test results
[2021-01-07T07:53:27.278Z] [Checks API] No suitable checks publisher found.
[2021-01-07T07:53:27.299Z] Archiving artifacts
[2021-01-07T07:53:28.745Z] Stage "Release" skipped due to when conditional
[2021-01-07T07:53:29.813Z] Running on worker-1244230 in /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-566
[2021-01-07T07:53:29.876Z] [INFO] getVaultSecret: Getting secrets
[2021-01-07T07:53:29.947Z] Masking supported pattern matches of $VAULT_ADDR or $VAULT_ROLE_ID or $VAULT_SECRET_ID
[2021-01-07T07:53:32.032Z] + chmod 755 generate-build-data.sh
[2021-01-07T07:53:32.032Z] + ./generate-build-data.sh https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-566/ https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-566/runs/1 UNSTABLE 1416051
[2021-01-07T07:53:32.032Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-566/runs/1/steps/?limit=10000 -o steps-info.json
[2021-01-07T07:53:32.730Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-566/runs/1/tests/?status=FAILED -o tests-errors.json
[2021-01-07T07:53:33.429Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-566/runs/1/log/ -o pipeline-log.txt

🐛 Flaky test report

❕ There are test failures but not known flaky tests.


Test stats 🧪

Test Results
Failed 2
Passed 82
Skipped 8
Total 92

Genuine test errors 2

💔 There are test failures but not known flaky tests, most likely a genuine test failure.

  • Name: Initializing / Pre-Submit / Sanity checks / go vet – pre_commit.lint
  • Name: Initializing / Pre-Submit / Sanity checks / golangcilint – pre_commit.lint

@mdelapenya mdelapenya marked this pull request as ready for review January 7, 2021 09:57
@mdelapenya mdelapenya merged commit 4c1cea5 into elastic:master Jan 7, 2021
mdelapenya added a commit to mdelapenya/e2e-testing that referenced this pull request Jan 7, 2021
* chore: rename variable to a wider scope

* chore: extract common logic to a function

* chore: rename variable
# Conflicts:
#	.ci/Jenkinsfile
#	e2e/_suites/metricbeat/README.md
mdelapenya added a commit that referenced this pull request Jan 7, 2021
…version before switching to the configuration file (#565) backport for 7.x (#568)

* chore: rename ELASTIC_AGENT_USE_CI_SNAPSHOTS (#566)

* chore: rename variable to a wider scope

* chore: extract common logic to a function

* chore: rename variable
# Conflicts:
#	.ci/Jenkinsfile
#	e2e/_suites/metricbeat/README.md

* fix: set image version before switching to the configuration file (#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
mdelapenya added a commit that referenced this pull request Jan 7, 2021
…version before switching to the configuration file (#565) backport for 7.10.x (#569)

* chore: rename ELASTIC_AGENT_USE_CI_SNAPSHOTS (#566)

* chore: rename variable to a wider scope

* chore: extract common logic to a function

* chore: rename variable
# Conflicts:
#	.ci/Jenkinsfile
#	e2e/_suites/metricbeat/README.md

* fix: set image version before switching to the configuration file (#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
@mdelapenya mdelapenya deleted the rename-snapshots-variable branch January 11, 2021 17:20