This repository has been archived by the owner on Sep 17, 2024. It is now read-only.

fix: set image version before switching to the configuration file #565

Merged

Conversation

mdelapenya
Contributor

@mdelapenya mdelapenya commented Jan 7, 2021

What does this PR do?

It switches the order of two operations: 1) setting the version of Metricbeat's Docker image, and 2) defining which branch/tag will be used to download the configuration file.

Why is it important?

It fixes a bug in the Metricbeat configuration tests when they are triggered by a PR from Beats: the tests use the CI snapshots, but because the version starts with pr-, it was being replaced with the base version of the maintenance branch, when we still want to use the PR's version (in the form of its Docker image, downloaded from the observability-ci registry).
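The ordering matters because the branch resolution rewrites the version variable, so the pr- tag must be captured for the Docker image before it can be replaced. A minimal Go sketch of the fixed flow (the function name, fallback branch, and flow are hypothetical, for illustration only, not the repository's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// resolveConfigBranch maps a version to the branch or tag used to download
// the Metricbeat configuration file. PR versions ("pr-NNNN") have no matching
// branch upstream, so they fall back to the maintenance branch.
// This helper is hypothetical, named here only to illustrate the ordering.
func resolveConfigBranch(version string) string {
	if strings.HasPrefix(version, "pr-") {
		return "master" // hypothetical fallback to the maintenance branch
	}
	return version
}

func main() {
	version := "pr-23142" // version coming from a Beats PR

	// Buggy order: overwriting the version with the resolved branch first
	// loses the "pr-" tag needed to pull the image from observability-ci.
	// Fixed order: pin the Docker image version BEFORE resolving the
	// branch for the configuration file.
	imageVersion := version                      // keeps "pr-23142" for the image
	configBranch := resolveConfigBranch(version) // may fall back to "master"

	fmt.Println(imageVersion, configBranch)
}
```

With this ordering, the image version keeps the pr- tag while only the configuration download falls back to the maintenance branch.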

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have made corresponding changes to the default configuration files
  • I have added tests that prove my fix is effective or that my feature works
  • I have run the Unit tests for the CLI, and they are passing locally
  • I have run the End-2-End tests for the suite I'm working on, and they are passing locally
  • I have noticed new Go dependencies (run make notice in the proper directory)

How to test this PR locally

SUITE="metricbeat" METRICBEAT_VERSION="pr-23142" \
    ELASTIC_AGENT_USE_CI_SNAPSHOTS=true TAGS="metricbeat" DEVELOPER_MODE=true \
    TIMEOUT_FACTOR=3 LOG_LEVEL=TRACE \
    make -C e2e functional-test

Related issues

Follow-ups

We still need to download the proper config file from the PR, but we do not have access to the commit SHA at this moment, unless other modifications are in place, like reading the SHA from Jenkins.

@mdelapenya mdelapenya self-assigned this Jan 7, 2021
@mdelapenya mdelapenya requested a review from a team January 7, 2021 06:28
@mdelapenya mdelapenya marked this pull request as ready for review January 7, 2021 06:29
@mdelapenya mdelapenya added the bug Something isn't working label Jan 7, 2021
@elasticmachine
Contributor

💔 Tests Failed


Build stats

  • Build Cause: Pull request #565 opened

  • Start Time: 2021-01-07T06:29:09.388+0000

  • Duration: 28 min 58 sec

Test stats 🧪

Test Results
Failed 3
Passed 80
Skipped 9
Total 92

Test errors 3

Initializing / End-To-End Tests / fleet_fleet_mode_agent / Un-installing the installed centos agent – Fleet Mode Agent
  • Error: Step the agent is listed in Fleet as "offline": The Agent is not in the offline status yet
  • no stacktrace
Initializing / Pre-Submit / Sanity checks / go vet – pre_commit.lint
  • Error: error
  • no stacktrace
Initializing / Pre-Submit / Sanity checks / golangcilint – pre_commit.lint
  • Error: error
  • no stacktrace

Steps errors 3

Expand to view the steps failures

Run functional tests for fleet:fleet_mode_agent
  • Took 22 min 17 sec
  • Description: .ci/scripts/functional-test.sh "fleet" "fleet_mode_agent" "8.0.0-SNAPSHOT" "8.0.0-SNAPSHOT"
Archive the artifacts
  • Took 0 min 0 sec
  • Description: [2021-01-07T06:57:02.716Z] Archiving artifacts hudson.AbortException: script returned exit code 1
Error signal
  • Took 0 min 0 sec
  • Description: hudson.AbortException: script returned exit code 1

Log output

Last 100 lines of log output:

[2021-01-07T06:55:54.651Z] time="2021-01-07T06:55:54Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:55:54.651Z] time="2021-01-07T06:55:54Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m19.556479254s hostname=37241c20da58 isAgentInStatus=false retry=21 status=offline
[2021-01-07T06:56:01.347Z] time="2021-01-07T06:56:01Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:01.347Z] time="2021-01-07T06:56:01Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m26.230331024s hostname=37241c20da58 isAgentInStatus=false retry=22 status=offline
[2021-01-07T06:56:05.614Z] time="2021-01-07T06:56:05Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:05.614Z] time="2021-01-07T06:56:05Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m30.515575333s hostname=37241c20da58 isAgentInStatus=false retry=23 status=offline
[2021-01-07T06:56:09.806Z] time="2021-01-07T06:56:09Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:09.806Z] time="2021-01-07T06:56:09Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m34.816499616s hostname=37241c20da58 isAgentInStatus=false retry=24 status=offline
[2021-01-07T06:56:13.997Z] time="2021-01-07T06:56:13Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:13.997Z] time="2021-01-07T06:56:13Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m38.62641945s hostname=37241c20da58 isAgentInStatus=false retry=25 status=offline
[2021-01-07T06:56:17.283Z] time="2021-01-07T06:56:17Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:17.283Z] time="2021-01-07T06:56:17Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m42.266910899s hostname=37241c20da58 isAgentInStatus=false retry=26 status=offline
[2021-01-07T06:56:22.554Z] time="2021-01-07T06:56:22Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:22.554Z] time="2021-01-07T06:56:22Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m47.636578148s hostname=37241c20da58 isAgentInStatus=false retry=27 status=offline
[2021-01-07T06:56:27.928Z] time="2021-01-07T06:56:26Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:27.928Z] time="2021-01-07T06:56:27Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m52.183245128s hostname=37241c20da58 isAgentInStatus=false retry=28 status=offline
[2021-01-07T06:56:32.118Z] time="2021-01-07T06:56:32Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:32.118Z] time="2021-01-07T06:56:32Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=1m57.269619492s hostname=37241c20da58 isAgentInStatus=false retry=29 status=offline
[2021-01-07T06:56:36.307Z] time="2021-01-07T06:56:35Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:36.307Z] time="2021-01-07T06:56:35Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=2m0.696454269s hostname=37241c20da58 isAgentInStatus=false retry=30 status=offline
[2021-01-07T06:56:40.496Z] time="2021-01-07T06:56:39Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:40.496Z] time="2021-01-07T06:56:39Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=2m4.891136616s hostname=37241c20da58 isAgentInStatus=false retry=31 status=offline
[2021-01-07T06:56:47.063Z] time="2021-01-07T06:56:46Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:47.063Z] time="2021-01-07T06:56:46Z" level=warning msg="The Agent is not in the offline status yet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 elapsedTime=2m11.567873972s hostname=37241c20da58 isAgentInStatus=false retry=32 status=offline
[2021-01-07T06:56:53.631Z] time="2021-01-07T06:56:52Z" level=debug msg="Agent listed in Fleet with online status" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:53.631Z] time="2021-01-07T06:56:52Z" level=info msg="The Agent is in the desired status" elapsedTime=2m17.648583061s hostname=37241c20da58 isAgentInStatus=true retries=33 status=offline
[2021-01-07T06:56:53.631Z] time="2021-01-07T06:56:52Z" level=debug msg="Agent build hash found" commitFile=/elastic-agent/.elastic-agent.active.commit containerName=37241c20da58 hash=11c5367b33fb0a92d93c24fb6d382c1795aed5a5 shortHash=11c536
[2021-01-07T06:56:53.631Z] cat: /opt/Elastic/Agent/elastic-agent.log: No such file or directory
[2021-01-07T06:56:53.631Z] time="2021-01-07T06:56:53Z" level=error msg="Could not execute command in container" command="[cat /opt/Elastic/Agent/elastic-agent.log]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd cat /opt/Elastic/Agent/elastic-agent.log]. exit status 1" service=debian-systemd
[2021-01-07T06:56:53.631Z] time="2021-01-07T06:56:53Z" level=error msg="Could not get agent logs in the container" command="[cat /opt/Elastic/Agent/elastic-agent.log]" containerName=37241c20da58 error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd cat /opt/Elastic/Agent/elastic-agent.log]. exit status 1" hash=11c536
[2021-01-07T06:56:54.570Z] OCI runtime exec failed: exec failed: container_linux.go:370: starting container process caused: exec: "elastic-agent": executable file not found in $PATH: unknown
[2021-01-07T06:56:54.570Z] time="2021-01-07T06:56:54Z" level=error msg="Could not execute command in container" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" service=debian-systemd
[2021-01-07T06:56:54.570Z] time="2021-01-07T06:56:54Z" level=error msg="Could not run agent command in the box" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" profile=fleet service=debian-systemd
[2021-01-07T06:56:54.570Z] time="2021-01-07T06:56:54Z" level=error msg="Could not uninstall the agent"
[2021-01-07T06:56:54.570Z] time="2021-01-07T06:56:54Z" level=debug msg="Un-enrolling agent in Fleet" agentID=3189f700-50b5-11eb-9eaf-b78530015a04 hostname=37241c20da58
[2021-01-07T06:56:56.483Z] time="2021-01-07T06:56:56Z" level=debug msg="Fleet agent was unenrolled" agentID=3189f700-50b5-11eb-9eaf-b78530015a04
[2021-01-07T06:56:57.051Z] Stopping fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-07T06:56:57.619Z] 
Stopping fleet_debian-systemd_elastic-agent_1 ... done
Removing fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-07T06:56:57.619Z] 
Removing fleet_debian-systemd_elastic-agent_1 ... done
Going to remove fleet_debian-systemd_elastic-agent_1
[2021-01-07T06:56:57.878Z] time="2021-01-07T06:56:57Z" level=debug msg="Docker compose executed." cmd="[rm -fvs debian-systemd]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/services/debian-systemd/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz682278649 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz682278649 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-07T06:56:57.879Z] time="2021-01-07T06:56:57Z" level=debug msg="Service removed from compose" profile=fleet service=debian-systemd
[2021-01-07T06:56:59.258Z] time="2021-01-07T06:56:59Z" level=debug msg="The token was deleted" tokenID=2d4a7890-50b5-11eb-9eaf-b78530015a04
[2021-01-07T06:56:59.258Z] time="2021-01-07T06:56:59Z" level=info msg="Integration deleted from the configuration" integration= packageConfigId= policyID=bad7d5c0-50b2-11eb-9eaf-b78530015a04 version=
[2021-01-07T06:56:59.258Z] time="2021-01-07T06:56:59Z" level=debug msg="Destroying Fleet runtime dependencies"
[2021-01-07T06:57:00.195Z] Stopping fleet_kibana_1           ... 
[2021-01-07T06:57:00.195Z] Stopping fleet_package-registry_1 ... 
[2021-01-07T06:57:00.195Z] Stopping fleet_elasticsearch_1    ... 
[2021-01-07T06:57:01.590Z] 
Stopping fleet_kibana_1           ... done

Stopping fleet_package-registry_1 ... done

Stopping fleet_elasticsearch_1    ... done
Removing fleet_kibana_1           ... 
[2021-01-07T06:57:01.590Z] Removing fleet_package-registry_1 ... 
[2021-01-07T06:57:01.590Z] Removing fleet_elasticsearch_1    ... 
[2021-01-07T06:57:01.849Z] 
Removing fleet_kibana_1           ... done

Removing fleet_package-registry_1 ... done

Removing fleet_elasticsearch_1    ... done
Removing network fleet_default
[2021-01-07T06:57:02.108Z] time="2021-01-07T06:57:01Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/.op/compose/profiles/fleet/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz682278649 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz682278649 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-07T06:57:02.108Z] time="2021-01-07T06:57:01Z" level=debug msg="Elastic Agent binary was removed." installer=debian-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-amd64.deb593691140
[2021-01-07T06:57:02.108Z] time="2021-01-07T06:57:01Z" level=debug msg="Elastic Agent binary was removed." installer=debian-tar path=/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz682278649
[2021-01-07T06:57:02.108Z] time="2021-01-07T06:57:01Z" level=debug msg="Elastic Agent binary was removed." installer=centos-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-x86_64.rpm057694594
[2021-01-07T06:57:02.108Z] <?xml version="1.0" encoding="UTF-8"?>
[2021-01-07T06:57:02.108Z] <testsuites name="main" tests="18" skipped="0" failures="1" errors="0" time="1313.059451058">
[2021-01-07T06:57:02.108Z]   <testsuite name="Fleet Mode Agent" tests="18" skipped="0" failures="1" errors="0" time="1182.744113607">
[2021-01-07T06:57:02.108Z]     <testcase name="Deploying the centos agent" status="passed" time="40.296057211"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Deploying the debian agent" status="passed" time="17.139321631"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Deploying the centos agent with enroll and then run on rpm and deb" status="passed" time="20.502409851"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Deploying the debian agent with enroll and then run on rpm and deb" status="passed" time="46.074448305"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Stopping the centos agent stops backend processes" status="passed" time="10.263836635"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Stopping the debian agent stops backend processes" status="passed" time="10.446204836"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Restarting the installed centos agent" status="passed" time="41.833554648"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Restarting the installed debian agent" status="passed" time="38.717960122"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Restarting the centos host with persistent agent restarts backend processes" status="passed" time="23.09943929"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Restarting the debian host with persistent agent restarts backend processes" status="passed" time="21.148719534"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Un-enrolling the centos agent" status="passed" time="10.231434151"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Un-enrolling the debian agent" status="passed" time="10.233360704"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Re-enrolling the centos agent" status="passed" time="33.671430007"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Re-enrolling the debian agent" status="passed" time="46.129210076"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Revoking the enrollment token for the centos agent" status="passed" time="25.95781456"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Revoking the enrollment token for the debian agent" status="passed" time="18.057494566"></testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Un-installing the installed centos agent" status="failed" time="367.34998031">
[2021-01-07T06:57:02.108Z]       <failure message="Step the agent is listed in Fleet as &#34;offline&#34;: The Agent is not in the offline status yet"></failure>
[2021-01-07T06:57:02.108Z]     </testcase>
[2021-01-07T06:57:02.108Z]     <testcase name="Un-installing the installed debian agent" status="passed" time="148.372418333"></testcase>
[2021-01-07T06:57:02.108Z]   </testsuite>
[2021-01-07T06:57:02.108Z] </testsuites>
make: *** [functional-test] Error 1
[2021-01-07T06:57:02.108Z] Makefile:59: recipe for target 'functional-test' failed
[2021-01-07T06:57:02.108Z] + echo 'ERROR: functional-test failed'
[2021-01-07T06:57:02.108Z] ERROR: functional-test failed
[2021-01-07T06:57:02.108Z] + exit_status=1
[2021-01-07T06:57:02.108Z] + sed -e 's/^[ \t]*//; s#>.*failed$#>#g' outputs/TEST-fleet
[2021-01-07T06:57:02.108Z] + grep -E '^<.*>$'
[2021-01-07T06:57:02.108Z] + exit 1
[2021-01-07T06:57:02.252Z] Recording test results
[2021-01-07T06:57:02.692Z] [Checks API] No suitable checks publisher found.
[2021-01-07T06:57:02.716Z] Archiving artifacts
[2021-01-07T06:57:02.849Z] Failed in branch fleet_fleet_mode_agent
[2021-01-07T06:57:04.481Z] Stage "Release" skipped due to earlier failure(s)
[2021-01-07T06:57:06.058Z] Running on worker-1244230 in /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-565
[2021-01-07T06:57:06.122Z] [INFO] getVaultSecret: Getting secrets
[2021-01-07T06:57:06.195Z] Masking supported pattern matches of $VAULT_ADDR or $VAULT_ROLE_ID or $VAULT_SECRET_ID
[2021-01-07T06:57:08.312Z] + chmod 755 generate-build-data.sh
[2021-01-07T06:57:08.312Z] + ./generate-build-data.sh https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-565/ https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-565/runs/1 FAILURE 1677533
[2021-01-07T06:57:08.312Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-565/runs/1/steps/?limit=10000 -o steps-info.json
[2021-01-07T06:57:12.422Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-565/runs/1/tests/?status=FAILED -o tests-errors.json
[2021-01-07T06:57:12.422Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-565/runs/1/log/ -o pipeline-log.txt

🐛 Flaky test report

❕ There are test failures but not known flaky tests.


Genuine test errors 3

💔 There are test failures but not known flaky tests, most likely a genuine test failure.

  • Name: Initializing / End-To-End Tests / fleet_fleet_mode_agent / Un-installing the installed centos agent – Fleet Mode Agent
  • Name: Initializing / Pre-Submit / Sanity checks / go vet – pre_commit.lint
  • Name: Initializing / Pre-Submit / Sanity checks / golangcilint – pre_commit.lint

@mdelapenya mdelapenya merged commit cd0bc3d into elastic:master Jan 7, 2021
mdelapenya added a commit to mdelapenya/e2e-testing that referenced this pull request Jan 7, 2021
fix: set image version before switching to the configuration file (elastic#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
mdelapenya added a commit to mdelapenya/e2e-testing that referenced this pull request Jan 7, 2021
fix: set image version before switching to the configuration file (elastic#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
mdelapenya added a commit that referenced this pull request Jan 7, 2021
fix: set image version before switching to the configuration file (#565) backport for 7.x (#568)

* chore: rename ELASTIC_AGENT_USE_CI_SNAPSHOTS (#566)

* chore: rename variable to a wider scope

* chore: extract common logic to a function

* chore: rename variable
# Conflicts:
#	.ci/Jenkinsfile
#	e2e/_suites/metricbeat/README.md

* fix: set image version before switching to the configuration file (#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
mdelapenya added a commit that referenced this pull request Jan 7, 2021
fix: set image version before switching to the configuration file (#565) backport for 7.10.x (#569)

* chore: rename ELASTIC_AGENT_USE_CI_SNAPSHOTS (#566)

* chore: rename variable to a wider scope

* chore: extract common logic to a function

* chore: rename variable
# Conflicts:
#	.ci/Jenkinsfile
#	e2e/_suites/metricbeat/README.md

* fix: set image version before switching to the configuration file (#565)

We still need to download the proper config file from the PR, but we do not
have access to the commit SHA at this moment, unless other modifications
are in place, like reading from Jenkins
@mdelapenya mdelapenya mentioned this pull request Jan 7, 2021
@mdelapenya mdelapenya deleted the 564-fix-metricbeat-version-prs branch January 11, 2021 17:20
Labels
bug Something isn't working
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Metricbeat configuration tests are not using config files from Beats PRs
4 participants