
# Event generation testing framework

## Introduction

### What it is

Many modules make requests to an HTTP endpoint to fetch metrics data, then manipulate and enrich that data before sending it to Elasticsearch.

What we wanted to achieve is to mock the HTTP responses of the modules in a generic server that serves them back during tests. For each tested metricset, an HTTP server is launched on a random port (note that JSON responses written to disk are hardcoded with the value "127.0.0.1:5555") and responds to HTTP requests with the mocked response captured from a fixed module version; once the test is done, the server is shut down. This way we isolate the data manipulation in the tests and remove a lot of boilerplate that many modules used to carry.

## How to use it

The idea is simple: head to beats/metricbeat/mb/testing/data and run go test .. This runs all tests, for each metricset of each module. Alternatively, run mage mockedTests from the metricbeat directory to achieve the same result using environment variables instead of flags, for example: MODULE=apache GENERATE=true mage mockedTests.

## Worth mentioning
- If the input file in the testdata folder is named docs (whatever its extension) and the -data flag is passed, the framework will also create a docs.json file in the _meta folder of the metricset, as has historically been done in Metricbeat.
- The config file must be called config.yml and be located inside metricbeat/module/{module}/{metricset}/_meta/testdata.
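
As an illustration, a metricset using this framework typically ends up with a layout along these lines (the Apache status metricset and the exact file names are used here only as an assumed example):

```
metricbeat/module/apache/status/
└── _meta/
    ├── docs.json                      # generated when the input file is named `docs` and -data is passed
    └── testdata/
        ├── config.yml                 # test configuration for the metricset
        ├── docs.plain                 # mocked HTTP response used as input
        └── docs.plain-expected.json   # expected events, regenerated with -data
```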

## Available flags / environment variables

- -data: Regenerates the expected JSON file with the output of an event and places it within the testdata folder. For example: go test . -data. If using mage, the GENERATE environment variable serves the same purpose (e.g. GENERATE=true).
- -module: Tests only the specified module. For example: go test . -module=apache. If using mage, the MODULE environment variable must be set to the name of the module to test.

You can also combine both flags, go test . -data -module=apache, to generate files for the Apache module only.

## Available settings in config.yml

- path: (string) Path to the directory containing the input files to read from. Default is "_meta/testdata".
- writepath: (string) Path to the directory where the expected files are written. Default is "_meta/testdata".
- type: (string) The type of test to run. At the moment, only http is supported.
- url: (string) The URL path that the module usually fetches to get metrics. For example, in the case of the Apache module this URL is /server-status?auto=.
- suffix: (string) The suffix (extension) of the input file. The default is json; another common suffix is plain for plain-text files.
- omit_documented_fields_check: (list of strings) Some fields generated by the modules are completely dynamic, so they aren't documented in fields.yml. Set a list of fields or paths in your metricset that might not be documented, such as apache.status.* for all fields within the apache.status object, or apache.status.hostname for just that specific field. You can even omit all fields using *.
- remove_fields_from_comparison: (list of strings) Some fields must be removed for the byte-to-byte comparison but must still be printed in the expected JSON files. List those fields here. For example, apache.status.hostname in the Apache module was generating a new port on each run, so a comparison wasn't possible; set one item with apache.status.hostname to omit this field when comparing outputs.
- module: (map) Anything added to this map is appended to the module config before launching the tests. This is useful, for example, for modules that require the user to specify a namespace.
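
Putting these settings together, a config.yml for the Apache status metricset might look like the sketch below. The concrete values (the URL, the field paths, the namespace entry) are assumptions for illustration only; adapt them to your metricset:

```yaml
# _meta/testdata/config.yml (illustrative example)
type: http
url: "/server-status?auto="
suffix: plain
omit_documented_fields_check:
  - "apache.status.*"
remove_fields_from_comparison:
  - "apache.status.hostname"
# Appended to the module config before the tests are launched
module:
  namespace: test
```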