From 017002796d4354dde3b19cfdab071343be18229a Mon Sep 17 00:00:00 2001
From: prmellor
Date: Wed, 28 Apr 2021 17:12:29 +0100
Subject: [PATCH 1/3] [DOC] Update external configuration to include
 DirectoryConfigProvider

Signed-off-by: prmellor
---
 ...a.model.connect.ExternalConfiguration.adoc | 32 ++++++++++++-------
 1 file changed, 21 insertions(+), 11 deletions(-)

diff --git a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
index b6689d4143..52a18c8c67 100644
--- a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
+++ b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
@@ -8,7 +8,7 @@ When applied, the environment variables and volumes are available for use when d
 [id='property-kafka-connect-external-env-{context}']
 === `env`
 
-The `env` property is used to specify one or more environment variables.
+Use the `env` property to specify one or more environment variables.
 These variables can contain a value from either a ConfigMap or a Secret.
 
 .Example Secret containing values for environment variables
@@ -51,7 +51,8 @@ spec:
             key: awsSecretAccessKey
 ----
 
-A common use case for mounting Secrets to environment variables is when your connector needs to communicate with Amazon AWS and needs to read the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables with credentials.
+A common use case for mounting Secrets is for a connector to communicate with Amazon AWS.
+The connector needs to be able to read the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables with credentials.
 
 To mount a value from a ConfigMap to an environment variable, use `configMapKeyRef` in the `valueFrom` property as shown in the following example.
 
@@ -76,13 +77,22 @@ spec:
 [id='property-kafka-connect-external-volumes-{context}']
 === `volumes`
 
-You can also mount ConfigMaps or Secrets to a Kafka Connect pod as volumes.
+Use volumes to mount ConfigMaps or Secrets to a Kafka Connect pod.
 Using volumes instead of environment variables is useful in the following scenarios:
 
 * Mounting truststores or keystores with TLS certificates
 * Mounting a properties file that is used to configure Kafka Connect connectors
 
+Configuration providers load the values from mounted ConfigMaps or Secrets.
+
+* `FileConfigProvider` loads configuration values from properties in a file.
+* `DirectoryConfigProvider` loads configuration values from separate files within a directory structure.
++
+For example, a directory might contain the credentials for truststores and keystores in separate files.
+
+In this example, a Secret named `mysecret` contains connector properties that specify a database username and password.
+
 .Example Secret with properties
 [source,yaml,subs=attributes+]
 ----
@@ -99,10 +109,12 @@ stringData:
 <1> The connector configuration in properties file format.
 <2> Database username and password properties used in the configuration.
 
-In this example, a Secret named `mysecret` is mounted to a volume named `connector-config`.
-In the `config` property, a configuration provider (`FileConfigProvider`) is specified, which will load configuration values from external sources.
-The Kafka `FileConfigProvider` is given the alias `file`,
-and will read and extract database _username_ and _password_ property values from the file to use in the connector configuration.
+The Secret and a configuration provider are specified in the Kafka Connect configuration:
+
+* The Secret is mounted to a volume named `connector-config`.
+* `FileConfigProvider` is specified as the configuration provider and is given the alias `file`.
++
+`FileConfigProvider` reads and extracts the database _username_ and _password_ property values from the mounted Secret in connector configurations.
 
 .Example external volumes set to values from a Secret
 [source,yaml,subs="attributes+"]
 ----
@@ -127,10 +139,8 @@ spec:
 Use a comma-separated list if you want to add more than one provider.
 <2> The `FileConfigProvider` is the configuration provider that provides values from properties files.
 The parameter uses the alias from `config.providers`, taking the form `config.providers.${alias}.class`.
-<3> The name of the volume containing the Secret. Each volume must specify a name in the `name` property and a reference to ConfigMap or Secret.
+<3> The name of the volume containing the Secret. Each volume must specify a name in the `name` property and a reference to a ConfigMap or Secret.
 <4> The name of the Secret.
 
 The volumes are mounted inside the Kafka Connect containers in the path `/opt/kafka/external-configuration/__<volume-name>__`.
-For example, the files from a volume named `connector-config` would appear in the directory `/opt/kafka/external-configuration/connector-config`.
-
-The `FileConfigProvider` is used to read the values from the mounted properties files in connector configurations.
+For example, the files from a volume named `connector-config` will appear in the directory `/opt/kafka/external-configuration/connector-config`.

From fd1ba6be2ac65efd54908e0fa188207889dd96ad Mon Sep 17 00:00:00 2001
From: prmellor
Date: Fri, 7 May 2021 10:16:53 +0100
Subject: [PATCH 2/3] new example, connector configs, and split for provider types

Signed-off-by: prmellor
---
 ...a.model.connect.ExternalConfiguration.adoc | 131 ++++++++++++++++--
 1 file changed, 116 insertions(+), 15 deletions(-)

diff --git a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
index 52a18c8c67..53f4d1d2df 100644
--- a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
+++ b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
@@ -81,19 +81,26 @@ Use volumes to mount ConfigMaps or Secrets to a Kafka Connect pod.
 
 Using volumes instead of environment variables is useful in the following scenarios:
 
-* Mounting truststores or keystores with TLS certificates
 * Mounting a properties file that is used to configure Kafka Connect connectors
+* Mounting truststores or keystores with TLS certificates
 
 Configuration providers load the values from mounted ConfigMaps or Secrets.
 
+Volumes are mounted inside the Kafka Connect containers in the path `/opt/kafka/external-configuration/__<volume-name>__`.
+For example, the files from a volume named `connector-config` will appear in the directory `/opt/kafka/external-configuration/connector-config`.
+
+Use a _provider_ mechanism to avoid passing restricted information over the Kafka Connect REST interface.
+
 * `FileConfigProvider` loads configuration values from properties in a file.
 * `DirectoryConfigProvider` loads configuration values from separate files within a directory structure.
-+
-For example, a directory might contain the credentials for truststores and keystores in separate files.
-In this example, a Secret named `mysecret` contains connector properties that specify a database username and password.
+Use a comma-separated list if you want to add more than one provider.
+
+.Using `FileConfigProvider` to load property values
+
+In this example, a Secret named `mysecret` contains connector properties that specify a database username and password:
 
-.Example Secret with properties
+.Example Secret with database properties
 [source,yaml,subs=attributes+]
 ----
 apiVersion: v1
@@ -103,18 +110,16 @@ metadata:
 type: Opaque
 stringData:
   connector.properties: |- <1>
-    dbUsername: my-user <2>
+    dbUsername: my-username <2>
     dbPassword: my-password
 ----
 <1> The connector configuration in properties file format.
 <2> Database username and password properties used in the configuration.
 
-The Secret and a configuration provider are specified in the Kafka Connect configuration:
+The Secret and the `FileConfigProvider` configuration provider are specified in the Kafka Connect configuration.
 
 * The Secret is mounted to a volume named `connector-config`.
-* `FileConfigProvider` is specified as the configuration provider and is given the alias `file`.
-+
-`FileConfigProvider` reads and extracts the database _username_ and _password_ property values from the mounted Secret in connector configurations.
+* `FileConfigProvider` is given the alias `file`.
 
 .Example external volumes set to values from a Secret
 [source,yaml,subs="attributes+"]
 ----
@@ -135,12 +140,108 @@ spec:
         secret:
           secretName: mysecret <4>
 ----
-<1> The alias for the configuration provider, which is used to define other configuration parameters.
-Use a comma-separated list if you want to add more than one provider.
-<2> The `FileConfigProvider` is the configuration provider that provides values from properties files.
+<1> The alias for the configuration provider is used to define other configuration parameters.
+<2> `FileConfigProvider` provides values from properties files.
 The parameter uses the alias from `config.providers`, taking the form `config.providers.${alias}.class`.
 <3> The name of the volume containing the Secret. Each volume must specify a name in the `name` property and a reference to a ConfigMap or Secret.
 <4> The name of the Secret.
 
-The volumes are mounted inside the Kafka Connect containers in the path `/opt/kafka/external-configuration/__<volume-name>__`.
-For example, the files from a volume named `connector-config` will appear in the directory `/opt/kafka/external-configuration/connector-config`.
+Placeholders for the property values in the Secret are referenced in the connector configuration.
+The placeholder structure is `file:__PATH-AND-FILE-NAME__:__PROPERTY__`.
+`FileConfigProvider` reads and extracts the database _username_ and _password_ property values from the mounted Secret in connector configurations.
+
+.Example connector configuration showing placeholders for external values
+[source,yaml,subs="attributes+"]
+----
+apiVersion: {KafkaConnectorApiVersion}
+kind: KafkaConnector
+metadata:
+  name: my-source-connector
+  labels:
+    strimzi.io/cluster: my-connect-cluster
+spec:
+  class: io.debezium.connector.mysql.MySqlConnector
+  tasksMax: 2
+  config:
+    database.hostname: 192.168.99.1
+    database.port: "3306"
+    database.user: "${file:/opt/kafka/external-configuration/connector-config/connector.properties:dbUsername}"
+    database.password: "${file:/opt/kafka/external-configuration/connector-config/connector.properties:dbPassword}"
+    database.server.id: "184054"
+    #...
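+    # Illustrative note (assumes the connector-config volume, the connector.properties key,
+    # and the dbUsername/dbPassword properties shown in the earlier examples): at runtime,
+    # FileConfigProvider opens /opt/kafka/external-configuration/connector-config/connector.properties
+    # and substitutes the referenced property values into the placeholders above, so the
+    # credentials are not stored directly in the KafkaConnector resource.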
+----
+
+.Using `DirectoryConfigProvider` to load property values from separate files
+
+In this example, a `Secret` contains TLS truststore and keystore user credentials in separate files.
+
+.Example Secret with user credentials
+[source,yaml,subs="attributes+"]
+----
+apiVersion: v1
+kind: Secret
+metadata:
+  name: mysecret
+  labels:
+    strimzi.io/kind: KafkaUser
+    strimzi.io/cluster: my-cluster
+type: Opaque
+data:
+  ca.crt: # Public key of the client CA
+  user.crt: # User certificate that contains the public key of the user
+  user.key: # Private key of the user
+  user.p12: # PKCS #12 archive file for storing certificates and keys
+  user.password: # Password for protecting the PKCS #12 archive file
+----
+
+The Secret and the `DirectoryConfigProvider` configuration provider are specified in the Kafka Connect configuration.
+
+* The Secret is mounted to a volume named `connector-config`.
+* `DirectoryConfigProvider` is given the alias `directory`.
+
+.Example external volumes set for user credentials files
+[source,yaml,subs="attributes+"]
+----
+apiVersion: {KafkaConnectApiVersion}
+kind: KafkaConnect
+metadata:
+  name: my-connect
+spec:
+  # ...
+  config:
+    config.providers: directory
+    config.providers.directory.class: org.apache.kafka.common.config.provider.DirectoryConfigProvider <1>
+    #...
+  externalConfiguration:
+    volumes:
+      - name: connector-config
+        secret:
+          secretName: mysecret
+----
+<1> The `DirectoryConfigProvider` provides values from files in a directory.
+The parameter uses the alias from `config.providers`, taking the form `config.providers.${alias}.class`.
+
+Placeholders for the credentials are referenced in the connector configuration.
+The placeholder structure is `directory:__PATH__:__FILE-NAME__`.
+`DirectoryConfigProvider` reads and extracts the credentials from the mounted Secret in connector configurations.
+
+.Example connector configuration showing placeholders for external values
+[source,yaml,subs="attributes+"]
+----
+apiVersion: {KafkaConnectorApiVersion}
+kind: KafkaConnector
+metadata:
+  name: my-source-connector
+  labels:
+    strimzi.io/cluster: my-connect-cluster
+spec:
+  class: io.debezium.connector.mysql.MySqlConnector
+  tasksMax: 2
+  config:
+    security.protocol: SSL
+    ssl.truststore.type: PEM
+    ssl.truststore.certificates: "${directory:/opt/kafka/external-configuration/connector-config:ca.crt}"
+    ssl.keystore.type: PEM
+    ssl.keystore.certificate.chain: "${directory:/opt/kafka/external-configuration/connector-config:user.crt}"
+    ssl.keystore.key: "${directory:/opt/kafka/external-configuration/connector-config:user.key}"
+    #...
+----

From 83544d3fc2920d0b4b6f0dd1328a2d05d6aafc4d Mon Sep 17 00:00:00 2001
From: prmellor
Date: Tue, 11 May 2021 10:38:28 +0100
Subject: [PATCH 3/3] review edits TB

Signed-off-by: prmellor
---
 ....api.kafka.model.connect.ExternalConfiguration.adoc | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
index 53f4d1d2df..9d19e9f712 100644
--- a/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
+++ b/documentation/api/io.strimzi.api.kafka.model.connect.ExternalConfiguration.adoc
@@ -84,17 +84,17 @@ Using volumes instead of environment variables is useful in the following scenar
 * Mounting a properties file that is used to configure Kafka Connect connectors
 * Mounting truststores or keystores with TLS certificates
 
-Configuration providers load the values from mounted ConfigMaps or Secrets.
-
-Volumes are mounted inside the Kafka Connect containers in the path `/opt/kafka/external-configuration/__<volume-name>__`.
+Volumes are mounted inside the Kafka Connect containers on the path `/opt/kafka/external-configuration/__<volume-name>__`.
 For example, the files from a volume named `connector-config` will appear in the directory `/opt/kafka/external-configuration/connector-config`.
 
-Use a _provider_ mechanism to avoid passing restricted information over the Kafka Connect REST interface.
+Configuration _providers_ load values from outside the configuration.
+Use a provider mechanism to avoid passing restricted information over the Kafka Connect REST interface.
 
 * `FileConfigProvider` loads configuration values from properties in a file.
 * `DirectoryConfigProvider` loads configuration values from separate files within a directory structure.
 
-Use a comma-separated list if you want to add more than one provider.
+Use a comma-separated list if you want to add more than one provider, including custom providers.
+You can use custom providers to load values from other file locations.
 
 .Using `FileConfigProvider` to load property values
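Both providers can also be registered in the same Kafka Connect cluster by listing them in `config.providers`. The following configuration is an illustrative sketch only, reusing the `file` and `directory` aliases, the `connector-config` volume, and the `mysecret` Secret from the examples in these patches:

[source,yaml,subs="attributes+"]
----
apiVersion: {KafkaConnectApiVersion}
kind: KafkaConnect
metadata:
  name: my-connect
spec:
  # ...
  config:
    config.providers: file,directory
    config.providers.file.class: org.apache.kafka.common.config.provider.FileConfigProvider
    config.providers.directory.class: org.apache.kafka.common.config.provider.DirectoryConfigProvider
    #...
  externalConfiguration:
    volumes:
      - name: connector-config
        secret:
          secretName: mysecret
----

Connector configurations can then mix `${file:...}` and `${directory:...}` placeholders, with each placeholder resolved by the provider registered under its alias.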