Kafka
Learn more about the component and how to use it.
The Kafka component produces records to the Kafka brokers configured in it.
Take a look at the configuration parameters of the component:
- Kafka Authentication Account (BASIC): if the Kafka server requires authentication, a BASIC-type account must be created for this component. Authentication via Kerberos is also supported.
- Truststore: if a truststore is needed for the SSL handshake with private certificates, a CERTIFICATE-CHAIN account type must be created and the concatenated certificates must be provided. Optionally, the password to be registered during the truststore creation can be provided in the password field.
- Keystore: if a keystore is needed for mutual SSL authentication, a CERTIFICATE-CHAIN account type must be created, and the complete chain with the concatenated certificates and the private key to be used for the mutual SSL authentication must be provided. If the private key has a password, it must be provided in the password field.
- Brokers: brokers of the server (HOST:PORT) used to send the records. To specify multiple hosts, separate them with commas. Example: HOST1:PORT1,HOST2:PORT2,...,HOSTn:PORTn
- Security Protocol: how the connection is established. It's possible to use an encrypted channel (SSL), an authentication channel (SASL), or both (SASL_SSL). A sketch mapping these fields to plain Kafka producer settings can be found right after the parameter list.
IMPORTANT: due to the need for a larger memory allocation, the following Security Protocol types are not supported: PLAINTEXT and SASL_PLAINTEXT. For more information, see the external Apache Kafka documentation.
- Topic Name: Kafka's topic name.
- Schema Registry URL: if at least one of the options Headers By Avro Schema, Payload As Avro, or Partition Key As Avro is enabled, the field will be shown so you can configure the Schema Registry URL.
- Schema Registry Account: account to authenticate with Schema Registry.
- Schema Registry Truststore: if a truststore is needed for the SSL handshake with private certificates, a CERTIFICATE-CHAIN account type must be created and the concatenated certificates must be provided. Optionally, the password to be registered during the truststore creation can be provided in the password field.
- Schema Registry Keystore: if a keystore is needed for mutual SSL authentication, a CERTIFICATE-CHAIN account type must be created, and the complete chain with the concatenated certificates and the private key to be used for the mutual SSL authentication must be provided. If the private key has a password, it must be provided in the password field.
- Headers: set of "key": "value" pairs with the headers to be sent in the message (optional field).
- Binary Headers: if the option is active, the header values are considered binary and are interpreted as a string with the base64 representation; otherwise, the header values are interpreted as text.
- Headers By Avro Schema: if the option is active, the component will validate the headers against an Avro schema before sending them.
- Headers Schema: if the option Headers By Avro Schema is active, the field will be shown so you can set the schema used to validate the headers.
- Headers Charset: name of the character set used to encode the header values (default: UTF-8).
- Payload: payload to be dispatched.
- Payload As Avro: if the option is active, the component will send the payload in Avro format.
- Payload Schema: this field is available only if the option Payload As Avro is active and specifies the schema used to validate the payload.
- Request Timeout: configuration that controls the maximum time (in milliseconds) the client waits for the response to a request. If the response isn't received before the timeout elapses, the request is automatically resent; if the retries are exhausted, the request fails with an error.
- Retries: if a value other than 0 (zero) is set, any record whose dispatch fails with a potentially transient error will be resent.
- Metadata Timeout: maximum time for the dispatch of the Kafka records.
- Key Strategy: if the option Partition Key As Avro is active, the field will be shown so you can define the strategy used to construct the subject name for message keys.
- Value Strategy: if the option Payload As Avro is active, the field will be shown so you can define the strategy used to construct the subject name for message values (see the Avro sketch after the output description below).
- Fail On Error: if the option is active, the pipeline execution is interrupted when an error occurs; otherwise, the pipeline execution proceeds, but the result will show a false value for the “success” property.
- Advanced Settings: if the option is active, you can access the following configurations:
- Kerberos Service Name: value defined in the sasl.kerberos.service.name property configured on the Kafka broker (server) side.
- Partition Number: number of the partition to which the component will send the messages. If the property isn't configured, the Kafka server decides which topic partition the message will be sent to.
- Partition Key: a partition key can be specified to indicate the partition where the message will be sent to. If the field isn't filled, a hash-based partitioner is used to determine the partition ID for each key.
- Partition Key As Avro: if the option is enabled, the component will send the partition key in Avro format.
- Partition Key Schema: if the option Partition Key As Avro is active, the field will be shown so you can set the schema used to validate the partition key.
- Producer Client Name: origin identifier of the requests (optional).
- ACKS: configuration for acknowledging the message receipt by the Kafka broker (values: 0, 1 or ALL).
IMPORTANT: messages sent in Avro format must not exceed the maximum size supported by SMALL, MEDIUM, and LARGE pipelines. The component does not support extreme scenarios of mega/giga/tera/petabytes.
The Avro format support is currently in Beta phase.
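For reference, most of these fields correspond to standard Apache Kafka producer settings. The sketch below is a minimal illustration using the plain Kafka Java client, not the component's internal code: the property keys are standard Kafka client settings, while the broker addresses, paths, passwords and other values are placeholders, and the mapping of Metadata Timeout to max.block.ms is an assumption.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerSettingsSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Brokers: HOST1:PORT1,HOST2:PORT2 (placeholder addresses)
        props.put("bootstrap.servers", "host1:9093,host2:9093");
        // Security Protocol: SSL or SASL_SSL (PLAINTEXT and SASL_PLAINTEXT are not supported by the component)
        props.put("security.protocol", "SSL");
        // Truststore/Keystore: the component builds these from the CERTIFICATE-CHAIN accounts
        props.put("ssl.truststore.location", "/path/to/truststore.jks"); // placeholder path
        props.put("ssl.truststore.password", "changeit");                // placeholder password
        // Request Timeout, Retries, Metadata Timeout (assumed to map to max.block.ms), ACKS, Producer Client Name
        props.put("request.timeout.ms", "30000");
        props.put("retries", "3");
        props.put("max.block.ms", "60000");
        props.put("acks", "all");
        props.put("client.id", "my-producer");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic Name, Partition Key and Payload
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("Welcome-Kafka", "partition-key", "{\"hello\":\"world\"}");
            // Headers: optional "key": "value" pairs attached to the record
            record.headers().add("trace-id", "abc-123".getBytes(StandardCharsets.UTF_8));
            producer.send(record);
        }
    }
}
```

After a record is dispatched, the component returns an output in the following format: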
```json
{
    "message": "{}",
    "offset": 201,
    "timestamp": 1585168528773,
    "serializedKeySize": -1,
    "serializedValueSize": 2,
    "topic": "Welcome-Kafka",
    "partition": 1,
    "success": true
}
```
- message: message sent.
- offset: offset of the record in the topic/partition.
- timestamp: time stamp of the record in the topic/partition.
- serializedKeySize: size of the serialized, uncompressed key in bytes. If the key is null, the returned size is -1.
- serializedValueSize: size of the serialized, uncompressed value in bytes. If the value is null, the returned size is -1.
- topic: name of the topic.
- partition: partition the record was sent to.
- success: if "true", the dispatch was successfully made.
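To give a concrete idea of how the Avro-related fields fit together, here is a minimal sketch using Confluent's Avro serializer with the plain Kafka Java client. It is an illustration, not the component's internal code: the property keys and the strategy/serializer class names come from the Confluent Schema Registry client library, and the URL, credentials and schema are placeholders. Schema Registry URL, Schema Registry Account, Key Strategy, Value Strategy, Payload As Avro and Payload Schema roughly correspond to the settings below.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "host1:9093");                        // placeholder
        // Schema Registry URL and (optional) authentication account
        props.put("schema.registry.url", "https://schema-registry.example"); // placeholder
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "user:password");                  // placeholder
        // Key Strategy / Value Strategy: how the subject name is built
        // (other options: RecordNameStrategy, TopicRecordNameStrategy)
        props.put("key.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicNameStrategy");
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicNameStrategy");
        // Payload As Avro / Partition Key As Avro: Avro serializers validate the data against the schema
        props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        // Payload Schema: the Avro schema the payload is validated against (placeholder schema)
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Welcome\","
              + "\"fields\":[{\"name\":\"message\",\"type\":\"string\"}]}");
        GenericRecord payload = new GenericData.Record(schema);
        payload.put("message", "hello");

        try (KafkaProducer<Object, Object> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("Welcome-Kafka", payload));
        }
    }
}
```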
The component accepts any input message and can use it through Double Braces.
The component doesn't change any information of the input message. Therefore, the input message is returned to the following component, or it's used as the final answer if this component is the last step of the pipeline.
Authentication via Kerberos allows your producers and clients to authenticate to the Kafka cluster (identity verification). It's also a secure way for your clients to confirm their identity.
To use authentication via Kerberos in Kafka, it's necessary to have the “krb5.conf” configuration file registered in the Realm parameter. If you haven't done it yet, get in touch with us through the chat service. After finishing this step, all you have to do is correctly set up a Kerberos-type account and use it in the component.
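For context, Kerberos authentication in Kafka corresponds to the SASL/GSSAPI mechanism of the Kafka client. Under that assumption, the sketch below shows the kind of plain Kafka client properties involved; it is not the component's internal code, and the service name, principal and keytab path are placeholders. In the component, these details come from the Kerberos-type account, the registered krb5.conf file and the Kerberos Service Name field.

```java
import java.util.Properties;

public class KerberosSettingsSketch {
    // Plain Kafka client properties typically used for Kerberos (SASL/GSSAPI) authentication.
    public static Properties kerberosProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "host1:9093");       // placeholder
        props.put("security.protocol", "SASL_SSL");          // SASL_PLAINTEXT is not supported by the component
        props.put("sasl.mechanism", "GSSAPI");                // GSSAPI = Kerberos
        props.put("sasl.kerberos.service.name", "kafka");     // must match the broker-side property
        // JAAS configuration with the Kerberos principal and keytab (placeholder values);
        // the realm and KDC themselves are defined in krb5.conf.
        props.put("sasl.jaas.config",
                "com.sun.security.auth.module.Krb5LoginModule required "
              + "useKeyTab=true storeKey=true "
              + "keyTab=\"/path/to/client.keytab\" "
              + "principal=\"producer@EXAMPLE.REALM\";");
        return props;
    }
}
```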