# Collecting events from Apache Kafka
TeskaLabs LogMan.io Collector can collect events from Apache Kafka, specifically from its topics. Events stored in Kafka may contain any data encoded in bytes, such as logs of various user, admin, system, device, and policy actions.
## Prerequisites
In order to create a Kafka consumer, the `bootstrap_servers` value, that is, the location of the Kafka nodes, needs to be known, as well as the `topic` to read the data from.
## LogMan.io Collector Configuration
The LogMan.io Collector provides the `input:Kafka:` input section, which needs to be specified in the YAML configuration. The configuration looks as follows:
```yaml
input:Kafka:KafkaInput:
  bootstrap_servers: <BOOTSTRAP_SERVERS>
  topic: <TOPIC>
  group_id: <GROUP_ID>
  ...
```
The input creates a Kafka consumer for the specified topic(s).
Configuration options related to connection establishment:

```yaml
bootstrap_servers: # Kafka nodes to read messages from (such as `kafka1:9092,kafka2:9092,kafka3:9092`)
```
Configuration options related to the Kafka consumer settings:

```yaml
topic: # Name of the topic(s) to read messages from (such as `lmio-events` or `^lmio.*`)
group_id: # Name of the consumer group (such as `collector_kafka_consumer`)
refresh_topics: # (optional) If more topics matching the topic name are expected to be created during consumption, this option specifies in seconds how often to refresh the topic subscriptions (such as `300`)
```
The options `bootstrap_servers`, `topic` and `group_id` are always required!
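For illustration, a complete input section might look like this. The server addresses, topic name, and group id are taken from the examples above and are placeholders, not defaults:

```yaml
input:Kafka:KafkaInput:
  bootstrap_servers: kafka1:9092,kafka2:9092,kafka3:9092
  topic: lmio-events
  group_id: collector_kafka_consumer
```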
`topic` can be a single name, a list of names separated by spaces, or a simple regex (to match all available topics, use `^.*`).
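For example, each of the following forms of `topic` is valid (the topic names are illustrative):

```yaml
topic: lmio-events              # a single topic
topic: lmio-events lmio-others  # several topics separated by spaces
topic: ^lmio.*                  # a simple regex matching all topics whose names start with `lmio`
```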
For more configuration options, please refer to the [librdkafka configuration guide](https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md).
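Assuming additional librdkafka consumer properties can be supplied alongside the options above (as the reference to the librdkafka guide suggests), a tuned input section might look like the following sketch. The property values are illustrative, not recommendations:

```yaml
input:Kafka:KafkaInput:
  bootstrap_servers: kafka1:9092,kafka2:9092,kafka3:9092
  topic: lmio-events
  group_id: collector_kafka_consumer
  # Illustrative librdkafka properties (see the librdkafka configuration guide);
  # whether they are passed through verbatim is an assumption of this sketch:
  session.timeout.ms: 30000   # consumer session timeout in milliseconds
  auto.offset.reset: smallest # where to start reading when no committed offset exists
```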