LogMan.io Architecture

lmio-collector

LogMan.io Collector receives log lines from various sources such as SysLog NG, files, Windows Event Forwarding, databases via ODBC connectors, and so on. The log lines may be further processed by a declarative processor and are then forwarded to LogMan.io Ingestor via WebSocket.
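
For illustration, here is a minimal Python sketch of the collector's output side: it reads a file and forwards each log line over a WebSocket. The endpoint URL and the use of aiohttp are assumptions made for the example; the real collector is driven by its declarative configuration.

```python
# A minimal sketch of forwarding log lines to the ingestor over WebSocket.
# The ws://ingestor:8080/ws URL is hypothetical.
import asyncio

import aiohttp


async def forward_lines(path: str, url: str) -> None:
    async with aiohttp.ClientSession() as session:
        async with session.ws_connect(url) as ws:
            with open(path, encoding="utf-8") as f:
                for line in f:
                    # Each raw log line is sent as one WebSocket text message.
                    await ws.send_str(line.rstrip("\n"))


asyncio.run(forward_lines("/var/log/syslog", "ws://ingestor:8080/ws"))
```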

lmio-ingestor

LogMan.io Ingestor receives events via WebSocket, transforms them into a Kafka-readable format, and puts them into the Kafka collected- topic. There are multiple ingestors for different event formats, such as SysLog, databases, XML, and so on.
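
A minimal sketch of this WebSocket-to-Kafka step, assuming raw text messages on the socket and a hypothetical collected-syslog topic (the actual topic suffix depends on the event format):

```python
# WebSocket server that wraps incoming log lines into a Kafka-readable
# (JSON) envelope and produces them to a collected- topic.
import json

from aiohttp import WSMsgType, web
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda e: json.dumps(e).encode("utf-8"),
)


async def ingest(request: web.Request) -> web.WebSocketResponse:
    ws = web.WebSocketResponse()
    await ws.prepare(request)
    async for msg in ws:
        if msg.type == WSMsgType.TEXT:
            # "collected-syslog" is an illustrative topic name.
            producer.send("collected-syslog", {"raw": msg.data})
    return ws


app = web.Application()
app.add_routes([web.get("/ws", ingest)])
web.run_app(app, port=8080)
```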

lmio-parser

LogMan.io Parser runs in multiple instances, either to receive different formats of incoming events (from different Kafka topics) or to share the same events (instances in the same Kafka consumer group distribute the events among themselves). LogMan.io Parser loads the LogMan.io Library, via ZooKeeper or from files, to obtain the declarative parsers and enrichers from the configured parsing groups.

Events that are parsed by the loaded parsers are put into the input Kafka topic; events that are not are put into the unparsed Kafka topic.
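
The following sketch illustrates this consume-parse-route loop. The group_id shows how instances in the same Kafka group share a topic; parse_event() is a stand-in for the declarative parsing groups, and the source topic name is illustrative:

```python
# Consume collected events, attempt parsing, and route the result to the
# "input" or "unparsed" topic as described above.
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "collected-syslog",
    bootstrap_servers="localhost:9092",
    group_id="lmio-parser",  # instances in the same group share the load
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda e: json.dumps(e).encode("utf-8"),
)


def parse_event(event: dict) -> dict | None:
    # Placeholder for the declarative parsers and enrichers; returning
    # None means no parsing group matched, i.e. the event is "unparsed".
    raw = event.get("raw", "")
    return {"message": raw} if raw else None


for message in consumer:
    parsed = parse_event(message.value)
    if parsed is not None:
        producer.send("input", parsed)
    else:
        producer.send("unparsed", message.value)
```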

lmio-dispatcher

LogMan.io Dispatcher loads events from the input Kafka topic and sends them both to all LogMan.io Correlator instances subscribed via ZooKeeper and to the appropriate ElasticSearch index, where all events can be queried and visualized using Kibana.

LogMan.io Dispatcher runs in multiple instances as well.
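
A sketch of the dispatcher's ElasticSearch side, assuming JSON events on the input topic and a hypothetical lmio-events index; the fan-out to subscribed Correlator instances is omitted here:

```python
# Consume parsed events from the "input" topic and index them into
# ElasticSearch so they can be queried and visualized in Kibana.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("http://localhost:9200")
consumer = KafkaConsumer(
    "input",
    bootstrap_servers="localhost:9092",
    group_id="lmio-dispatcher",  # multiple instances share the topic
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    # elasticsearch-py 8.x keyword; older clients use body= instead.
    es.index(index="lmio-events", document=message.value)
```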

lmio-correlator

LogMan.io Correlator subscribes via ZooKeeper to all LogMan.io Dispatcher instances to receive parsed events (log lines, etc.). It then loads the LogMan.io Library, from ZooKeeper or from files, to create correlators based on the declarative configuration. Events produced by the correlators (Window Correlator, Match Correlator) are handed down to LogMan.io Dispatcher and LogMan.io Watcher via Kafka.
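
To illustrate the idea behind a window correlation, here is a toy Python sketch that counts failed logins per user in a sliding time window and emits a correlated event to Kafka when a threshold is reached. The field names, threshold, window size, and output topic are illustrative; the real correlators are built from the declarative configuration.

```python
# A toy sliding-window correlation: N failed logins by the same user
# within WINDOW seconds produce one correlated event on Kafka.
import json
import time
from collections import defaultdict, deque

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda e: json.dumps(e).encode("utf-8"),
)

WINDOW = 60.0   # seconds
THRESHOLD = 5   # failed logins within the window
windows: dict[str, deque[float]] = defaultdict(deque)


def on_event(event: dict) -> None:
    if not event.get("login_failed"):
        return
    now = time.monotonic()
    q = windows[event["username"]]
    q.append(now)
    # Drop timestamps that fell out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) >= THRESHOLD:
        # Correlated events go back to Dispatcher and Watcher via Kafka.
        producer.send(
            "correlated",
            {"alert": "brute-force", "username": event["username"]},
        )
        q.clear()


# Example: five failed logins for the same user within the window.
for _ in range(5):
    on_event({"username": "alice", "login_failed": True})
```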

lmio-watcher

LogMan.io Watcher observes changes in the lookups used by LogMan.io Parser and LogMan.io Correlator instances. When a change occurs, all running components that use the LogMan.io Library are notified about it via the Kafka topic lookups, and the lookup is updated in ElasticSearch, which serves as the persistent storage for all lookups.
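
A sketch of the watcher's persistence step, assuming the change notifications arrive as JSON messages on the lookups topic with hypothetical lookup_id and data fields, and that lookups are stored in a hypothetical lmio-lookups index:

```python
# Consume lookup-change notifications and persist the updated lookup in
# ElasticSearch so that restarted components can reload it.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("http://localhost:9200")
consumer = KafkaConsumer(
    "lookups",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    change = message.value
    # The message schema here (lookup_id, data) is an assumption.
    es.index(
        index="lmio-lookups",
        id=change["lookup_id"],
        document=change["data"],
    )
```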