Split JSON record into multiple records?


joost.bijl

Hi,

I'm experimenting with reading from an Azure Event Hub with im_kafka. The Event Hub receives security data from various security-related Azure components.

The im_kafka module works great after I found out that the username should be $connectionstring ;).

The output of the Event Hub is a JSON object containing an array of records, like this: { "records": [ {"id": 1, "msg": "xyz", ...}, {"id": 2, "msg": "abc", ...} ] }.

I tried using extract_json("$.records"), but that does not iterate over the array.

I also made a Python script that writes the logs to a file, one record per line:

from confluent_kafka import Consumer
import json

c = Consumer({...})   # consumer configuration omitted
c.subscribe([...])    # Event Hub / topic name omitted

while True:
    msg = c.poll(1.0)
    if msg is None:
        continue

    # Each message is one JSON object with a "records" array;
    # write every element of that array out as its own JSON line.
    eventhub_records = json.loads(msg.value())
    for record in eventhub_records['records']:
        print(json.dumps(record))

This works great, but I'd like to have something like this in NXLog. Can this be done, or does NXLog not support splitting a single record into multiple records?
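
In the meantime, the closest workaround I can think of is letting NXLog run that script and pick up its stdout with im_exec, roughly like this (untested sketch; the script path, instance names, and output file are just placeholders):

<Input eventhub_split>
    Module      im_exec
    Command     /usr/bin/python3
    Arg         /opt/nxlog/eventhub_split.py
</Input>

<Output file_out>
    Module      om_file
    File        "/var/log/eventhub_records.log"
</Output>

<Route eventhub>
    Path        eventhub_split => file_out
</Route>

But a native way to fan one incoming record out into several would obviously be nicer.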

Thanks!