I've got some data that comes in with a somewhat unusual format. It's a set of fixed fields, followed by a variable-length set of keys, followed by a set of values. It looks something like this (but with more fields):

col1, col2, col3, description(key1; key2; ...;keyn), val1, val2, ..., valn

I'm trying to transform this into something more like:

a=col1, b=col2, c=col3, key1=val1, key2=val2, ..., keyn=valn

I've actually got this working by using Exec and a bit of perl I wrote that tears apart $raw_event and writes the modified log line to a domain socket, where a second Route is listening and sends the log over the network to its destination. My problem is that this is not terribly performant, since it starts a perl process per log line. I've had trouble figuring out another way to do this, mostly because the number of keys/values is variable.
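For reference, the transformation itself can be done in a single perl function. This is only a sketch of what such a script might look like: the field names a/b/c, the exact delimiters, and the assumption that keys and values pair up positionally all come from the example above, not from the real data.

```perl
use strict;
use warnings;

# Sketch: transform one raw line of the form
#   col1, col2, col3, description(key1; key2; ...; keyn), val1, ..., valn
# into
#   a=col1, b=col2, c=col3, key1=val1, ..., keyn=valn
# Returns undef if the line does not match the expected shape.
sub transform_line {
    my ($raw) = @_;

    # Capture the three fixed fields, the parenthesised key list,
    # and everything after it (the values).
    my ($c1, $c2, $c3, $keys, $vals) =
        $raw =~ /^([^,]+),\s*([^,]+),\s*([^,]+),\s*description\(([^)]*)\),\s*(.*)$/
        or return undef;

    my @keys = split /\s*;\s*/, $keys;
    my @vals = split /\s*,\s*/, $vals;

    # Pair keys with values positionally; missing values become empty.
    my @pairs = ("a=$c1", "b=$c2", "c=$c3");
    for my $i (0 .. $#keys) {
        push @pairs, "$keys[$i]=" . ($vals[$i] // '');
    }
    return join(', ', @pairs);
}
```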

Any suggestions on ways this might be done that are likely to have better performance?

Asked August 24, 2015 - 11:11pm

Answer (1)

If you already have perl code then you can reuse it with xm_perl. It is executed in the same process space, so it should be a bit faster.
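The wiring might look something like the sketch below. The file path, instance names, and subroutine name are placeholders; the `PerlCode` directive and the `perl_call()` procedure come from the xm_perl module.

```
# Sketch of an nxlog.conf fragment; paths and names are assumptions.
<Extension perl>
    Module      xm_perl
    PerlCode    /opt/nxlog/etc/processor.pl
</Extension>

<Input in>
    Module  im_file
    File    "/var/log/app.log"
    # Runs the perl subroutine in-process for every event,
    # instead of spawning a perl interpreter per line.
    Exec    perl_call("process_event");
</Input>
```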

Comments (3)

  • davidatpinger:

    Doesn't xm_perl pass an event object as @_? I experimented a bit with that module, but it didn't look like it could access $raw_event. The example in the docs appears to parse the log (via parse_syslog_bsd, in that example) to create an event object before calling the perl. That's a catch-22 for me, since I'm using the perl to parse $raw_event. Perhaps I've missed something in how xm_perl is used - is there a way to access the raw event?

  • adm:

    raw_event is just like any other field (except that some modules use it explicitly), so you can do this:

    my $rawevt = Log::Nxlog::get_field($event, 'raw_event');
    # process $rawevt here
    Log::Nxlog::set_field_string($event, 'raw_event', $rawevt);
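Put together as a complete subroutine, the perl side might look like this. It is only a sketch: the subroutine name is a placeholder, the parsing step is elided, and Log::Nxlog is supplied by xm_perl at runtime rather than being a standalone CPAN module.

```perl
use strict;
use warnings;
# Log::Nxlog is provided by the xm_perl module at runtime.

# Called from nxlog.conf via perl_call("process_event");
# receives the event object as the first argument.
sub process_event {
    my ($event) = @_;
    my $raw = Log::Nxlog::get_field($event, 'raw_event');
    return unless defined $raw;
    # ... tear apart $raw here, as the original per-line script did ...
    Log::Nxlog::set_field_string($event, 'raw_event', $raw);
}
```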