to_json() doesn't handle nested objects and breaks the JSON string
Hello, I'm having trouble sending JSON logs generated by a command. The command emits one JSON object per line; the syntax has been checked with jsonlint and every line is valid. When I write the input to a file for debugging, the JSON also looks fine:
{"metricset":{"module":"system","name":"memory"},"system":{"memory":{"total`":4294967296,"free":1709912064,"used":{"bytes":2585055232,"pct":60.19},"swap":{"total":2046,"free":2012,"used":{"bytes":34,"pct":1.66}}}}} {"metricset":{"module":"system","name":"cpu"},"system":{"cpu":{"cores": 1,"idle":{"pct":99},"irq":{"pct":0},"system":{"pct":0},"user":{"pct":1}}}}
When NXLog sends the data to Logstash with om_tcp, Logstash receives the following (note the system field: it is not the same as the one generated in the input):
Oct 01 03:04:54 elk logstash[43975]: {
Oct 01 03:04:54 elk logstash[43975]:     "SourceModuleName" => "counters",
Oct 01 03:04:54 elk logstash[43975]:     "system" => "{"cpu":{"cores":1,"idle":{"pct":99}"irq":{"pct":0}"system":{"pct":0}"user":{"pct":1}",
Oct 01 03:04:54 elk logstash[43975]:     "@timestamp" => 2019-10-01T01:04:54.022Z,
Oct 01 03:04:54 elk logstash[43975]:     "SourceModuleType" => "im_exec",
Oct 01 03:04:54 elk logstash[43975]:     "port" => 3150,
Oct 01 03:04:54 elk logstash[43975]:     "@metadata" => {
Oct 01 03:04:54 elk logstash[43975]:         "input" => "tcp",
Oct 01 03:04:54 elk logstash[43975]:         "week" => "2019.10-40",
Oct 01 03:04:54 elk logstash[43975]:         "month" => "2019.10",
Oct 01 03:04:54 elk logstash[43975]:         "stdout" => "true",
Oct 01 03:04:54 elk logstash[43975]:         "index" => "in-test-nxlog-2019.10-40",
Oct 01 03:04:54 elk logstash[43975]:         "day" => "2019.10.01"
Oct 01 03:04:54 elk logstash[43975]:     },
Oct 01 03:04:54 elk logstash[43975]:     "@version" => "1",
Oct 01 03:04:54 elk logstash[43975]:     "metricset" => "{"module":system,"name":cpu}",
Oct 01 03:04:54 elk logstash[43975]:     "client" => {
Oct 01 03:04:54 elk logstash[43975]:         "ip" => "10.71.218.62"
Oct 01 03:04:54 elk logstash[43975]:     },
Oct 01 03:04:54 elk logstash[43975]:     "EventReceivedTime" => "2019-10-01 03:03:58"
Oct 01 03:04:54 elk logstash[43975]: }
If we add the to_json() Exec to the input configuration, the debug output breaks in the same way, so I think the to_json() procedure has a bug with nested JSON objects. This is the configuration:
<Extension json>
    Module  xm_json
</Extension>
<Extension charconv>
    Module  xm_charconv
</Extension>
# PowerShell script that collects counter metrics from a Windows 2003 server, the same way Metricbeat does it
<Input counters>
    Module     im_exec
    InputType  LineBased
    Command    "%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\powershell.exe"
    Arg        "-ExecutionPolicy"
    Arg        "Bypass"
    Arg        "-NoProfile"
    Arg        "-File"
    Arg        %ROOT%\modules\input\counters.ps1
    Arg        -interval
    Arg        60
    Exec       parse_json();
</Input>
<Output tcp>
    Module  om_tcp
    Host    elk
    Port    5045
    Exec    to_json();
</Output>
<Output debug>
    Module     om_file
    CreateDir  TRUE
    File       "C:\Program Files\nxlog\data\debug.log"
    # if we uncomment this line, the debug file breaks in the same way
    #Exec      to_json();
</Output>
<Route 1>
    Path  counters => tcp
</Route>

<Route 2>
    Path  counters => debug
</Route>
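The Logstash side is not included in the post; for illustration only, the receiving pipeline is assumed to be something along these lines (the port number is taken from the om_tcp output above; the real pipeline also adds the @metadata and client fields seen in the log):

# Hypothetical sketch of the receiving Logstash pipeline, not the actual configuration used
input {
  tcp {
    port  => 5045
    codec => json_lines   # decode each newline-delimited JSON event
  }
}
output {
  stdout { codec => rubydebug }   # pretty-print decoded events, as in the journal output above
}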
I think you should simply get rid of parse_json(); and to_json(); so that the original JSON remains intact.
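Following that suggestion, a minimal sketch of the adjusted input and tcp output (the same directives as above, with only the Exec lines removed) could look like this:

<Input counters>
    Module     im_exec
    InputType  LineBased
    Command    "%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\powershell.exe"
    Arg        "-ExecutionPolicy"
    Arg        "Bypass"
    Arg        "-NoProfile"
    Arg        "-File"
    Arg        %ROOT%\modules\input\counters.ps1
    Arg        -interval
    Arg        60
    # no Exec parse_json(); the original JSON line stays untouched in $raw_event
</Input>

<Output tcp>
    Module  om_tcp
    Host    elk
    Port    5045
    # no Exec to_json(); om_tcp forwards $raw_event unchanged
</Output>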