FIM output to Logstash has a mismatched JSON format

Tags: FIM

#1 steven.su

Hi team, I use the FIM module to monitor a test file and send its output to two destinations: a local file and a remote Logstash instance over TCP.
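The relevant parts of my configuration look roughly like the sketch below (the monitored file and the SHA1 digest are real, but the output file path, the Logstash host and port, the ScanInterval value, and the instance names are placeholders rather than exact copies from my config):

<Extension json>
    Module  xm_json
</Extension>

<Input fim>
    Module        im_fim
    File          'C:\Users\test\Desktop\test20220211.txt'
    Digest        SHA1
    ScanInterval  60    # placeholder; the real interval differs
</Input>

<Output out_file>
    Module  om_file
    File    'C:\test\fim_output.log'    # placeholder path
    Exec    to_json();                  # the local file does receive JSON
</Output>

<Output out_logstash>
    Module  om_tcp
    Host    logstash.example.com        # placeholder host
    Port    5514                        # placeholder port
</Output>

<Route fim_route>
    Path    fim => out_file, out_logstash
</Route>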

I can see the log in the local file, but the remote Logstash instance fails to parse the log as JSON. After checking the logs, I found that the data received by Logstash is different:

Local: {"EventTime":"2022-02-13T16:11:50.094508+08:00","EventType":"CHANGE","Object":"FILE","PrevFileName":"c:\users\test\desktop\test20220211.txt","PrevModificationTime":"2022-02-11T19:18:59.925713+08:00","FileName":"c:\users\test\desktop\test20220211.txt","ModificationTime":"2022-02-13T16:10:53.402144+08:00","PrevFileSize":6,"FileSize":10,"DigestName":"SHA1","Digest":"31ca8d2ae67b53db43d3581974d12a48c648eca5","PrevDigest":"1b1e2aa8fb50e43dd20429afdbbec1b81b153853","Severity":"WARNING","SeverityValue":3,"EventReceivedTime":"2022-02-13T16:11:50.094508+08:00","SourceModuleName":"fim","SourceModuleType":"im_fim"}

Logstash: [2022-02-13T08:11:49,919][ERROR][logstash.codecs.json ][main][2869f035623bc8e694e78ee6b779cd6214f6eba705fdae0bea0b55fadc035072] JSON parse error, original data now in message field {:message=>"Unexpected character ('-' (code 45)): Expected space separating root-level values\n at [Source: (String)"2022-02-13 16:11:50 HKLAP0240 WARNING EventType="CHANGE" Object="FILE" PrevFileName="c:\\users\\test\\desktop\\test20220211.txt" PrevModificationTime="2022-02-11 19:18:59" FileName="c:\\users\\test\\desktop\\test20220211.txt" ModificationTime="2022-02-13 16:10:53" PrevFileSize="6" FileSize="10" DigestName="SHA1" Digest="31ca8d2ae67b53db43d3581974d12a48c648eca5" PrevDigest="1b1e2aa8fb50e43dd20429afdbbec1b81b153853" SeverityValue="3""; line: 1, column: 6]", :exception=>LogStash::Json::ParserError, :data=>"2022-02-13 16:11:50 HKLAP0240 WARNING EventType="CHANGE" Object="FILE" PrevFileName="c:\\users\\test\\desktop\\test20220211.txt" PrevModificationTime="2022-02-11 19:18:59" FileName="c:\\users\\test\\desktop\\test20220211.txt" ModificationTime="2022-02-13 16:10:53" PrevFileSize="6" FileSize="10" DigestName="SHA1" Digest="31ca8d2ae67b53db43d3581974d12a48c648eca5" PrevDigest="1b1e2aa8fb50e43dd20429afdbbec1b81b153853" SeverityValue="3""} { "tags" => [ [0] "jsonparsefailure" ], "message" => "2022-02-13 16:11:50 HKLAP0240 WARNING EventType="CHANGE" Object="FILE" PrevFileName="c:\\users\\test\\desktop\\test20220211.txt" PrevModificationTime="2022-02-11 19:18:59" FileName="c:\\users\\test\\desktop\\test20220211.txt" ModificationTime="2022-02-13 16:10:53" PrevFileSize="6" FileSize="10" DigestName="SHA1" Digest="31ca8d2ae67b53db43d3581974d12a48c648eca5" PrevDigest="1b1e2aa8fb50e43dd20429afdbbec1b81b153853" SeverityValue="3"", "@version" => "1", "host" => "53959da2d559", "@timestamp" => 2022-02-13T08:11:49.924Z, "path" => "/opt/nxlog/var/log/nxlog/logmessage.log", "type" => "json" }

It seems that the prefix "2022-02-13 16:11:50 HKLAP0240 WARNING" is added only in the TCP stream, so Logstash cannot recognize the data as JSON. Is this scenario expected, and is there any workaround? Thank you.

#2 jeffron Nxlog ✓
#1 steven.su

Hi Steve,

Can you share your configuration file?
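In the meantime, one thing worth checking: the text prepended in the TCP stream ("2022-02-13 16:11:50 HKLAP0240 WARNING ...") looks like the default line-based $raw_event format rather than JSON, which usually means to_json() is not being applied on the TCP output. As a rough sketch only (the instance name, host and port below are placeholders, not taken from your environment), an om_tcp output that sends JSON would normally look something like this:

<Extension json>
    Module  xm_json
</Extension>

<Output out_logstash>
    Module  om_tcp
    Host    logstash.example.com    # placeholder
    Port    5514                    # placeholder
    # Convert the structured FIM fields to JSON before sending;
    # without this, the default $raw_event line is sent, which matches
    # what your Logstash error shows.
    Exec    to_json();
</Output>

Your actual configuration file will tell us whether this is the cause.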

BR

Jeffron