Ask questions. Get answers. Find technical product solutions from passionate experts in the NXLog community.

Sending multi-line messages across om_tcp

Hello

I am new to NXLog. I am attempting to send custom multi-line messages, read from a text file with im_file, to a server using om_tcp. I understand that I can use xm_multiline to read the file, but sending newline characters across om_tcp results in a new message per line on the receiving end.

I have tried replacing the newline characters in the input block, but replace() does not seem to replace them.

Can someone help me find a way to send multi-line messages to a TCP listener using NXLog?

This is a sample of the log file; each message starts at the @timestamp:

@12:02:23.7990 [ISCC] Party removed [ssp view]:
    @ c:007c02772ce2e0f0,03b23dd8 @ m:0000000000000000,00000000,0000000000000000 p:3 i:00001170 nw:00000000:000111705693da93 t:2
      p:041c88c8 @ c:007c02772ce2e0f0,03b23dd8 r:2 t:0 s:c n:233061
    - p:03d51b00 @ c:007c02772ce2e0f0,03b23dd8 r:1 t:1 s:0 n:T4464#1
      p:041ceeb0 @ c:007c02772ce2e0f0,03b23dd8 r:10 t:1 s:ac n:233624

This is the block I have tried. The \t escape does work and replaces tabs with the replacement value:

<Input IN>
    Module   im_file
    File     'd:\logs\nxlog.txt'
    SavePos  FALSE
    ReadFromLast FALSE
    InputType    multiline
    Exec $raw_event = replace($raw_event, "\r\n", " ");
    Exec $raw_event = replace($raw_event, "\r", " ");
    Exec $raw_event = replace($raw_event, "\n", " ");
    Exec $raw_event = replace($raw_event, "0x0A", " ");
    Exec $raw_event = replace($raw_event, "0x0DA", " ");
    Exec $raw_event = replace($raw_event, "0x0D", " ");
</Input>
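With InputType multiline, im_file expects an xm_multiline extension instance of that name defining where each record starts; without it there is no assembled multi-line record for replace() to work on. A minimal sketch, where the HeaderLine regex is an assumption based on the sample above (each message starting with an @HH:MM:SS.ffff timestamp):

```
<Extension multiline>
    Module       xm_multiline
    # Assumed: each record starts with an @HH:MM:SS.ffff timestamp
    HeaderLine   /^@\d{2}:\d{2}:\d{2}\.\d+/
</Extension>

<Input IN>
    Module       im_file
    File         'd:\logs\nxlog.txt'
    SavePos      FALSE
    ReadFromLast FALSE
    InputType    multiline
    # Join the lines of the assembled record into one line before
    # om_tcp appends its own trailing newline
    Exec         $raw_event = replace($raw_event, "\r\n", " "); \
                 $raw_event = replace($raw_event, "\n", " ");
</Input>
```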

Thanks

Brent


bpedersen created
Replies: 4
Parsing Windows Logs (from FILE)

I am having no luck with a simple parsing of EVT log files.

Is there an easy way to read in EVT (Binary Log files) and output them in Syslog Format?

This is the config file I am using. (I used python-evtx to extract the logs into text XML; however, that yields XML attributes which apparently are not parseable.)

Problem Set:

Given 3 files (System.evt, Application.evt, and Security.EVT), parse the EVT format into Syslog_BSD (or IETF) format.

<Extension multiline>
    Module    xm_multiline
    HeaderLine    /^<event>/
    EndLine    /^<\/event>/
</Extension>

<Extension syslog>
    Module    xm_syslog
</Extension>

<Extension xmlparser>
    Module    xm_xml
</Extension>

<Extension json>
    Module    xm_json
</Extension>

<Extension fileop>
    Module    xm_fileop
</Extension>


<Input in>
    Module im_file
    File "%ROOT%/test.xml"
#    File "/tmp/cab.xml"
    SavePos    FALSE
    ReadFromLast FALSE
    InputType    multiline
    <Exec>
      # Discard everything that doesn't seem to be an xml event   
      if $raw_event !~ /^<event>/ drop();

      # Parse the xml event
      parse_xml(); to_syslog_ietf();

      # Rewrite some fields 
      $EventTime = parsedate($timestamp);
      delete($timestamp);
      delete($EventReceivedTime);

      # Convert to JSON
      to_json();
    </Exec>
</Input>


<Output out>
    Module  om_file
    File    "%ROOT%/out.log"
    Exec    parse_xml();
    Exec     log_info("FIELD" +  to_json());
</Output>


<Route 1>
    Path    in => out
</Route>
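As a side note on the Exec block above: to_syslog_ietf() and to_json() each rewrite $raw_event, so chaining them means only the last conversion survives. A sketch of the input with a single conversion, assuming one event per <event> element:

```
<Input in>
    Module       im_file
    File         "%ROOT%/test.xml"
    SavePos      FALSE
    ReadFromLast FALSE
    InputType    multiline
    <Exec>
        # Discard anything that is not an xml event
        if $raw_event !~ /^<event>/ drop();
        # Parse the xml into fields, then emit exactly one output format
        parse_xml();
        to_syslog_bsd();
    </Exec>
</Input>
```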


Colin.Bitterfield created
Replies: 1
Sending DHCP and Windows event logs from same server

Hi,

I'm a newbie testing nxlog, and I'm after sending Windows DHCP and event logs from the same server to a SIEM. Does anyone have a working nxlog configuration that you don't mind posting here?

Thanks in advance.

Regards

Milton
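Not a drop-in answer, but a minimal sketch of the usual shape: one im_file input for the DHCP audit logs (the path below is the Windows default and may differ), one im_msvistalog input for the event logs, both routed to the SIEM (the host is a placeholder):

```
<Extension syslog>
    Module xm_syslog
</Extension>

<Input dhcp>
    Module im_file
    # Default DHCP server audit log location; adjust as needed
    File   'C:\Windows\System32\dhcp\DhcpSrvLog-*.log'
</Input>

<Input eventlog>
    Module im_msvistalog
</Input>

<Output siem>
    Module om_udp
    Host   siem.example.com   # placeholder
    Port   514
    Exec   to_syslog_bsd();
</Output>

<Route r1>
    Path dhcp, eventlog => siem
</Route>
```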


milton_jose created
Replies: 1
Unable to file_remove on Linux Setup

Hi Everyone,

New to nxlog, so apologies in advance! I am currently deploying nxlog on a Linux server (Red Hat Enterprise Linux 6.6) and trying to remove a file after nxlog has finished processing it. From the documentation, I am using file_remove, which is not working. Here is my config, which does not throw any syntax errors when starting nxlog. In the debug log, I do not see any attempt to match files for removal:

<Extension fileop>
    Module   xm_fileop
    <Schedule>
        Every   1 min
        Exec    file_remove('/eventarchive/processed/*.raw', (now()));
    </Schedule>
</Extension>

I used this same syntax on a windows setup to test it, which worked - it successfully removed files. Does anyone know if there are any limitations with Linux that would stop this from working? Is there something I'm doing wrong?

As a side question, does anyone know the best way to configure nxlog to remove a file after processing, as opposed to setting a time interval like above?
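On the side question: file_remove has no "after processing" trigger, so the usual compromise is an age threshold old enough that the file has certainly been read by then. A sketch using the same schedule with a one-hour threshold (the second argument is a datetime; subtracting an integer subtracts seconds, as in the three-day example in another post on this page):

```
<Extension fileop>
    Module xm_fileop
    <Schedule>
        Every 1 min
        # Remove files whose modification time is more than an hour old,
        # assuming nxlog has finished reading them by then
        Exec  file_remove('/eventarchive/processed/*.raw', (now() - 3600));
    </Schedule>
</Extension>
```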

Thanks in advance!


tav1 created
Replies: 1
Remove Duplicates Help Needed

Hello All,

I am trying to use the pm_norepeat module to remove duplicate log messages that sometimes flood my logs. I am apparently not grasping how this works as the duplicate records are still flooding through the logs when I attempt to use the pm_norepeat function.

Can anyone advise on what I am doing wrong? Is there a different way to accomplish de-duplication of messages with the im_msvistalog and Exec to_syslog_bsd() modules?

Here are my configuration file statements:

define ROOT C:\Program Files (x86)\nxlog
define ROOT_STRING C:\Program Files (x86)\\nxlog
  
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log

<Extension syslog>
    Module      xm_syslog
</Extension>
 
<Input in>
    Module      im_msvistalog
    Exec    to_syslog_bsd();   
    ReadFromLast TRUE
    SavePos     TRUE
    Query       <QueryList>\
                    <Query Id="0">\
                        <Select Path="Application">*</Select>\
                        <Select Path="System">*</Select>\
                        <Select Path="Security">*</Select>\
                    </Query>\
                </QueryList>    
</Input>

<Processor norepeat>
   Module    pm_norepeat
   CheckFields Hostname, SourceName, Message
</Processor>

<Output out>
   Module      om_udp
    Host xxxxx.xxxxxxxxxxapp.com
    Port 12345
</Output>

<Route 1>
    Path in => norepeat => out
</Route>
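One thing worth trying, as a sketch rather than a confirmed fix: move the to_syslog_bsd() call out of the input and into the output, so pm_norepeat compares the records before they are rewritten into syslog lines:

```
<Input in>
    Module       im_msvistalog
    ReadFromLast TRUE
    SavePos      TRUE
</Input>

<Processor norepeat>
    Module      pm_norepeat
    CheckFields Hostname, SourceName, Message
</Processor>

<Output out>
    Module om_udp
    Host   xxxxx.xxxxxxxxxxapp.com
    Port   12345
    # Convert to syslog only after de-duplication
    Exec   to_syslog_bsd();
</Output>

<Route 1>
    Path in => norepeat => out
</Route>
```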


Zendog created
Replies: 1
om_tcp - reconnect(); on a schedule

I have a series of file inputs and TCP outputs. The output targets are geo-balanced across data centers, and traffic is directed based on the originating request. If we need to take one of the two collectors down, all the agents would point at one side. Because of this, I want NXLog to reconnect to the output target at a particular interval. How do you properly use the reconnect(); procedure? We have a series of inputs using the same outputs.

<Output Tomcat_App_Logs_Out>
    Module      om_tcp
    Host        NP-NC-TOMCAT.Contoso.com
    Port        4112
   <Schedule>
        Every   1 min
        Exec    reconnect();
   </Schedule>
</Output>

 


Jakauppila created
Replies: 1
ERROR maximum number of fields reached, limit is 50

Hello Guys,

I downloaded and am using version nxlog-ce-2.9.1504.msi and everything works well, but I have to parse logs from Hadoop and I hit the maximum-number-of-fields error: my Hadoop logs have 80 fields.

Here is my module to parse this:

<code>

<Extension hp>
    Module xm_csv
    Fields        $date,$jobname,$jobid,$username,$jobpriority,$jobstatus,$totalmaps,$totalreduces,$failedmaps,$failedreduces,$submittime,$launchtime,$finishtime,$mapavgtime,$reduceavgtime,$mapmaprfsbytesread,$reducemaprfsbytesread,$mapmaprfsbyteswritten,$reducemaprfsbyteswritten,$mapfilebyteswritten,$reducefilebyteswritten,$mapinputrecords,$mapoutputbytes,$mapspilledrecords,$reduceshufflebytes,$reducespilledrecords,$mapcpumilliseconds,$reducecpumilliseconds,$combineinputrecords,$combineoutputrecords,$reduceinputrecords,$reduceinputgroups,$reduceoutputrecords,$mapgctimeelapsedmilliseconds,$reducegctimeelapsedmilliseconds,$mapphysicalmemorybytes,$reducephysicalmemorybytes,$mapvirtualmemorybytes,$reducevirtualmemorybytes,$maptaskmaxtime,$successattemptmaxtime_maptaskmaxtime,$allattemptmaxtime_maptaskmaxtime,$server_successattemptmaxtime_maptaskmaxtime,$server_allattemptmaxtime_maptaskmaxtime,$maptaskmintime,$maptaskmaxinput,$maptaskmininput,$maptaskinputformaxtime,$maptaskinputformintime,$reducetaskmaxtime,$successattemptmaxtime_reducetaskmaxtime,$allattemptmaxtime_reducetaskmaxtime,$server_successattemptmaxtime_reducetaskmaxtime,$server_allattemptmaxtime_reducetaskmaxtime,$reducetaskmintime,$reducetaskmaxinput,$reducetaskmininput,$reducetaskinputformaxtime,$reducetaskinputformintime,$jobpool,$io_sort_spill_percent,$shuffle_input_buffer_percent$,$io_sort_mb,$io_sort_factor,$map_class,$reduce_class,$inputformat_class,$output_compress,$output_compression_codec,$compress_map_output,$map_output_compression_codec,$input_dir,$output_dir,$map_jvm,$reduce_jvm,$working_dir,$java_command,$job_submithost,$reduce_parallel_copies,$racklocalmaps,$datalocalmaps,$totallaunchedmaps,$fallowreduces,$fallowmaps,$mapoutputrecords,$dummy
    FieldTypes    text,text,text,text,text,text,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,integer,integer,integer,text,text,integer,text
    Delimiter    \t
</Extension>
#
<Input hadoop>
  Module         im_file
  File             'E:\\Hadoop\\analytics.sp_hadoop_stats.txt'
  SavePos         TRUE
  Recursive        TRUE
  Exec    if ( $raw_event =~ /^#/ or size($raw_event) == 0 ) drop(); \
                else \
                { \
                    hp->parse_csv();                            \
                    $EventTime = parsedate($date); \
                    $EventTime = strftime($EventTime, "%Y-%m-%d"); \
                    $SourceName = "Hadoop"; \
                    $hostname = hostname(); \
                    to_json(); \
                } 
</Input>

 

</code>

 

Would it be possible to increase the field limit in some source file, or to add an option for it in nxlog.conf?

Thank you.


Juan Andrés.Ramirez created
Replies: 1
trying to post an existing Json file to a remote web api

We use log4net to produce log files and have the JSON extensions for log4net, so the file output is as follows:

{
"date":"2016-03-18T13:49:36.9504697-04:00","level":

"ERROR",
"appname":"log4net_json.vshost.exe",
"logger":"log4net_json.Program",
"thread":"9",
"ndc":"(null)",
"message":"System.DivideByZeroException: Attempted to divide by zero.\r\n   at log4net_json.Program.Main() in c:\\temp\\tryMeOut\\log4net.Ext.J
            son.Example-master\\log4net_json\\log4net_json\\Program.cs:line 20"

}

We want to use nxlog to consume the file and post the results to a remote web API that will insert the JSON values into our database. I have tried loads of config variations and spent hours on the internet and reading the documentation, with no luck. Below is one of the config files I have used; however, each time I send it out I only get the first character of the text, the "{". The rest is missing and will not post.

define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
LogLevel INFO

<Extension _syslog>
    Module xm_syslog
</Extension>

<Input in>
    Module      im_file
    File "C:/temp/WOWLog.xml"
    InputType LineBased    
    SavePos     FALSE  
    ReadFromLast FALSE
    
    Exec    if $raw_event =~ /^--/ drop();
    Exec    $raw_event = replace($raw_event, "\r\n", ";");
</Input>

<Output out>
  Module              om_http
  URL                 http://localhost:51990/api/LogFile/
# HTTPSAllowUntrusted TRUE
</Output>


<Route 1>
    Path        in => out      
</Route>

I am looking for the correct settings for the configuration file.
For sanity, I have verified that I can send the same file to the remote web API via a .NET console application with a web client.
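Since the log4net records span several lines, a LineBased input posts each line (and hence each "{") separately. A sketch that reassembles one JSON object per record with xm_multiline before posting; the HeaderLine/EndLine regexes and the ContentType value are assumptions about the file layout and what the web API expects:

```
<Extension json_event>
    Module     xm_multiline
    # Assumed: each record starts with '{' and ends with '}' on its own line
    HeaderLine /^{/
    EndLine    /^}/
</Extension>

<Input in>
    Module       im_file
    File         "C:/temp/WOWLog.xml"
    SavePos      FALSE
    ReadFromLast FALSE
    InputType    json_event
    # Collapse the multi-line record into a single line
    Exec         $raw_event = replace($raw_event, "\r\n", " ");
</Input>

<Output out>
    Module      om_http
    URL         http://localhost:51990/api/LogFile/
    ContentType application/json
</Output>
```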

 


gregB created
Replies: 1
Help with connecting NXLog to Symantec MSS

Hi NXLog Helpers,

I am looking for some help getting NXLog connected to Symantec MSS (Managed Security Services), and I am nearly at the end of my rope with this. Right now I am getting the following error and was wondering what I am missing. I am using this as my conf file:

define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension syslog>
Module xm_syslog
</Extension>
<Input internal>
Module im_internal
</Input>
<Input in>
Module im_msvistalog
# For windows 2003 and earlier use the following:
# Module im_mseventlog
</Input>
<Output out>
Module om_udp
Host (IP Address has been removed)
Port 514
Exec to_syslog_snare();
</Output>
<Route 1>
Path eventlog, in => out
</Route>
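One mismatch is visible in the config itself: the route's Path names an input called eventlog, but the inputs defined above are internal and in. A route referencing the inputs that actually exist would read:

```
<Route 1>
Path internal, in => out
</Route>
```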

I have also attached my log file; if you need any more information, let me know.

Any help on this would be amazing and help me out a ton.

Thank you NXLog helpers, you guys/gals will save my day and be amazing.


Alex.Gregor created
DateTime format conversion

Hi all

I'm trying to forward logs to my Graylog server using nxlog, and it's working fine, except for one minor problem which I've been unable to fix:

The date/time format in the log is as follows:

2016/03/17    07:06:27 AM     Message

I have been able to extract the date into $1 and the time into $2 with regex (and the message into $3) without issue. However, I'm unable to parse the combination of the two as a date and get it into 24-hour format using parsedate or strptime.

Any ideas how I can populate $EventTime with the date plus 24-hour time from the above? Everything I try seems to result in the field being undefined.
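A sketch of one way to do this with strptime(), assuming capture groups like those already described (date, 12-hour time with AM/PM marker, message); the file path and the exact regex are illustrative, not taken from the post:

```
<Input in>
    Module im_file
    File   'C:\\logs\\app.log'   # illustrative path
    Exec   if $raw_event =~ /^(\S+)\s+(\d{1,2}:\d{2}:\d{2} [AP]M)\s+(.*)$/ \
           { \
               # %I is the 12-hour clock and %p the AM/PM marker; the
               # result is a datetime, which later formatting can emit
               # in 24-hour form
               $EventTime = strptime($1 + " " + $2, "%Y/%m/%d %I:%M:%S %p"); \
               $Message   = $3; \
           }
</Input>
```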

Thanks


Ascendo created
Replies: 1
Recursive file_remove

Is there any way to recursively delete files with file_remove?

I have applications logging in the following structure:

  • D:\Applogs\App1\Access-3172016.log
  • D:\Applogs\App2\Access-3162016.log

We're able to define an input and collect the logs no problem with the following definition:

<Input Access_Logs>

    Module    im_file
    
    File    "D:\\AppLogs\\Access*.log"
    
    Recursive TRUE
    
    ...

</Input>

The number and variety of apps is large, so ideally I would want to specify the scheduled deletion with the following:

<Extension fileop>
   Module      xm_fileop
   <Schedule>
        Every     1 min
        Exec    file_remove('D:\\AppLogs\\Access-*.log', (now() - 259200 ));
   </Schedule>
</Extension>

But this looks only in the directory specified; it appears you cannot recursively search the directory hierarchy for files like you can in the input.

Is there any way I can get this functionality?
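If file_remove itself does not recurse, one workaround (a sketch, not a confirmed NXLog feature) is to shell out on a schedule with xm_exec; on Windows, forfiles walks subdirectories with /S and filters by age with /D:

```
<Extension exec>
    Module xm_exec
    <Schedule>
        Every 1 min
        # Delete Access-*.log files older than 3 days, recursively
        Exec  exec_async('cmd.exe', '/c', \
                  'forfiles /P "D:\AppLogs" /S /M Access-*.log /D -3 /C "cmd /c del @path"');
    </Schedule>
</Extension>
```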


Jakauppila created
Replies: 1
ODBC ASSERTION FAILED

Hello everyone! 

When I use im_odbc, I get an error in the log file:

2016-03-17 11:57:38 INFO nxlog-2.10.1577-trial started
2016-03-17 11:57:45 ERROR ### ASSERTION FAILED at line 480 in im_odbc.c/im_odbc_execute(): "imconf->results.num_col > 0" ###

Does anybody know what this means and what I should do?
I am using nxlog-2.10.1577 with the following config:
 

define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
LogLevel INFO

<Extension gl>
    Module   xm_gelf
</Extension>

# INPUTS ------------------------------------------------------

<Input in6>

    Module im_odbc
    
    ConnectionString DSN=PROD; database=BACAS_SRV;
    PollInterval 250
    
    # EXCEPTIONS IN DATA 
    SQL {CAll dbo.uspGetLoggingExtendedUpperLakes}
    SavePos TRUE
    
    Exec $Hostname = "UpperLakes";
    Exec $SourceName = "UpperLakes";
</Input>

# OUTPUTS ------------------------------------------------------

<Output out6>

    Module om_udp
    Host 192.168.9.25
    Port 4449
    OutputType GELF_UDP
</Output>

# ROUTES ------------------------------------------------------

<Route 6>

    Priority    6
    Path in6 => out6
</Route>
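The assertion text ("imconf->results.num_col > 0") suggests the executed statement returned no result columns, which can happen when a stored procedure emits row counts or no result set at all. A sketch of the plain-SELECT shape im_odbc expects, with an id column for position tracking; the table and column names here are invented placeholders, not from the post:

```
<Input in6>
    Module  im_odbc
    ConnectionString DSN=PROD; database=BACAS_SRV;
    SavePos TRUE
    # Placeholder table/columns; im_odbc tracks position via the id column
    SQL SELECT RecordNumber AS id, EventTime, Message \
        FROM dbo.LoggingExtended WHERE RecordNumber > ?
</Input>
```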

 


Marazm created
Transferring Windows Server 2012R2 DNS logs via nxlog over TLS to an ELK stack on Debian

Hello everyone!

I'm new to the forum, and I'm turning to you because I have a problem viewing my DNS logs in my ELK stack.

Here is my problem: I have a Windows Server 2012R2 VM with nxlog on it. Its configuration file is the following:

define ROOT C:\Program Files (x86)\nxlog
define CERTDIR %ROOT%\cert
 
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
 
<Extension _json>
    Module      xm_json
</Extension>
 
<Input dnslog>
    Module      im_file
    File        "C:\\dns-log.log"
    InputType    LineBased
    Exec $Message = $raw_event;
    SavePos TRUE
</Input>
 
<Output out>
    Module      om_ssl
    Host        IP_DU_SERVEUR_LOGSTASH
    Port        PORT_DU_SERVEUR_LOGSTASH
    CAFile      %CERTDIR%\logstash-forwarder.crt
    Exec        $EventReceivedTime = integer($EventReceivedTime) / 1000000; to_json();
</Output>
 
<Route 1>
    Path        dnslog => out
</Route>

And when I start it: (attached log, 3.5 KB)

My ELK stack runs on Debian. These are the config files:

input {
tcp {
  codec =>line { charset => CP1252 }
         port => PORT_DU_SERVEUR_LOGSTASH
  ssl_verify => false
  ssl_enable => true
  ssl_cert => "/etc/pki/tls/certs/logstash-forwarder.crt"
  ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  type => "nxlog"
}
}
filter {
if [type] == "nxlog" {
  grok {
   match => [ "message", "(?<date_n_time_us>%{DATE_US} %{TIME} (?:AM|PM))%{SPACE}%{WORD:dns_thread_id}%{SPACE}%{WORD:dns_context}%{SPACE}%{WORD:dns_internal_packet_identifier}%{SPACE}%{WORD:dns_protocol}%{SPACE}%{WORD:dns_direction}%{SPACE}%{IP:dns_ip}%{SPACE}%{WORD:dns_xid}%{SPACE}(?<dns_query_type>(?:Q|R Q))%{SPACE}[%{NUMBER:dns_flags_hex}%{SPACE}%{WORD:dns_flags_chars}%{SPACE}%{WORD:dns_response_code}]%{SPACE}%{WORD:dns_question_type}%{SPACE}%{GREEDYDATA:dns_question_name}" ]
  }
}
}
output {
elasticsearch {
  hosts => ["localhost:9200"]
  sniffing => true
  manage_template => false 
  index => "%{[@metadata][nxlog]}-%{+YYYY.MM.dd}"
  document_type => "%{[@metadata][type]}"
}
stdout {
  codec => rubydebug
}
}

Issue: I cannot view my DNS logs in Kibana, nor configure a dashboard. I'm not sure about my Logstash configuration files, especially the "filter" and "output" sections. However, when I run ngrep INTERFACE -d -t -W byline on my Debian box, I see queries that appear to come from my Windows Server, so my logs are being received. Could you help me?

Thank you very much for your time! And sorry for my English writing...


OncleThorgal created
Replies: 1
How to fix "apr_sockaddr_info failed" and "not functional without input modules" for Splunk SIEM

1)

2016-03-11 12:03:01 ERROR apr_sockaddr_info failed for 192.168.1.253:514;The requested name is valid, but no data of the requested type was found.  

 

2)

2016-03-11 13:21:37 ERROR module 'in' is not declared at C:\Program Files (x86)\nxlog\conf\nxlog.conf:43
2016-03-11 13:21:37 ERROR route 1 is not functional without input modules, ignored at C:\Program Files (x86)\nxlog\conf\nxlog.conf:43
2016-03-11 13:21:37 WARNING no routes defined!
2016-03-11 13:21:37 WARNING not starting unused module internal
2016-03-11 13:21:37 WARNING not starting unused module out
2016-03-11 13:21:37 INFO nxlog-ce-2.9.1504 started

 

My nxlog.conf file

 

#define ROOT C:\Program Files\nxlog
define ROOT C:\Program Files (x86)\nxlog

Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log

<Extension syslog>

    Module xm_json

</Extension>

<Input internal>

    Module im_internal

</Input>

 

<Output out>

    Module om_tcp
    Host 192.168.253.134
    Port 9001
    Exec _json();
    

</Output>

<Route 1>

    Path   in => out
    
  </Route>

 

I have configured a receiving port on the Splunk server, which is 9001, and my Splunk server IP is 192.168.253.134.

I have set the receiving port on my Splunk server and am trying to get Windows 7 logs into it using these nxlog configurations, but I am getting the two errors above and am not able to interpret them. I would appreciate it if anyone has an answer for both.
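The second group of errors reads literally: route 1 refers to an input named in, but no such input is declared in the config above (only internal is). Declaring it is enough to make the route functional; a minimal sketch:

```
<Input in>
    Module im_msvistalog
</Input>
```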

 

Thanks!!


Deval.Khatri created
Replies: 1
Using NXLog as a server and filtering output log files by hostname

I am working on a centralised logging server, using nxlog both as a client on a Windows machine and as a server on RHEL 7. I want to filter the incoming logs by Hostname and SourceName.

The hostname should be used to create a folder /var/log/$Hostname/, and the filename should use the SourceName, like /var/log/$Hostname/$SourceName.log.

So the nxlog server should create the folder and file using $Hostname and $SourceName respectively.

 

Please help me with a config file for this.
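Not a full answer, but a sketch of the server side: om_file's File directive takes a string expression, so the path can be built from record fields, and CreateDir TRUE creates missing directories. This assumes the incoming records are parsed (e.g. with parse_syslog()) so that $Hostname and $SourceName are populated:

```
<Extension syslog>
    Module xm_syslog
</Extension>

<Input in>
    Module im_tcp
    Host   0.0.0.0
    Port   514
    Exec   parse_syslog();
</Input>

<Output byhost>
    Module    om_file
    CreateDir TRUE
    # Path built per record from the parsed fields
    File      '/var/log/' + $Hostname + '/' + $SourceName + '.log'
</Output>

<Route r1>
    Path in => byhost
</Route>
```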


ankit3sharma created
How to convert local time to UTC before sending logs to Logstash

I have the following output config:

 

<Output out>
    Module      om_tcp
    Host        10.36.52.62
    Port        12201
    Exec        $EventTime = strftime($EventTime, '%Y-%m-%d %H:%M:%S %Z'); \
                to_json();
</Output>

This sends EventTime in the local time zone of the server. Here is how it looks on the Logstash side:

{
             "message" => "{\"EventTime\":\"2016-03-03 03:07:29 Central Standard Time\",\"EventTimeWritten\":\"2016-03-03 03:07:29\",\"Hostname\":\"testwin2012\",\"EventType\":\"INFO\",\"SeverityValue\":2,\"Severity\":\"INFO\",\"SourceName\":\"Service Control Manager\",\"FileName\":\"System\",\"EventID\":7036,\"CategoryNumber\":0,\"RecordNumber\":34297,\"Message\":\"The nxlog service entered the running state. \",\"EventReceivedTime\":\"2016-03-03 03:07:30\",\"SourceModuleName\":\"eventlog\",\"SourceModuleType\":\"im_mseventlog\"}\r",
            "@version" => "1",
          "@timestamp" => "2016-03-03T09:07:34.479Z",
                "host" => "testwin2012",
                "port" => 49632,
                "type" => "windows",
           "EventTime" => "2016-03-03 03:07:29 Central Standard Time",
    "EventTimeWritten" => "2016-03-03 03:07:29",
       "SeverityValue" => 2,
            "Severity" => "INFO",
          "SourceName" => "Service Control Manager",
            "FileName" => "System",
             "EventID" => 7036,
      "CategoryNumber" => 0,
        "RecordNumber" => 34297,
             "Message" => "The nxlog service entered the running state. "
}

 

I have to do a lot of expensive operations in Logstash to convert the timestamp into UTC: I have to convert "Central Standard Time" to Joda, which requires me to take that string, put it into a separate field, prepare a dictionary, use an expensive translate operation on that new field, and put it back into the timestamp field. Is there any way to make nxlog convert the EventTime field into UTC before sending?
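One option, sketched rather than guaranteed: NXLog stores datetime values internally as microseconds since the epoch (UTC), so sending the raw integer sidesteps the zone-name problem entirely; another post on this page uses the same integer($EventReceivedTime) / 1000000 trick. Logstash's date filter can then parse the value with its UNIX pattern:

```
<Output out>
    Module om_tcp
    Host   10.36.52.62
    Port   12201
    # Epoch seconds, zone-independent; parse on the Logstash side
    # with a date filter using the UNIX pattern
    Exec   $EventTimeEpoch = integer($EventTime) / 1000000; \
           delete($EventTime); \
           to_json();
</Output>
```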


achechen created
Replies: 1
How to drop the incoming logs based on the severity

I am fairly new to nxlog and am looking for help completing my task: how do I drop log messages based on log level (severity)? The incoming log messages have different levels (debug, info, warning, error, critical). For example, if I set the severity to warning, nxlog should drop info and debug messages. Please provide some nxlog.conf examples.
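A sketch for the syslog case, assuming the logs arrive over TCP and parse_syslog() sets $SyslogSeverityValue (0 = emergency through 7 = debug, so "warning and worse" means values of 4 or less):

```
<Extension syslog>
    Module xm_syslog
</Extension>

<Input in>
    Module im_tcp
    Host   0.0.0.0
    Port   514
    # Keep warning (4) and more severe; drop notice, info and debug
    Exec   parse_syslog(); \
           if $SyslogSeverityValue > 4 drop();
</Input>
```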

Thanks for the help in advance.


arun.dharan created
Replies: 1
Specific windows event 1102 not getting UserData

Hi,

We have the following configuration for event id 1102 (eventlog cleared):

<Input clearev>
    Module      im_msvistalog
    Query   <QueryList>\
                <Query Id="3">\
                    <Select Path="Security">*[System[(EventID=1102)]]</Select>\
                </Query>\
            </QueryList>
    Exec delete($Message);
    Exec $Message = to_json();
    Exec $SyslogFacilityValue = 17; $SyslogSeverityValue = 6;
</Input>

The received message is like that:

Feb 29 10:37:17 XXXXXXXX.sdsd.local Microsoft-Windows-Eventlog[1004]: {"EventTime":"2016-02-29 10:37:17","Hostname":"XXXXXXXX.sdsd.local","Keywords":4620693217682128896,"EventType":"INFO","SeverityValue":2,"Severity":"INFO","EventID":1102,"SourceName":"Microsoft-Windows-Eventlog","ProviderGuid":"{FC65DDD8-D6EF-4962-83D5-6E5CFE9CE148}","Version":0,"Task":104,"OpcodeValue":0,"RecordNumber":124745,"ProcessID":1004,"ThreadID":7792,"Channel":"Security","Category":"Effacement de journal","Opcode":"Informations","EventReceivedTime":"2016-02-29 10:37:18","SourceModuleName":"clearev","SourceModuleType":"im_msvistalog"}

As you can see the SubjectUserName information is missing.

But if we look at the detailed view in the Event Viewer, we can find the information in the XML data:

  <Provider Name="Microsoft-Windows-Eventlog" Guid="{fc65ddd8-d6ef-4962-83d5-6e5cfe9ce148}" />
  <EventID>1102</EventID>
  <Version>0</Version>
  <Level>4</Level>
  <Task>104</Task>
  <Opcode>0</Opcode>
  <Keywords>0x4020000000000000</Keywords>
  <TimeCreated SystemTime="2016-02-29T09:37:17.602206200Z" />
  <EventRecordID>124745</EventRecordID>
  <Correlation />
  <Execution ProcessID="1004" ThreadID="7792" />
  <Channel>Security</Channel>
  <Computer>XXXXXXXX.sdsd.local</Computer>
  <Security />
  </System>
  <UserData>
  <LogFileCleared xmlns:auto-ns3="http://schemas.microsoft.com/win/2004/08/events" xmlns="http://manifests.microsoft.com/win/2004/08/windows/eventlog">
  <SubjectUserSid>S-1-5-21-1659004503-179605362-725345543-5237</SubjectUserSid>
  <SubjectUserName>myuser</SubjectUserName>
  <SubjectDomainName>SDSD</SubjectDomainName>
  <SubjectLogonId>0xa5c77</SubjectLogonId>
  </LogFileCleared>
  </UserData>
  </Event>

 

How could we get this information through the JSON format? Do we have to develop something for a specific XML view, and if so, how can we do that?

 

Please let me know.

 

Kind regards,


system0845 created
Replies: 1
Log on papertrailapp from Windows 10

I have changed the conf file as described by Papertrail, but I don't receive any logs from Windows 10. I have stopped and started the service, but nothing is received.


hamdy.aea created
Filter out all messages, but the ones we want

Hello,

I have a config that I thought would work, but it does not. I would like the syslog service to send only specific messages it finds in the log file, and to ignore all others rather than sending them to the syslog server. Here is the config I currently have, but it seems to be sending everything. Any help would be great.

<Input watchfile_m_LOGFILENAME>

  Module im_file
  File 'C:\\logs\\log.log'
  Exec $Message = $raw_event;
  Exec if $raw_event =~ /has failed/ $SyslogSeverityValue = 3;
  Exec if $raw_event =~ /Rx error in packet/ $SyslogSeverityValue = 3;
  Exec if $raw_event =~ /LossCounter non zero in packet/ $SyslogSeverityValue = 3;
  Exec $SyslogSeverityValue = 6;
  Exec if file_name() =~ /.*\\(.*)/ $SourceName = $1;

</Input>
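As written, the block sets a severity on the interesting lines but never drops the rest, and the unconditional $SyslogSeverityValue = 6 then overwrites the 3 set above it. A sketch that forwards only the matching lines and drops everything else:

```
<Input watchfile_m_LOGFILENAME>
    Module im_file
    File   'C:\\logs\\log.log'
    Exec   $Message = $raw_event;
    # Forward only the matching lines; drop everything else
    Exec   if $raw_event =~ /has failed|Rx error in packet|LossCounter non zero in packet/ \
               $SyslogSeverityValue = 3; \
           else \
               drop();
    Exec   if file_name() =~ /.*\\(.*)/ $SourceName = $1;
</Input>
```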

Thank you,

Yury


yman182 created
Replies: 1