Hi,
I'm using the Community Edition, where the xm_w3c module is not available, so I'm getting the error below:
2019-04-09 14:59:30 ERROR Failed to load module from C:\Program Files (x86)\nxlog\modules\extension\xm_w3c.dll, The specified module could not be found. ; The specified module could not be found.
2019-04-09 14:59:30 ERROR Invalid InputType 'w3c_parser' at C:\Program Files (x86)\nxlog\conf\nxlog.conf:94
Is there a way to overcome this error in CE by downloading that particular module, or should I try the Enterprise Edition?
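For reference, one common workaround in CE is to parse W3C-format files with xm_csv instead of xm_w3c. Below is only a sketch: it assumes an IIS-style field set, and the Fields list and file path are placeholders that must be adjusted to match the #Fields header of the actual log.
<Extension w3c_csv>
Module xm_csv
Fields $date, $time, $s-ip, $cs-method, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $sc-status, $time-taken
Delimiter ' '
</Extension>
<Input w3c_in>
Module im_file
File "C:\logs\u_ex*.log"
# Header/comment lines in W3C files start with '#'; drop them before parsing
Exec if $raw_event =~ /^#/ drop(); else w3c_csv->parse_csv();
</Input>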
Sangeetha created
Hi, I'm having an issue with my NXLog server. We are trying to send antivirus logs from a McAfee ePO server to NXLog. The problem we are facing is that when we run a connection test from ePO to NXLog, we get this message on our NXLog server:
2019-04-09 19:32:54 INFO SSL connection accepted from 10.28.26.214:59126
2019-04-09 19:32:54 ERROR SSL error, SSL_ERROR_SSL: retval -1, reason: peer did not return a certificate
2019-04-09 19:32:54 WARNING SSL connection closed from 10.28.26.214:59126
Can we receive the AV logs without using a client certificate? Do you know a way to bypass this? The certificate was created with OpenSSL with the help of one of your technicians and it looks good... we are having some difficulty understanding why this operation fails. We have also installed the certificate we created for NXLog on our antivirus server so they can communicate. Do you have any idea what the problem is? Your help is very much appreciated again.
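For what it's worth, a minimal im_ssl sketch that does not insist on a client certificate is shown below. This is only a sketch: it assumes the RequireCert and AllowUntrusted directives are supported by the NXLog version in use, and the host, port and certificate paths are placeholders; whether relaxing client certificate validation is acceptable is a separate policy question.
<Input ssl_in>
Module im_ssl
Host 0.0.0.0
Port 6514
CertFile C:\Program Files (x86)\nxlog\cert\server-cert.pem
CertKeyFile C:\Program Files (x86)\nxlog\cert\server-key.pem
CAFile C:\Program Files (x86)\nxlog\cert\ca-cert.pem
# Do not require the peer (the ePO server) to present a certificate
RequireCert FALSE
# Accept certificates that cannot be verified against the CA
AllowUntrusted TRUE
</Input>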
Greetings,
MaxiTremblaycgi created
Is the scalability / performance of community vs enterprise edition any different?
We tried the community edition for WEC/WEF and it appears to be dropping logs at 2000 eps.
We're wondering if there are any configuration settings we should be aware of.
Moreover, please provide sizing recommendations:
> What eps can a single nxlog agent support for WEC/WEF collection?
> How many VMs, and of what size (CPU cores and GB of memory), should we plan for to support 50,000 eps?
mshakir created
Is there a way to run NXLog in a "throttled" state during certain times of the day?
For instance, process x number of logs per hour from 8 to 5.
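For context, a sketch of one building block that might help is pm_blocker, which can pause and resume a route on a schedule (it does not enforce an exact events-per-hour rate). The module names, route, and times below are placeholders, and events arriving while blocked would need to be buffered upstream.
<Processor blocker>
Module pm_blocker
<Schedule>
    # Unblock at 08:00 so events flow during working hours
    When 0 8 * * *
    Exec blocker->block(FALSE);
</Schedule>
<Schedule>
    # Block again at 17:00
    When 0 17 * * *
    Exec blocker->block(TRUE);
</Schedule>
</Processor>
<Route throttled>
Path in => blocker => out
</Route>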
Deleted user created
Hi.
We are facing a problem where NXLog uses a lot of memory when collecting logs from a PostgreSQL database. I tried modifying the polling interval in the config, but it did not help.
Our config is like this now:
<Input PostgreSQL>
Module im_dbi
Driver pgsql
SavePos false
PollInterval 5
Option host 127.0.0.1
Option username *****
Option password **************
Option dbname messagelog
SQL SELECT id, discriminator, time, queryid, message, timestamprecord, response, memberclass, membercode, subsystemcode FROM logrecord
Exec $SourceName = 'PostgreSQL';
Exec to_json();
</Input>
<Output out>
Module om_tcp
Host 192.168.1.1
Port 1468
Exec to_syslog_ietf();
</Output>
<Route 1>
Path PostgreSQL => out
</Route>
I enabled debug and it produces a huge amount of these lines per second:
2019-04-09 15:00:57 DEBUG worker 0 processing event 0x7f67240a6d80
2019-04-09 15:00:57 DEBUG PROCESS_EVENT: POLL (out)
2019-04-09 15:00:57 DEBUG nx_module_pollset_poll: out
2019-04-09 15:00:57 DEBUG worker 2 got signal for new job
2019-04-09 15:00:57 DEBUG worker 2 got no event to process
2019-04-09 15:00:57 DEBUG worker 2 waiting for new event
2019-04-09 15:00:57 DEBUG [out] no poll events, pollset_poll timed out
2019-04-09 15:00:57 DEBUG nx_event_to_jobqueue: POLL (out)
2019-04-09 15:00:57 DEBUG event added to jobqueue
2019-04-09 15:00:57 DEBUG worker 0 processing event 0x7f67240a4fb0
2019-04-09 15:00:57 DEBUG PROCESS_EVENT: POLL (out)
2019-04-09 15:00:57 DEBUG nx_module_pollset_poll: out
2019-04-09 15:00:57 DEBUG worker 1 got signal for new job
2019-04-09 15:00:57 DEBUG worker 1 got no event to process
2019-04-09 15:00:57 DEBUG worker 1 waiting for new event
2019-04-09 15:00:57 DEBUG [out] no poll events, pollset_poll timed out
2019-04-09 15:00:57 DEBUG nx_event_to_jobqueue: POLL (out)
2019-04-09 15:00:57 DEBUG event added to jobqueue
2019-04-09 15:00:57 DEBUG worker 0 processing event 0x7f67240a6d80
2019-04-09 15:00:57 DEBUG PROCESS_EVENT: POLL (out)
2019-04-09 15:00:57 DEBUG nx_module_pollset_poll: out
2019-04-09 15:00:57 DEBUG worker 2 got signal for new job
2019-04-09 15:00:57 DEBUG worker 2 got no event to process
2019-04-09 15:00:57 DEBUG worker 2 waiting for new event
2019-04-09 15:00:57 DEBUG [out] no poll events, pollset_poll timed out
2019-04-09 15:00:57 DEBUG nx_event_to_jobqueue: POLL (out)
2019-04-09 15:00:57 DEBUG event added to jobqueue
2019-04-09 15:00:57 DEBUG worker 0 processing event 0x7f67240a4fb0
We are not seeing this memory usage problem with other log collection methods. Any ideas what could be causing it?
PID   USER   PR  NI  VIRT     RES   SHR   S  %CPU  %MEM  TIME+     COMMAND
21239 root   20  0   162120   2396  1600  R  0.3   0.1   0:00.04   top
28670 nxlog  20  0   1589652  1.3g  3676  S  0.3   34.4  14:33.41  nxlog
JaVa created
I hope someone can help me. I need to pick up application logs from a few MSSQL database tables. Three of the four work perfectly; there are no issues getting the data, but one table is causing problems.
Below you can see the input I am using. As you can see, one column name contains a Danish character; however, I am not sure whether this alone is causing the trouble. The error I receive is:
ERROR SQLExecDirect failed, 42000:2:102:[Microsoft][SQL Server Native Client 11.0][SQL Server]Incorrect syntax near 'Ą'.; 42000:3:319:[Microsoft][SQL Server Native Client 11.0][SQL Server]Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a s (odbc error code: -1)
However, if I remove this column from the SELECT, another error occurs:
ERROR id column not found or its type is unsupported.
I received a similar error when I left some extra spaces in the query, but after that I used Notepad++ to make sure I didn't leave any hidden characters.
=========================================================
<Input odbc_besked>
Module im_odbc
ConnectionString Driver={SQL Server Native Client 11.0};Server=SERVERNAME\mssql2014,port;Database=database_name;Trusted_Connection=yes;
SQL SELECT BeskedType, \
Oprettet, \
InternBeskedId, \
SvarPåInternBeskedId, \
Retning, \
BeskedArt, \
SoapBeskedId, \
BeskedId, \
HttpStatusCode, \
Fra, \
Til, \
Failover, \
EnvelopeStartTag, \
SoapHeaderElement, \
IndeholderSoapFault, \
Stack \
FROM table_name WITH (NOLOCK) WHERE InternBeskedId > ?
</Input>
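For what it's worth, a variant of the SELECT I would try is sketched below. It is only a sketch: it assumes the module expects the position-tracking column to be returned under the name id (hence the AS id alias), uses T-SQL bracket quoting around the identifier that contains the Danish character, and assumes the config file itself is saved in an encoding that preserves that character (e.g. UTF-8).
<Input odbc_besked>
Module im_odbc
ConnectionString Driver={SQL Server Native Client 11.0};Server=SERVERNAME\mssql2014,port;Database=database_name;Trusted_Connection=yes;
SQL SELECT InternBeskedId AS id, \
    BeskedType, \
    Oprettet, \
    [SvarPåInternBeskedId], \
    Retning, \
    BeskedArt, \
    SoapBeskedId, \
    BeskedId, \
    HttpStatusCode, \
    Fra, \
    Til, \
    Failover, \
    EnvelopeStartTag, \
    SoapHeaderElement, \
    IndeholderSoapFault, \
    Stack \
    FROM table_name WITH (NOLOCK) WHERE InternBeskedId > ?
</Input>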
Lauxna created
define ROOT C:\Program Files (x86)\nxlog
define ROOT_STRING C:\Program Files (x86)\nxlog
define CERTDIR %ROOT%\cert
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension charconv>
Module xm_charconv
AutodetectCharsets utf-8, euc-jp, utf-16, utf-32, iso8859-2
</Extension>
<Extension fileop>
Module xm_fileop
</Extension>
<Input im_file>
Module im_file
File "C:\inetpub\vhost\Auth\logs\nlog-test.log"
SavePos TRUE
Exec $raw_event = '[xxxxxxxxxxxxxxxxxxx]' + $raw_event;
Exec $Message = $raw_event;
</Input>
<Output out>
Module om_tcp
Host listener.logz.io
Port 8010
</Output>
<Route route>
Path im_file => out
</Route>
manzur.shaikh created
Hi, I am collecting events with im_msvistalog (event IDs 7001 and 7002, logon and logoff). I would like to know which field I can use to identify the user who logged on or off, because $UserID returns a SID.
Thank you.
Altair.Pa created
Hi,
I'm trying to transfer two different CSV files, each with a different set of columns, to a destination, and I'm trying to define both in one config file. I would like to know how to define the Extension module for this scenario, since xm_csv would be shared across the whole file. Is there any option to use more than one xm_csv extension instance, one specific to each file?
Sample: Is this possible in one config file?
<Extension csv_parser>
Module xm_csv
Fields A,B,C,D
Delimiter ,
</Extension>
<Extension csv_parser>
Module xm_csv
Fields E,F,G
Delimiter ,
</Extension>
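For comparison, a sketch of the same idea with two distinct instance names (the names, fields and delimiter are placeholders; each Input would then invoke its own parser, e.g. Exec csv_parser_a->parse_csv();):
<Extension csv_parser_a>
Module xm_csv
Fields A, B, C, D
Delimiter ,
</Extension>
<Extension csv_parser_b>
Module xm_csv
Fields E, F, G
Delimiter ,
</Extension>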
Sangeetha created
I got the Okta add-on as part of a trial, but when I try to run NXLog using the config below, it doesn't show any data in the output file. Please advise.
Panic Soft
#NoFreeOnExit TRUE
define ROOT C:\Program Files (x86)\nxlog
define CERTDIR %ROOT%\cert
define CONFDIR %ROOT%\conf
define LOGDIR %ROOT%\data
define LOGFILE %LOGDIR%\nxlog.log
LogFile %LOGFILE%
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogLevel DEBUG
NoCache True
<Extension _json>
Module xm_json
</Extension>
<Extension _syslog>
Module xm_syslog
</Extension>
<Input okta>
Module im_exec
Command C:\Program files (x86)\nxlog-okta\nxlog-okta.exe
Exec parse_syslog();
<Exec>
    parse_syslog();
    parse_json($Message);
</Exec>
</Input>
<Output file>
Module om_file
File 'C:\syslog\o0.log'
Exec to_json();
</Output>
<Route r>
Path okta => file
</Route>
The nxlog.log file contains the following:
2019-04-04 11:10:50 DEBUG new event in event_thread [okta:READ]
2019-04-04 11:10:50 DEBUG nx_event_to_jobqueue: READ (okta)
2019-04-04 11:10:50 DEBUG event added to jobqueue
2019-04-04 11:10:50 DEBUG no events or no future events, event thread sleeping in condwait
2019-04-04 11:10:50 DEBUG worker 1 got signal for new job
2019-04-04 11:10:50 DEBUG worker 1 processing event 0x1dcf30
2019-04-04 11:10:50 DEBUG PROCESS_EVENT: READ (okta)
2019-04-04 11:10:50 DEBUG im_exec_add_read_event with delay 1000000
2019-04-04 11:10:50 DEBUG got EAGAIN
2019-04-04 11:10:50 DEBUG worker 1 waiting for new event
2019-04-04 11:10:50 DEBUG new event in event_thread [okta:READ]
2019-04-04 11:10:50 DEBUG future event, event thread sleeping 1000000ms in cond_timedwait
Divya created
Hi,
I'm new to NXLog. I have recently installed NXLog on my local system and tried to copy the NXLog log file, which is in the default location, to another location on my machine. Is this feasible using NXLog?
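For reference, a sketch of one way this is often done with xm_fileop and a Schedule block; the interval and the source and destination paths below are placeholders:
<Extension fileop>
Module xm_fileop
<Schedule>
    Every 1 hour
    # Copy the NXLog log file to another location (paths are examples only)
    Exec file_copy('C:\Program Files (x86)\nxlog\data\nxlog.log', 'C:\backup\nxlog.log');
</Schedule>
</Extension>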
Sangeetha created
We are using NXLog to relay logs from ModSecurity to AlienVault. The transfer is working but NXLog is adding time and date to the beginning of every line. This is stopping AlienVault from processing the data properly. Is there a way for us to stop NXLog from modifying the sent logs?
Bauer3139 created
Is there any way to split very long log messages in half or into smaller portions? We are currently forwarding logs with NXLog to a SIEM system that has an 8 KB limit on messages, and anything beyond that limit is truncated, which we don't want. I tried reading the manual but did not find anything related to my problem. Help, please?
JaVa created
I was trying to convert JSON to syslog (Okta logs are the JSON source), but I couldn't convert the Okta logs to syslog and copy the converted logs to a .txt file, as I was getting this: Module in2 got EOF from C:\Users\user\output.txt DEBUG got EOF for C:\Users\user\output.txt. Please help me resolve this. My nxlog config file:
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
NoCache TRUE
LogLevel DEBUG
<Extension json>
Module xm_json
</Extension>
<Extension syslog>
Module xm_syslog
</Extension>
<Input in2>
Module im_file
File 'C:\Users\user\output.txt'
SavePos TRUE
ReadFromLast TRUE
PollInterval 1
Exec $Message = $to_json; $SyslogFacilityValue = 22;
</Input>
<Output out>
Module om_file
File 'C:\syslog\Sysoutput.txt'
Exec to_syslog_bsd();
</Output>
<Route r>
Path in2 => out
</Route>
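For context, a sketch of how the input and output pieces might fit together, assuming each line of the source file is a single JSON record; the file paths are the same placeholders as above, and parse_json() is used on the incoming line instead of assigning $to_json:
<Input in2>
Module im_file
File 'C:\Users\user\output.txt'
SavePos TRUE
PollInterval 1
# Parse the JSON line into fields; keep the original line as the syslog message
Exec parse_json(); $Message = $raw_event; $SyslogFacilityValue = 22;
</Input>
<Output out>
Module om_file
File 'C:\syslog\Sysoutput.txt'
Exec to_syslog_bsd();
</Output>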
Divya created
Good afternoon. I was hoping someone might be able to assist me with an issue I am having sending my IIS logs in W3C format to Graylog. The W3C time is in UTC by default. When NXLog sends the logs to my Graylog server, the events appear to be four hours old because I am in EST while the IIS logs are in UTC. Is there something I can do in the configuration so NXLog ships logs with the current time?
<Extension w3c>
Module xm_csv
Fields $date, $time, $s-ip, $cs-method, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $csUser-Agent, $cs-Referer, $cs-host, $sc-status, $sc-substatus, $sc-win32-status, $time-taken
FieldTypes string, string, string, string, string, string, integer, string, string, string, string, string, string, string, string, integer
Delimiter ' '
QuoteChar '"'
EscapeControl FALSE
UndefValue -
</Extension>
<Input iis> Module im_file File "C:\inetpub\logs\LogFiles\\u_ex*" SavePos TRUE
Exec if $raw_event =~ /^#/ drop(); \
else \
{ \
w3c->parse_csv(); \
$EventTime = parsedate($date + " " + $time); \
$SourceName = "Server"; \
$Message = to_json(); \
}
</Input>
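One adjustment that might address the UTC offset is sketched below; it would replace the Exec directive in the iis input above. It is only a sketch and assumes parsedate() in this NXLog version accepts an ISO 8601 timestamp with a trailing Z, which marks the value as UTC instead of local time.
<Exec>
    if $raw_event =~ /^#/ drop();
    else
    {
        w3c->parse_csv();
        # The trailing "Z" marks the W3C date/time as UTC rather than local time
        $EventTime = parsedate($date + "T" + $time + "Z");
        $SourceName = "Server";
        $Message = to_json();
    }
</Exec>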
Thanks in advance.
tnacnud1 created
Hi Everyone,
Can someone help me read logs from a Sybase DB?
We have two instances of Sybase: one on Windows and the other on Linux. I want to forward these logs over syslog.
Thanks in advance!!!
amol_more created
The nxlog service will not start up; I get error 1053, "The service did not respond to the start or control request in a timely fashion." No log file is written to the logging directory, so I can't troubleshoot any further, and there is nothing in the event logs that helps. Does anyone have experience running this on Windows 2000?
Using the following in the config:
<Extension _syslog>
Module xm_syslog
</Extension>
<Input eventlog>
Module im_mseventlog
</Input>
#define PROCESSORS
<Processor p_transform>
Module pm_transformer
Exec $Hostname = hostname();
OutputFormat syslog_rfc5424
</Processor>
sleachy created
Since there's no support for floating point data types in nxlog, given a log entry that contains numbers with decimal points, is the best option to convert them to fixed point integers?
For example, given a field $value = "123.45" (a string) extracted from a log line using a regex or xm_kvp, if I go directly to to_json(), I end up with a string in JSON.
I don't see any way to put the value into JSON without quotes. Am I correct?
One workaround is to convert the value to a fixed point integer, choosing a specific precision. For example, if I were to choose to always store my values * 100, I could do the following:
if $value =~ /^(-)?([0-9]+)\.?([0-9]+)?$/
{
if not defined($3) or size($3) == 0 $value = integer($2) * 100;
else if size($3) == 1 $value = integer($2) * 100 + integer($3) * 10;
else $value = integer($2) * 100 + integer(substr($3,0,2));
if $1 == '-' $value = $value * -1;
}
Given $value = "123.45", "123", "123.4567", "123.4", or "123.", this code will assign the correct, 2-place integer value.
Is this the best current approach to converting a string representation of a floating point value to something that will result in a non-quoted value in JSON?
Thank you!
nimaimalle created
From the source code, the version.sh file needed a few changes to make it work on Mac.
First, the method of deriving the patch number from git needs to trim leading white space.
Before:
git log --pretty=oneline 2>/dev/null | wc -l
After:
git log --pretty=oneline 2>/dev/null | wc -l | tr -d ' '
Second, echo -n is not portable across Linux versions. To achieve output without a newline, use printf instead.
Before:
echo -n "${VERSION}.${VERSION_PATCH}" |sed s/M//
After:
printf "${VERSION}.${VERSION_PATCH}"
I'm not sure what the sed s/M// was for, but maybe it had to do with the svn code path, which I did not test.
nimaimalle created
From this example in the docs: Name=Mike, Weight=64, Age=24, Pet=dog, Height=172
The sample shows accessing the fields like this, effectively casting certain values to integers: if ( integer($Weight) > integer($Height) - 100 )
However, I am using parse_kvp on data whose format I don't know in advance, and then converting the data into JSON. In JSON, all of the values are quoted, including numeric ones. I need the numeric fields to be unquoted in JSON.
Is there a way to iterate through all fields in an Exec statement? I could test for numeric values and reassign them, casting to integer or float (but I don't think there is a float type).
I also thought about transforming the JSON string with s/"([0-9.]+)"/$1/g, but this sort of regex is not yet supported in nxlog.
Any suggestions on how to take Name=Mike, Weight=64, Age=24, Pet=dog, Height=172.5 and get the following JSON without referencing the fields by name?
{"Name":"Mike","Weight":64,"Age":24,"Pet":"dog","Height":172.5}
(no quotes around numbers)
nimaimalle created