Amazon Linux AMI support
peter.wong@searchoptics.com created
I'm getting a segmentation fault when trying to run nxlog under Amazon Linux AMI 2015 or 2016.
My configuration file is OK, and nxlog was installed from nxlog-ce-2.9.1716-1_rhel6.x86_64.rpm:
[root@feeds ~]# /usr/bin/nxlog -v -c /etc/graylog/collector-sidecar/generated/nxlog.conf
2016-08-22 20:49:50 INFO configuration OK
[root@feeds ~]# /usr/bin/nxlog -f -c /etc/graylog/collector-sidecar/generated/nxlog.conf
2016-08-22 20:41:50 WARNING already running as gid 0
2016-08-22 20:41:50 WARNING already running as uid 0
file_remove : unexpected TOKEN_INTEGER - ce - 2.9.1716
karrakis created
I'm trying to remove log files older than 48 hours.
I read that I should use now() - seconds as the datetime, so I tried:
file_remove('filepath',now() - 172800) ;
file_remove(filepath, (now()-172800)) ;
Both failed with the message: nxlog.conf: syntax error, unexpected TOKEN_INTEGER.
The documentation specifies that datetime - integer returns a datetime, but when I check with
file_remove('filepath',now()) ;
I don't get the syntax error.
Then I used
file_remove('filepath', datetime(now()-172800)); and I also get the syntax error: unexpected TOKEN_INTEGER.
How can I set file_remove's 'older' parameter to 48 hours?
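For context, the call sits in a scheduled Exec block roughly like this (a sketch; the file path and schedule interval are placeholders, and the now() - 172800 expression is exactly the part that triggers the error):
<Extension fileop>
    Module  xm_fileop
    <Schedule>
        Every   1 hour
        Exec    file_remove('/var/log/myapp/*.log', (now() - 172800));
    </Schedule>
</Extension>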
Add a filter in nxlog
albamv created
Hello, I want to change the value of the syslog severity level depending on the content of the message.
Something like:
if the message contains the word INFO
then set syslog_severity_code=10
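Roughly the kind of block I have in mind (just a sketch; it assumes the input is parsed with xm_syslog so the $Message and $SyslogSeverityValue fields exist, and the file path is a placeholder):
<Extension syslog>
    Module  xm_syslog
</Extension>
<Input in>
    Module  im_file
    File    '/var/log/app.log'
    # parse the syslog header first, then adjust the severity when the text contains INFO
    # (6 = informational; the exact value here is just an example)
    Exec    parse_syslog(); if $Message =~ /INFO/ $SyslogSeverityValue = 6;
</Input>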
Making sure that conf file will send all logs to graylog
Selmack created
Hello,
I am rather new to nxlog and really enjoy the product so far. I want to make sure that ALL Windows events on a server are being sent to my Graylog server and that no logs are being omitted. This appears to be the default conf and it should work this way, but I just want to be extra sure. Thanks very much in advance.
## This is a sample configuration file. See the nxlog reference manual about the
## configuration options. It should be installed locally and is also available
## online at http://nxlog.org/docs/
## Please set the ROOT to the folder your nxlog was installed into,
## otherwise it will not start.
#define ROOT C:\Program Files\nxlog
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
#<Extension _syslog>
# Module xm_syslog
#</Extension>
<Extension gelf>
Module xm_gelf
</Extension>
<Input in>
Module im_msvistalog
# For windows 2003 and earlier use the following:
# Module im_mseventlog
</Input>
<Output out>
Module om_udp
Host 192.168.1.71
Port 12201
OutputType GELF
# Exec to_syslog_snare();
</Output>
<Route 1>
Path in => out
</Route>
Getting om_dbi to work with Oracle
Mikhail created
Hi all,
We use the CE version of NXLog and have run into some trouble:
parameters over 4000 characters are not inserted into the DB.
The query is like: INSERT INTO web_log (param) VALUES (TO_CLOB($param)).
The $param value is passed to the driver (the latest Oracle libdbi driver, 0.9.0) as a string parameter, which is bound as VARCHAR(4000).
How is this handled in the EE version?
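For reference, the output block looks roughly like this (a sketch; the connection options are placeholders, only the SQL line matches what we actually use):
<Output oracle>
    Module  om_dbi
    Driver  oracle
    SQL     INSERT INTO web_log (param) VALUES (TO_CLOB($param))
    Option  host 127.0.0.1
    Option  username weblog
    Option  password secret
    Option  dbname ORCL
</Output>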
Need to send nxlog-collected Windows events to Sensu client
eyang@cisco.com created
I have Windows event logs being collected by nxlog, and I need to send them to a Sensu client (using UDP port 3030) on the same machine. Do you have any experience with this?
Have you done anything similar? I just want to know how to configure nxlog and the Sensu client to make it work.
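What I have sketched out so far on the nxlog side (an untested sketch; it assumes the Sensu client socket accepts JSON on UDP 3030, and the exact payload format Sensu expects is the part I am unsure about):
<Extension json>
    Module  xm_json
</Extension>
<Input eventlog>
    Module  im_msvistalog
</Input>
<Output sensu>
    Module  om_udp
    Host    127.0.0.1
    Port    3030
    # serialize each event log record as JSON before sending it to the local Sensu client socket
    Exec    to_json();
</Output>
<Route r>
    Path    eventlog => sensu
</Route>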
GELF timestamp field missing millisecond precision
coffee-squirrel created
We have nxlog CE pushing to a GELF TCP input in Graylog, and the timestamp field received from nxlog appears to not have the milliseconds (i.e. it ends in ".000"), resulting in out-of-order messages in Graylog within a 1-second window. Other sources (Graylog Collectors, apps pushing directly, etc.) include the original millisecond value as expected. For Graylog inputs receiving nxlog messages we've had to set up an extractor to extract the timestamp from the message itself. Are there any options to keep millisecond precision with nxlog?
Nxlog.conf unable to read /parse Directory or File path
Nick79 created
Hello, I am using NXLog on Windows 2012 to get DNS logs forwarded to my syslog server. I have enabled DNS logging on the Windows server and can see that the dns.log file is being created under the C:\Windows\System32\DNS\ folder. However, my nxlog.conf is unable to read or parse its way to this directory. I have made sure the log file is dns.log and not dns.txt.
If I put the following in my nxlog.conf file, I get the error "WARNING input file does not exist: C:\Windows\System32\dns\dns.log":
<Input in>
Module im_file
File "C:\\Windows\\System32\\dns\\dns.log"
SavePos TRUE
InputType LineBased
</Input>
If I use the following File path in my nxlog.conf instead, I get the error "ERROR failed to open directory: C:\Windows\System32\dns: The system cannot find the path specified.":
<Input in>
Module im_file
File "C:\\Windows\\System32\\dns\\dns*"
SavePos TRUE
InputType LineBased
</Input>
Same thing even if I use single quotes and single backslashes; I get the same error: "WARNING input file does not exist: C:\Windows\System32\dns\dns.log":
<Input in>
Module im_file
File 'C:\Windows\System32\dns\dns.log'
SavePos TRUE
InputType LineBased
</Input>
Can someone please help? This is driving me crazy.
Pass the value of a variable between nxlog and a Perl script
toreno93 created
Hello!
I want to pass the value of a variable from NXLog to a Perl script, and then pass a variable back into NXLog after the script runs.
How do I do this?
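This is roughly how far I have got on the nxlog side (a sketch; the input file path and the process_event subroutine name are mine):
<Extension perl>
    Module      xm_perl
    PerlCode    /tmp/nxlog/Perl/perl.pl
</Extension>
<Input in>
    Module  im_file
    File    '/var/log/app.log'
    # hand each event to the Perl subroutine, which can read fields with
    # Log::Nxlog::get_field() and write values back with Log::Nxlog::set_field_string()
    Exec    perl_call("process_event");
</Input>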
Thank you.
the perl interpreter failed to parse /tmp/nxlog/Perl/perl.pl
toreno93 created
Hello.
Help me please, I am a beginner with NXLog. I am running NXLog on Unix with the Perl module and a script, perl.pl.
What does this error mean: ERROR the perl interpreter failed to parse /tmp/nxlog/Perl/perl.pl?
How do I fix it?
beginner
[patch] Stop to_syslog_ietf() from incorrectly escaping carriage return and newline characters
ron-macneil-ice created
Hi,
RFC5424 and all transports (except obsolete non-octet-counted TCP) can handle MSG containing ANY character including newlines and carriage returns.
In violation of the above, NxLog's to_syslog_ietf() function backslash-escapes these two characters. Furthermore, the escaping scheme is broken because it doesn't also escape the escape character itself (the backslash) so there's no way to reliably un-escape the MSG on the receiving end.
The correct behaviour is to stop escaping these characters altogether. In the rare case that someone needs to send multiline messages over non-octet-counted TCP, they can escape/unescape the $Message themselves using NxLog's replace() function.
Patch below.
RFC References:
https://tools.ietf.org/html/rfc5424#section-6.4
https://tools.ietf.org/html/rfc6587#section-3.4
Regards,
Ron MacNeil
--- src/modules/extension/syslog/syslog.c.orig 2014-07-19 23:52:06.000000000 +1000
+++ src/modules/extension/syslog/syslog.c 2016-07-26 14:01:57.296175500 +1000
@@ -1321,16 +1321,8 @@
     nx_syslog_add_structured_data(logdata);
     // Append message
-    i = (int) logdata->raw_event->len;
     nx_string_append(logdata->raw_event, " ", 1);
     nx_string_append(logdata->raw_event, msg.string->buf, (int) msg.string->len);
-    for ( ; i < (int) logdata->raw_event->len; i++ )
-    { // replace linebreaks with space
-        if ( (logdata->raw_event->buf[i] == '\n') || (logdata->raw_event->buf[i] == '\r') )
-        {
-            logdata->raw_event->buf[i] = ' ';
-        }
-    }
     if (tmpmsg != NULL)
     { // clean up temp copy
Same processor on multiple routes
Popote created
Hi,
I want to use buffers (disk and memory) before sending my data to a TCP syslog destination. For that, I created two processors (diskBuffer and memoryBuffer) that I use in a route: IN => diskBuffer => memoryBuffer => out.
When I try to create another route with one or more different processors, but which also uses these buffers (IN2 => P1 => P2 => diskBuffer => memoryBuffer => out), I get these errors in the log:
2016-07-23 13:28:51 ERROR cannot add processor module 'diskBuffer' to route 'XXX' because it is already added to route 'YYYY'
2016-07-23 13:28:51 ERROR cannot add processor module 'memoryBuffer' to route 'XXX' because it is already added to route 'YYYY'
This restriction is not really explained in the community documentation, which even suggests the opposite with the example given on page 18:
Example 4.14 Different routes
<Input in1>
Module im_null
</Input>
<Input in2>
Module im_null
</Input>
<Processor p1>
Module pm_null
</Processor>
<Processor p2>
Module pm_null
</Processor>
<Output out1>
Module om_null
</Output>
<Output out2>
Module om_null
</Output>
<Route 1>
# no processor modules
Path in1 => out1
</Route>
<Route 2>
# one processor module
Path in1 => p1 => out1
</Route>
<Route 3>
# multiple modules
Path in1, in2 => p1 => p2 => out1, out2
</Route>
With that example we get the same error: 2016-07-23 13:36:09 ERROR cannot add processor module 'p1' to route '3' because it is already added to route '2'.
Why is a processor limited to one route? Is it a bug or a mistake in the documentation?
I am using the latest version of nxlog-ce: v2.9.1716.
Best regards
Is there any way to enable serialization of underscore prefixed fields by to_json or xm_gelf
dls314 created
Hi,
I'm trying to fit my output into the GELF format, and I'd like to follow the specification's requirement that user-defined fields have underscore prefixes. From what I read in the nxlog docs, fields with an underscore prefix aren't preserved by xm_json or xm_gelf.
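What I am trying looks roughly like this (a sketch; _client_id is a made-up field name, and whether a leading underscore survives to_json() at all is exactly what I am asking):
<Extension json>
    Module  xm_json
</Extension>
<Input in>
    Module  im_file
    File    '/var/log/app.log'
    # set a GELF-style user field with an underscore prefix, then serialize the record
    Exec    $_client_id = "abc123"; to_json();
</Input>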
Is that true?
Is there any way around this?
Issues With "Multi-line message parser (xm_multiline)"
gmelasecca created
My company is looking to set up NXLog. We are having issues reading multiline exception logs from applications such as Tomcat, Java, Apache, etc. I am able to read in the files, but unfortunately each line shows up as a separate event in our Graylog application. I tried to implement the xm_multiline module, but I can't seem to get it to work.
I installed NXLog and changed my configuration to the one below, restarted the service, and let it run all night, but the output is still the same as shown below.
Sample Input Log:
07/07/2016 13:35:11.654 [tomcat-http--43] [ERROR] [4114723 ms] Warning - unprocessed rows in esolutions.care.assess.WeAssessment
esolutions.EsolutionsException: There were 83 unprocessed rows out of 84
at esolutions.base.WeObject.sleep(WeObject.java:2767)
at esolutions.base.WeObject.clear(WeObject.java:3250)
at esolutions.care.assess.WeAssessment.clear(WeAssessment.java:7699)
at esolutions.base.WeObject.close(WeObject.java:2815)
at esolutions.util.WeHTMLTable.getTableHTML(WeHTMLTable.java:541)
at esolutions.util.WeHTMLTable.toHTML(WeHTMLTable.java:508)
at org.apache.jsp.admin.client.cp_005fassessment_jsp._jspService(cp_005fassessment_jsp.java:4412)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
07/07/2016 13:36:21.828 [tomcat-http--26] [ERROR] [4184897 ms] Warning - unprocessed rows in esolutions.care.assess.WeAssessment
esolutions.EsolutionsException: There were 82 unprocessed rows out of 83
at esolutions.base.WeObject.sleep(WeObject.java:2767)
at esolutions.base.WeObject.clear(WeObject.java:3250)
at esolutions.care.assess.WeAssessment.clear(WeAssessment.java:7699)
at esolutions.base.WeObject.close(WeObject.java:2815)
at esolutions.util.WeHTMLTable.getTableHTML(WeHTMLTable.java:541)
at esolutions.util.WeHTMLTable.toHTML(WeHTMLTable.java:508)
at org.apache.jsp.admin.client.cp_005fassessment_jsp._jspService(cp_005fassessment_jsp.java:4412)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
Sample output from Graylog in CSV format. In the web UI, each event appears exactly as shown in the "message" column.
timestamp,source,EventReceivedTime,level,message,SourceModuleName,SourceModuleType
2016-07-19T21:27:08.000Z,GDPCCA02,07/19/16 17:27,6,"2016/07/19 17:27:08.032 | srvmain | INFO | 07/19/2016 17:27:08",pcc-wrapper-log,im_file
2016-07-19T21:27:08.000Z,GDPCCA02,07/19/16 17:27,6,"2016/07/19 17:27:08.032 | srvmain | INFO | java.lang.NumberFor",pcc-wrapper-log,im_file
2016-07-19T21:27:08.000Z,GDPCCA02,07/19/16 17:27,6,"2016/07/19 17:27:08.032 | srvmain | INFO | at com.pointclickc",pcc-wrapper-log,im_file
2016-07-19T21:27:08.000Z,GDPCCA02,07/19/16 17:27,6,"2016/07/19 17:27:08.032 | srvmain | INFO | at org.apache.cata",pcc-wrapper-log,im_file
Configuration File. I tried multiple regular expressions with no success.
## This is a sample configuration file. See the nxlog reference manual about the
## configuration options. It should be installed locally and is also available
## online at http://nxlog.org/docs/
## Please set the ROOT to the folder your nxlog was installed into,
## otherwise it will not start.
#define ROOT C:\Program Files\nxlog
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension gelf>
Module xm_gelf
</Extension>
<Extension fileop>
Module xm_fileop
</Extension>
<Extension multiline>
Module xm_multiline
HeaderLine /^\d{0,2}\/\d{0,2}\/\d{0,4}/
# HeaderLine '^\d{0,2}\/\d{0,2}\/\d{0,4}\ \d{0,3}\:\d{0,3}\:\d{0,3}\.\d{0,4}\ \['
</Extension>
<Input pcc-wrapper-log>
Module im_file
File "C:\\pivotal-tc-server-standard-3.1.0.RELEASE\\pccweb\\logs\\wrapper.log"
SavePos TRUE
InputType multiline
</Input>
<Input pcc-mdstrace-log>
Module im_file
File "C:\\pivotal-tc-server-standard-3.1.0.RELEASE\\pccweb\\logs\\mdstrace.log"
SavePos TRUE
InputType multiline
</Input>
<Input pcc-exceptionHidingUtil-log>
Module im_file
File "C:\\pivotal-tc-server-standard-3.1.0.RELEASE\\pccweb\\logs\\exceptionHidingUtil.log"
SavePos TRUE
InputType multiline
</Input>
<Input pcc-esolutions-log>
Module im_file
File "C:\\pivotal-tc-server-standard-3.1.0.RELEASE\\pccweb\\logs\\esolutions.log"
SavePos TRUE
InputType multiline
</Input>
#<Input pcc-localHostAccess-log>
# Module im_file
# File "C:\\pivotal-tc-server-standard-3.1.0.RELEASE\\pccweb\\logs\\localhost_access_log.*"
# SavePos TRUE
# InputType multiline
#</Input>
<Output graylog>
Module om_udp
Host graylog.genesishcc.com
Port 12201
OutputType GELF
</Output>
<Route PCC>
Path pcc-wrapper-log => pcc-mdstrace-log => pcc-exceptionHidingUtil-log => pcc-esolutions-log => graylog
## Path pcc-wrapper-log => pcc-mdstrace-log => pcc-exceptionHidingUtil-log => pcc-esolutions-log => pcc-localHostAccess-log => graylog
</Route>
Collect events from the database using time-based tracking (not Id)
toreno93 created
Hello,
Help me please.
I want to collect events from a database using time-based tracking rather than by Id.
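My current input is roughly this (a sketch based on the im_dbi example in the manual; the driver, connection options, table and column names are placeholders):
<Input dbi>
    Module  im_dbi
    Driver  mysql
    Option  host 127.0.0.1
    Option  username logreader
    Option  password secret
    Option  dbname logdb
    # im_dbi tracks its position by the id column; I want it to track by a timestamp instead
    SQL     SELECT id, facility, severity, hostname, timestamp, application, message FROM log
</Input>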
What can I do?
Thank you.
open source items
enghouse created
Team,
As part of an assessment, I need to provide the list of open-source items nxlog uses and their licensing information, so that our software legal team can do its assessment. Is there any link or documentation available which explains all the open-source items currently used in nxlog?
Thanks,
Imran
Use community edition with ElasticSearch
ehsanTC created
Hi all
I would be glad to know: is it possible to use the NXLog community edition with Elasticsearch?
In the documentation I have read that the om_elasticsearch module is needed, but it exists in the Enterprise Edition only.
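The kind of workaround I was wondering about (a rough sketch only, assuming om_http is available in the community edition; the host, index and type in the URL are placeholders):
<Extension json>
    Module  xm_json
</Extension>
<Output es>
    Module      om_http
    URL         http://127.0.0.1:9200/nxlog/event
    ContentType application/json
    # send each record as a JSON document to the Elasticsearch index API
    Exec        to_json();
</Output>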
Thanks in advance.
Split input to multiple outputs based on content
Preston.Taylor created
How can I select some messages from a single source for one output and other messages for another output, based on the syslog content? I'm using the community edition; I have RTFMed but haven't found anything describing how to do this. I've tried using the Route block to send to multiple outputs and then using drop() inside <Exec> tags in each output, but it doesn't seem to work and I end up with the same messages in both outputs.
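Roughly what I tried (a sketch; the match pattern, hosts and module names here are mine, not from the real config):
<Output out_a>
    Module  om_udp
    Host    10.0.0.1
    Port    514
    # keep only messages matching the pattern in this output
    Exec    if $raw_event !~ /foo/ drop();
</Output>
<Output out_b>
    Module  om_udp
    Host    10.0.0.2
    Port    514
    # keep everything else in this output
    Exec    if $raw_event =~ /foo/ drop();
</Output>
<Route r>
    Path    in => out_a, out_b
</Route>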
xm_perl with nested fields
zz created
Hi, I'm trying to add some info to my logs via xm_perl before sending them to Elasticsearch (in JSON format). It would be nice to add some fields from my Perl code in a nested way. Is it possible to use something like set_field_XXX($event, "myAddedfield.myAddedSubfield", "value")?
In the end, I want to create nested fields inside my JSON object.
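The pipeline I have in mind, roughly (a sketch; the script path, the enrich subroutine and the om_http transport are placeholders, not my real config):
<Extension perl>
    Module      xm_perl
    PerlCode    /opt/nxlog/enrich.pl
</Extension>
<Extension json>
    Module      xm_json
</Extension>
<Output es>
    Module      om_http
    URL         http://127.0.0.1:9200/logs/event
    ContentType application/json
    # enrich() is where I would like to add nested fields before the record is serialized
    Exec        perl_call("enrich"); to_json();
</Output>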
Thanks.