Ask questions. Get answers. Find technical product solutions from passionate experts in the NXLog community.
NXLog handling a big number of files
alexandru.enciu created
Hello,
I have an application that logs some API requests and responses. Each request is logged to a different file, as a single line. There are thousands of files in the system, and NXLog seems to have issues sending the logs to Elasticsearch. It reads the files (I can see the im_file_add_file entries in the logs), but it takes a long time to actually send the messages.
How does NXLog process multiple files in a single directory?
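For reference, a minimal im_file sketch for a directory holding thousands of single-line files; the path, instance name and numbers are illustrative, and ActiveFiles (the number of files polled concurrently, which defaults to 10) is usually the first directive to look at in this situation:

<Input api_requests>
    Module          im_file
    # Wildcard so every per-request file in the directory is picked up
    File            "C:\\app\\logs\\requests\\*.log"
    # How many matching files are watched at the same time; the default
    # of 10 is far too low when thousands of files are live at once
    ActiveFiles     5000
    PollInterval    1
</Input>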
Send DHCP and DC logs to a syslog-ng server
Ezein created
Hello,
Do you know if it's possible to send DHCP and DC logs to a syslog-ng server?
I don't know how to specify the facilities for the Windows logs.
Thanks in advance.
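For reference, a minimal sketch of one way to do this with im_msvistalog and xm_syslog; the facility, host, port and instance names are placeholders, and the DHCP server audit logs (plain text files) would need their own im_file input:

<Extension _syslog>
    Module  xm_syslog
</Extension>

<Input eventlog>
    Module  im_msvistalog
    # Pick a facility for the Windows logs; local4 is only an example
    Exec    $SyslogFacilityValue = syslog_facility_value("local4");
</Input>

<Output syslogng>
    Module  om_tcp
    # Address of the syslog-ng server (placeholder)
    Host    10.0.0.10
    Port    514
    Exec    to_syslog_bsd();
</Output>

<Route r>
    Path    eventlog => syslogng
</Route>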
NxLog not finding any modules
Deleted user created
Hello,
I am running a trial version of EE, but when I try to start NXLog, I get errors saying it cannot find the modules.
Here is my conf file. I have verified that nxlog is installed at C:\Program Files\nxlog.
## This is a sample configuration file. See the nxlog reference manual about the
## configuration options. It should be installed locally and is also available
## online at http://nxlog.org/docs/

## Please set the ROOT to the folder your nxlog was installed into,
## otherwise it will not start.
define ROOT C:\Program Files\nxlog
#define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
#LogLevel DEBUG
<Extension json>
Module xm_json
</Extension>
<Input in>
# For Windows 2008 and later
Module im_msvistalog
# For Windows 2003 and earlier
#Module im_mseventlog
File "c:\\documents and settings\\administrator\\desktop\\events\\app.evtx"
Exec to_json();
</Input>
<Output out>
Module om_tcp
Host localhost
Port 5013
</Output>
<Route 1>
Path in => out
</Route>
Error logs
2018-10-12 13:51:24 ERROR Failed to load module from C:\Program Files\nxlog\modules\input\im_msvistalog.dll, The specified module could not be found. ; The specified module could not be found.
2018-10-12 13:51:24 WARNING no functional input modules!
2018-10-12 13:51:24 ERROR module 'in' is not declared at C:\Program Files\nxlog\conf\nxlog.conf:42
2018-10-12 13:51:24 ERROR route 1 is not functional without input modules, ignored at C:\Program Files\nxlog\conf\nxlog.conf:42
2018-10-12 13:51:24 INFO nxlog-4.1.4046-trial started
2018-10-12 13:51:24 WARNING not starting unused module out
Linux rsyslogd SSL to nxlog errno=9 is reported even with Digital Signature flag omitted
comoalt created
Hello,
I am setting up an SSL connection between rsyslog on a Linux box and an NXLog endpoint. While the Windows boxes connect like a charm, the Linux boxes produce the following:
2018-10-12 11:51:26 ERROR remote ssl socket was reset? (SSL_ERROR_SSL with errno=9); End of file found
I then found this post on your forum, https://nxlog.co/question/1926/nxlog-ce-v291716-certificate-built-ecdsa-key, where they talk about rebuilding the certificate without the Digital Signature KeyUsage flag.
I assumed I had to rebuild client.csr, since my rootCA.crt does not report any Digital Signature:
X509v3 extensions:
X509v3 Subject Key Identifier:
AB:E6:E4:61:11:89:43:21:87:FB:91:08:44:C0:15:A7:41:3B:A3:53
X509v3 Authority Key Identifier:
keyid:AB:E6:E4:61:11:89:43:21:87:FB:91:08:44:C0:15:A7:41:3B:A3:53
DirName:/C=US/ST=Some-State/L=Somecity/O=CompanyName/OU=Organizational Unit Name (eg, section)/CN=Common Name (e.g. server FQDN or YOUR name)/emailAddress=Email Address
serial:AF:06:5F:4B:97:ED:81:90
X509v3 Basic Constraints:
CA:TRUE
X509v3 Key Usage:
Certificate Sign, CRL Sign
I built a new client.csr without any trace of X509v3 extensions, but I always get the same error message.
Any help is well appreciated. Thanks
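In case it helps to isolate the problem, here is a bare im_ssl listener sketch with verification relaxed, so certificate-trust issues can be told apart from protocol or framing issues on the rsyslog side; the port and certificate paths are placeholders and %CERTDIR% is assumed to be defined elsewhere:

<Input ssl_in>
    Module          im_ssl
    Host            0.0.0.0
    Port            6514
    CAFile          %CERTDIR%/rootCA.crt
    CertFile        %CERTDIR%/server.crt
    CertKeyFile     %CERTDIR%/server.key
    # Temporarily accept any client certificate while debugging
    AllowUntrusted  TRUE
    RequireCert     FALSE
</Input>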
Unable to get multiline working
romainp created
Hi guys!
I really hope someone can help, because I think I have tested everything I could think of to make this work...
Ok, so we have these logs:
'[2018-10-11T12:06:47,434][DEBUG][o.e.a.s.TransportSearchAction] [master01] [245674] Failed to execute fetch phase
org.elasticsearch.transport.RemoteTransportException: [hot08][10.10.30.168:9300][indices:data/read/search[phase/fetch/id]]
Caused by: org.elasticsearch.search.SearchContextMissingException: No search context found for id [245674]
at org.elasticsearch.search.SearchService.findContext(SearchService.java:520) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.search.SearchService.executeFetchPhase(SearchService.java:487) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.action.search.SearchTransportService$11.messageReceived(SearchTransportService.java:440) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.action.search.SearchTransportService$11.messageReceived(SearchTransportService.java:437) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$1.doRun(SecurityServerTransportInterceptor.java:258) ~[?:?]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.common.util.concurrent.EsExecutors$1.execute(EsExecutors.java:135) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.lambda$messageReceived$0(SecurityServerTransportInterceptor.java:307) ~[?:?]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.lambda$inbound$2(ServerTransportFilter.java:166) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.maybeRun(AuthorizationUtils.java:183) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.setRunAsRoles(AuthorizationUtils.java:177) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.authorize(AuthorizationUtils.java:165) ~[?:?]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.lambda$inbound$3(ServerTransportFilter.java:168) ~[?:?]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lambda$authenticateAsync$2(AuthenticationService.java:184) ~[?:?]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lambda$lookForExistingAuthentication$4(AuthenticationService.java:217) ~[?:?]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lookForExistingAuthentication(AuthenticationService.java:228) ~[?:?]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.authenticateAsync(AuthenticationService.java:182) ~[?:?]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.access$000(AuthenticationService.java:143) ~[?:?]
at org.elasticsearch.xpack.security.authc.AuthenticationService.authenticate(AuthenticationService.java:113) ~[?:?]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.inbound(ServerTransportFilter.java:142) ~[?:?]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.messageReceived(SecurityServerTransportInterceptor.java:314) ~[?:?]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.transport.TcpTransport$RequestHandler.doRun(TcpTransport.java:1555) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:672) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:41) ~[elasticsearch-6.2.4.jar:6.2.4]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-6.2.4.jar:6.2.4]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_181]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]'
I really only need the first 3 lines and, if possible, to merge them into a single line at the end.
I have tried the following config:
<Extension multi>
Module xm_multiline
HeaderLine /^[\d{0,4}-\d{0,2}-\d{0,2}\D\d\d{0,2}:\d{0,2}:\d{0,2}\D\d{0,4}]*/
EndLine /^.+(at)/
</Extension>
<Input elastic-log>
InputType multi
Module im_file
File "/var/log/elasticsearch/mega.log.test"
</Input>
<Output file>
Module om_file
File '/tmp/output'
</Output>
But the output file keeps giving me all the lines, instead of the first 3 that I expect...
I have tested my regular expressions and I know they are working, so... why can't I get my first 3 lines!!!!???? :)
Any help will be very appreciated.
R.
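For comparison, a sketch of a header pattern matching these Elasticsearch log lines; note that [ has to be escaped, otherwise it opens a character class, and the Exec that trims each assembled record to its first three lines is only illustrative:

<Extension es_multiline>
    Module      xm_multiline
    # A new record starts with "[2018-10-11T12:06:47,434][DEBUG]..."
    HeaderLine  /^\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2},\d{3}\]\[/
</Extension>

<Input elastic_log>
    Module      im_file
    File        "/var/log/elasticsearch/mega.log.test"
    InputType   es_multiline
    # Keep only the first three lines of each multi-line record
    Exec        if $raw_event =~ /^((?:[^\n]*\n){0,2}[^\n]*)/ $raw_event = $1;
</Input>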
Drop Win Event message based on text file content
habrosec created
I'm attempting to use NXLog (community edition at the moment) to read Active Directory logs and output them to syslog/JSON. I have a text file (one username per line) that I need to compare against the username in the Windows event logs from AD. I need to drop a message when the username in the Windows AD event log matches a username in that text file.
I've spent quite a bit of time googling and reading documentation and haven't found a method to achieve this. Can anyone assist?
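One approach worth sketching, with the caveat that (as far as I know) the right-hand side of =~ in the community edition must be a constant regular expression: embed the usernames from the text file as an alternation in the drop rule, and regenerate that line with a small script whenever the file changes. The field name $TargetUserName and the usernames below are illustrative:

<Input eventlog>
    Module  im_msvistalog
    # Drop AD events whose username is on the block list
    Exec    if defined($TargetUserName) and \
               $TargetUserName =~ /^(alice|bob|svc_backup)$/i drop();
</Input>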
Apache log with custom log
comoalt created
Hello guys,
I am setting up the NXLog service in our network and I am focusing on getting NXLog to work with an Apache custom log.
Since Apache sits at the end of a chain of reverse proxies, the only way to keep the forwarders' IP addresses is to use an Apache conditional header (X-Forwarded-For) to switch between the combined and a custom log format (as explained in detail here: http://www.techstacks.com/howto/log-client-ip-and-xforwardedfor-ip-in-apache.html).
The Apache log variable %h is replaced by %{X-Forwarded-For}i, which is a string that is either empty or contains one or more IP addresses.
In this specific case, when NXLog hits a custom log line, a parse_kvp error is reported.
My Apache conf includes:
LogFormat "%h %l %u %t "%r" %>s %b "%{Referer}i" "%{User-Agent}i"" combined
LogFormat "%{X-Forwarded-For}i %l %u %t "%r" %>s %b "%{Referer}i" "%{User-Agent}i"" proxy
SetEnvIf X-Forwarded-For "^......." forwarded
CustomLog "logs/access.log" combined env=!forwarded
CustomLog "logs/access.log" proxy env=forwarded
On the net I was only able to find working examples for the Apache common log format. My config:
define ROOT C:\\Program Files (x86)\\nxlog
define ROOT_STRING C:\\Program Files (x86)\\nxlog
define CERTDIR %ROOT%\\cert
Moduledir %ROOT%\\modules
CacheDir %ROOT%\\data
Pidfile %ROOT%\\data\\nxlog.pid
SpoolDir %ROOT%\\data
LogFile %ROOT%\\data\\nxlog.log
<Extension fileop>
Module xm_fileop
</Extension>

<Extension json>
Module xm_json
</Extension>

<Extension syslog>
Module xm_syslog
</Extension>

<Extension exec>
Module xm_exec
</Extension>
## Create the parse rule for IIS logs. You can copy these from the header of the IIS log file.
<Extension w3c>
Module xm_csv
Fields $date, $time, $s-ip, $cs-method, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $csUser-Agent, $csReferer, $sc-status, $sc-substatus, $sc-win32-status, $time-taken
FieldTypes string, string, string, string, string, string, string, string, string, string, string, string, string, string, string
Delimiter ' '
UndefValue -
</Extension>
## On this machine with an Apache frontend we use the kvp extension instead of w3c
<Extension kvp>
Module xm_kvp
KVPDelimiter &
KVDelimiter =
</Extension>

<Extension kvp2>
Module xm_kvp
KVPDelimiter ;
KVDelimiter =
#QuoteMethod None
</Extension>

<Input Apache>
Module im_file
File "C:\Apache_install\httpd-2.4.25-win64-VC14\Apache24\logs\access.log"
Exec if $raw_event =~ /^(\S+) (\S+) (\S+) \[([^\]]+)\] \"(\S+) (.+) HTTP.\d\.\d\" (\d+) (\d+) \"([^\"]+)\" \"([^\"]+)\"/\
{ \
$Hostname = $1; \
if $3 != '-' $AccountName = $3; \
$EventTime = parsedate($4); \
$HTTPMethod = $5; \
$HTTPURL = $6; \
$HTTPResponseStatus = $7; \
$FileSize = $8; \
$HTTPReferer = $9; \
$HTTPUserAgent = $10; \
}
#Exec if $raw_event =~ /^(\S+) (\S+) (\S+) \[([^\]]+)\] \"(\S+) (.+) HTTP.\d\.\d\" (\d+) (\d+) \"([^\"]+)\" \"([^\"]+)\"/\
# { \
# $Hostname = $1; \
# if $3 != '-' $AccountName = $3; \
# $EventTime = parsedate($4); \
# $HTTPMethod = $5; \
# $HTTPURL = $6; \
# $HTTPResponseStatus = $7; \
# $FileSize = $8; \
# $HTTPReferer = $9; \
# $HTTPUserAgent = $10; \
# if $HTTPURL =~ /\?(.+)/ { $HTTPParams = $1; } \
# kvp->parse_kvp($HTTPParams); \
# delete($EventReceivedTime); \
# kvp2->to_kvp(); \
# }
</Input>

<Input internal>
Module im_internal
Exec $Message = to_json();
</Input>

# Windows Event Log
<Input eventlog>
Module im_msvistalog
# Query to reduce the Event Log volume. QueryXML is used instead of the Query directive.
# Comments inside the query must be written as XML: <!-- string -->
<QueryXML>
  <QueryList>
    <Query Id="0">
      <!-- Select -->
      <Select Path="Application">*[System[(Level=1 or Level=2 or Level=3)]]</Select>
      <Select Path="Security">*[System[(Level=1 or Level=2 or Level=3)]]</Select>
      <Select Path="System">*[System[(Level=1 or Level=2 or Level=3)]]</Select>
      <Select Path="ForwardedEvents">*</Select>
      <Select Path="Setup">*</Select>
      <Select Path="HardwareEvents">*</Select>
      <Select Path="Microsoft-Windows-PowerShell/Operational">*[System[(Level=1 or Level=2 or Level=3)]]</Select>
      <Select Path="Microsoft-Windows-TaskScheduler/Operational">*[System[(Level=1 or Level=2 or Level=3)]]</Select>
      <!-- Suppress -->
      <Suppress Path="Security">*[System[(EventID=4689 or EventID=5158 or EventID=5440 or EventID=5444)]]</Suppress>
      <Suppress Path="Windows PowerShell">*[System[(EventID=501 or EventID=400 or EventID=600)]]</Suppress>
    </Query>
  </QueryList>
</QueryXML>
Exec $EventReceivedTime = integer($EventReceivedTime) / 1000000;
Exec to_json();
</Input>

# 100 MB disk buffer
<Processor buffer>
Module pm_buffer
MaxSize 102400
Type disk
</Processor>
# RFC 5424 as described at https://www.scip.ch/en/?labs.20141106
<Processor rfc5424>
Module pm_transformer
Exec $Hostname = hostname();
Outputformat syslog_rfc5424
</Processor>

<Output ssl_out>
Module om_ssl
Host IP.IP.IP.IP
Port 443
CAFile %CERTDIR%/nxlog_rootCA.crt
CertFile %CERTDIR%/client.crt
CertKeyFile %CERTDIR%/client.key
KeyPass secret
AllowUntrusted TRUE
OutputType Binary
Exec to_syslog_ietf();
# Remove CR, LF and TAB characters - doing this server-side in om_file does not work
Exec $raw_event =~ s/(\t|\r|\n)//g; $raw_event = replace($raw_event, '{', '[" "] {', 1);
#tag windows
Exec $raw_event =~ s/(\[.*])//g; $raw_event = replace($raw_event, '{', '[tag="windows"] {', 1);
#Use the following line for debugging (uncomment the fileop extension above as well)
#Exec file_write("C:\\Program Files (x86)\\nxlog\\data\\nxlog_output.log", $raw_event);
</Output>

<Route>
Path Apache, internal, eventlog => rfc5424 => buffer => ssl_out
</Route>
Is there any NXLog configuration that works with this kind of Apache custom log?
Thanks in advance
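For what it's worth, a sketch of a parse rule adjusted for the proxy format, where the first field may be empty or contain several comma-separated addresses; the instance and field names are arbitrary and the regex is untested against every edge case:

<Input apache_proxy>
Module im_file
File "C:\\Apache_install\\httpd-2.4.25-win64-VC14\\Apache24\\logs\\access.log"
Exec if $raw_event =~ /^(.*?) (\S+) (\S+) \[([^\]]+)\] \"(\S+) (.+) HTTP.\d\.\d\" (\d+) (\S+) \"([^\"]*)\" \"([^\"]*)\"/ \
     { \
         $XForwardedFor = $1; \
         if $3 != '-' $AccountName = $3; \
         $EventTime = parsedate($4); \
         $HTTPMethod = $5; \
         $HTTPURL = $6; \
         $HTTPResponseStatus = $7; \
         $FileSize = $8; \
         $HTTPReferer = $9; \
         $HTTPUserAgent = $10; \
     }
</Input>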
Adding the NXLog version to syslog
aleksandrc created
Hi everyone,
I've been searching this forum and the web, but I can't find whether there is a way to make NXLog include its own version in the syslog messages it forwards.
Thanks!
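As far as I can tell there is no built-in function that exposes the agent's own version, so one workaround to sketch is hard-coding it with a define and attaching it to every record (it then travels inside the JSON/syslog payload); the value and field name are placeholders:

define AGENT_VERSION 2.10.2102

<Input eventlog>
    Module  im_msvistalog
    # %AGENT_VERSION% is substituted as text when the config is parsed
    Exec    $NxlogVersion = '%AGENT_VERSION%';
</Input>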
Error: incorrect header check
hatula created
Hi everybody,
I have a very mysterious case and I need your help, please:
I have an NXLog server running nxlog-3.2.2016-1.x86_64.
I installed an update to the latest version of the NXLog server (nxlog-4.1.4016-1.x86_64).
The configuration for the clients was not changed.
In the NXLog server's log file I see many errors, and NXLog does not save the data:
2018-09-25 10:50:32 ERROR zlib decompression error, data error (Z_DATA_ERROR) incorrect header check
If I downgrade to the previous version, the problem goes away and everything is OK.
What happened?! Please help me.
Assistance required in log file ingestion
navdeepsingh83 created
Hi,
We have the following log file from an open-source password manager solution. It runs on Tomcat. We have a Graylog server where we would like to send the log data and parse it. We can send the log file to Graylog, however the entire line arrives as one message block instead of being parsed into fields automatically. I am wondering how I can convert the file into CSV and send it to Graylog.
Here is a sample log. It doesn't come with any header.
2018-08-25T07:40:14Z, ERROR, http.PwmResponse, {117412} 5028 ERROR_BAD_SESSION (client unable to reply with session key) [xx.xx.47.82]
2018-08-25T07:40:15Z, ERROR, filter.SessionFilter, {117413} 5028 ERROR_BAD_SESSION (client unable to reply with session key) [xx.xx.47.82]
2018-08-25T07:40:15Z, ERROR, http.PwmResponse, {117413} 5028 ERROR_BAD_SESSION (client unable to reply with session key) [xx.xx.47.82]
2018-08-25T07:40:17Z, ERROR, filter.SessionFilter, {117415} 5028 ERROR_BAD_SESSION (client unable to reply with session key) [xx.xx.47.82]
2018-08-25T07:40:17Z, ERROR, http.PwmResponse, {117415} 5028 ERROR_BAD_SESSION (client unable to reply with session key) [xx.xx.47.82]
2018-08-25T10:04:28Z, ERROR, filter.RequestInitializationFilter, {117422} 5063 ERROR_SECURITY_VIOLATION (current network address 'yy.yy.185.123' has changed from original network address 'yy.yy.173.181') [yy.yy.173.181]
2018-08-25T10:04:28Z, ERROR, http.PwmResponse, {117422} 5063 ERROR_SECURITY_VIOLATION (current network address 'yy.yy.185.123' has changed from original network address 'yy.yy.173.181') [yy.yy.173.181]
2018-08-25T11:08:03Z, INFO , auth.LDAPAuthenticationRequest, {117467} authID=130, successful ldap authentication for UserIdentity{"userDN":"CN=UserA,CN=Users,DC=org,DC=com","ldapProfile":"default"} (606ms) type: AUTHENTICATED, using strategy BIND, using proxy connection: false, returning bind dn: CN=UserA,CN=Users,DC=org,DC=com [yy.yy.32.238]
2018-08-25T11:08:03Z, INFO , event.AuditService, audit event: {"perpetratorID":"UserA","perpetratorDN":"CN=UserA,CN=Users,DC=org,DC=com","perpetratorLdapProfile":"default","sourceAddress":"yy.yy.32.238","sourceHost":"yy.yy.32.238","type":"USER","eventCode":"AUTHENTICATE","guid":"941aa151-8998-4c89-b690-484e623429d8","timestamp":"2018-08-25T05:38:03Z","message":"type=AUTHENTICATED, source=LOGIN_FORM","narrative":"UserA (CN=UserA,CN=Users,DC=org,DC=com) has authenticated","xdasTaxonomy":"XDAS_AE_AUTHENTICATE_ACCOUNT","xdasOutcome":"XDAS_OUT_SUCCESS"}
2018-08-25T11:08:48Z, INFO , operations.PasswordUtility, {117467,UserA} user 'UserIdentity{"userDN":"CN=UserA,CN=Users,DC=org,DC=com","ldapProfile":"default"}' successfully changed password [yy.yy.32.238]
2018-08-25T11:08:49Z, INFO , event.AuditService, audit event: {"perpetratorID":"UserA","perpetratorDN":"CN=UserA,CN=Users,DC=org,DC=com","perpetratorLdapProfile":"default","sourceAddress":"yy.yy.32.238","sourceHost":"yy.yy.32.238","type":"USER","eventCode":"CHANGE_PASSWORD","guid":"00c158d5-0ea5-46aa-8c8c-cd279f783ecd","timestamp":"2018-08-25T05:38:49Z","narrative":"UserA (CN=UserA,CN=Users,DC=org,DC=com) has changed their password","xdasTaxonomy":"XDAS_AE_SET_CRED_ACCOUNT","xdasOutcome":"XDAS_OUT_SUCCESS"}
2018-08-25T11:10:04Z, ERROR, filter.RequestInitializationFilter, {117471} 5063 ERROR_SECURITY_VIOLATION (current network address 'yy.yy.112.147' has changed from original network address 'xx.xx.243.3') [xx.xx.243.3]
I wrote the following NXLog conf, but it doesn't seem to be working.
<Extension tomcat>
Module xm_csv
Fields $DateTime,$Type,$Category,$Details
FieldTypes string,string,string,string
Delimiter ","
</Extension>
<Input in_pwm>
Module im_file
File "C:\\Users\\Documents\\TempOut\\PWM\\PWM.log"
PollInterval 1
ReadFromLast False
#Recursive True
SavePos False
Exec tomcat->parse_csv();
</Input>
Appreciate your assistance in getting this working.
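As a sketch of an alternative to xm_csv (which also splits on the commas inside the message text), a regex that only breaks on the first three separators and then serializes the record as JSON; the field names are arbitrary:

<Extension _json>
    Module  xm_json
</Extension>

<Input in_pwm>
    Module          im_file
    File            "C:\\Users\\Documents\\TempOut\\PWM\\PWM.log"
    PollInterval    1
    ReadFromLast    False
    SavePos         False
    # Everything after the third comma stays in $Details, commas included
    Exec    if $raw_event =~ /^(\S+), (\S+)\s*, (\S+), (.+)$/ \
            { \
                $DateTime = $1; \
                $Type = $2; \
                $Category = $3; \
                $Details = $4; \
                to_json(); \
            }
</Input>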
Does NXLog just read-access the logfile, or ...? [Win2012]
JanVerhaag created
Hi all,
I've set up nxlog (4.1.4016) to monitor a logfile that is being written to constantly.
For some reason, when I start nxlog, the program that creates the log lines no longer adds anything to the existing log.
A simple command line 'echo logline >> thelog.txt' does add the line to the logfile (and it is processed by nxlog), but the other logging is not added.
As soon as I stop the NXLog service, the log is modified again.
NXLog is running as SYSTEM, the program is running as a normal user.
Any suggestions for troubleshooting would be welcome, as I have no clue what is happening.
NXLog as a collector for Azure App Service Logs for SIEMS
EdB created
Hi all,
I am new here, so hello.
I am trying to work out a solution to collect IIS access log data from Azure App Services and then forward it to a SIEM such as Splunk, Loggly or Elasticsearch for security analysis, anomaly identification and alerting.
As far as I can see, NXLog may provide the link between the Azure access logs and my chosen SIEM. Am I right? I would prefer not to get into rewriting code, hence my interest in NXLog. In addition, NXLog appears to be widely supported by SIEM tools.
With regards to implementation, would I be correct in thinking that one needs to set up a Linux or Windows VM on Azure with NXLog running on it? I am trying to avoid on-prem installs.
In conclusion, I would welcome your advice on how I could use NXLog as a simple collector and forwarder of access data. One final word: the access logs are currently stored in Azure Storage blobs, although I could go back to storing them in the file system.
Thanks.
im_dbi: is it working?
iCirco created
Hi,
Does anybody have experience with im_dbi?
I tried this example, but /tmp/output is filled with blank characters.
I checked the nxlog log at startup and everything is OK.
The mysql driver has been installed correctly.
<Input dbi>
Module im_dbi
Driver mysql
Option host 127.0.0.1
Option username mysql
Option password mysql
Option dbname logdb
SQL SELECT id, facility, severity, hostname,
timestamp, application, message
FROM log
</Input>
<Output file>
Module om_file
File "tmp/output"
</Output>
<Route dbi_to_file>
Path dbi => file
</Route>
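Not a confirmed fix, but one guess worth sketching: im_dbi returns the selected columns as structured fields rather than a ready-made $raw_event, so writing straight to om_file can produce empty lines. Serializing the record first (here with xm_json, which must be loaded) would make the output readable:

<Extension _json>
Module xm_json
</Extension>

<Output file>
Module om_file
File "/tmp/output"
# Turn the fields returned by the SELECT into a JSON line before writing
Exec to_json();
</Output>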
Add information from one event to another.
DDGH created
Hello!
I've been fighting with this for a week, but I've run out of ideas.
When you delete a file, Windows generates 2 events: 4663, then 4660.
EventID 4663 contains the file name, EventID 4660 contains the result.
As a marker we can use the EventRecordID, which differs by 1 between these two events.
The idea is to use pm_evcorr to add a field from EventID 4660 to EventID 4663.
As far as I understood, the logic should be this:
EventID 4663 arrives.
If EventID 4660 arrives within 2 seconds and its EventRecordID is greater by 1, then
we copy the ObjectName from event 4663 into event 4660.
The user guide tells us that the construct should be of the form:
<Pair>
# If TriggerCondition is true, wait Interval seconds for
# RequiredCondition to be true and then do the Exec. If Interval is
# 0, there is no window on matching.
TriggerCondition $Message =~ /^pair-first/
RequiredCondition $Message =~ /^pair-second/
Interval 30
Exec $raw_event = "got pair";
</Pair>
And
Exec $new_field = 'new field value';
But the problem is that something (or rather everything) about my attempt is clearly wrong:
<Pair>
# If TriggerCondition is true, wait Interval seconds for
# RequiredCondition to be true and then do the Exec. If Interval is
# 0, there is no window on matching.
TriggerCondition $EventID =4663
RequiredCondition $EventID =4660 and $EventRecordID = get_prev_event_data("EventRecordID" + 1); - Here the main problem
Interval 2
Exec $FileName = get_prev_event_data("ObjectName");
</Pair>
I would be very grateful for any help, hints on what to read, or examples.
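For what it's worth, a sketch of a different angle that avoids pm_evcorr entirely: since both events come from the same im_msvistalog instance, the 4663 data can be stashed in module variables and picked up when the matching 4660 arrives. The variable names are made up and this is a sketch, not a verified configuration:

<Input eventlog>
    Module  im_msvistalog
    Exec    if $EventID == 4663 \
            { \
                set_var('del_object', $ObjectName); \
                set_var('del_recid', $EventRecordID); \
            } \
            else if $EventID == 4660 and defined get_var('del_recid') \
                    and $EventRecordID == get_var('del_recid') + 1 \
                $FileName = get_var('del_object');
</Input>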
NXLog deployment on Windows
Maksimsk created
Hello!
I'm trying to deploy nxlog with a GPO on Windows, but it seems like the MSI package from https://nxlog.co/products/nxlog-community-edition/download is not working properly.
After creating the GPO nothing happens. I have also tried installing with a script (cmd /c Msiexec /I \file server\share\nxlog-ce-2.9.1716.msi /qn), but nothing happens either.
When I run the script on a local PC I get the error "This installation package could not be opened. Contact the application vendor to verify that this is a valid Windows Installer package."
We are using AD 2012 and Windows 10/8 machines.
Is there any way to deploy nxlog to all PCs in bulk?
Thx
Problems sending Windows Eventlog to graylog
c.scharfenberg created
Hello everybody,
I'm sorry to bother you with yet another question about forwarding the Windows event log to Graylog. Unfortunately I'm not able to figure this out on my own.
used versions:
nxlog 2.10.2102 (running on Windows Server 2016)
graylog 2.4.6 (running on Debian 9)
I have two nxlog setups, one using syslog and another using GELF. Neither works as I would expect.
1. Syslog
<Extension syslog>
Module xm_syslog
</Extension>
<Input eventlog>
Module im_msvistalog
Exec delete($Keywords);
Exec if ($EventType == "VERBOSE") drop();
</Input>
<Output out_graylog>
Module om_tcp
Host graylog
Port 5140
Exec $raw_event = replace($raw_event, "\n", " ");
Exec $raw_event = replace($raw_event, "\r", " ");
Exec $raw_event = replace($raw_event, "\t", " ");
Exec to_syslog_ietf();
</Output>
<Route route_eventlog>
Path eventlog => out_graylog
</Route>
The problem is that there are event log entries containing line breaks. Unfortunately they are not removed by the replace calls, so in Graylog one message is split into many messages at every line break. Using Wireshark I can see that the line breaks consist of LF characters (Unix line endings).
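One thing that might be worth trying (a sketch, not a verified fix): to_syslog_ietf() rebuilds $raw_event from the event's fields, so replace() calls applied to $raw_event beforehand get overwritten. Running the conversion first and the replacements afterwards keeps the line breaks out of the final frame:

<Output out_graylog>
    Module  om_tcp
    Host    graylog
    Port    5140
    # Build the syslog frame first, then flatten the line breaks
    Exec    to_syslog_ietf(); \
            $raw_event = replace($raw_event, "\r", " "); \
            $raw_event = replace($raw_event, "\n", " "); \
            $raw_event = replace($raw_event, "\t", " ");
</Output>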
2. Gelf
<Extension gelf>
Module xm_gelf
</Extension>
<Input eventlog>
Module im_msvistalog
Exec delete($Keywords);
Exec if ($EventType == "VERBOSE") drop();
</Input>
<Output out_graylog>
Module om_tcp
Host graylog
Port 12201
OutputType GELF
</Output>
<Route route_eventlog>
Path eventlog => out_graylog
</Route>
Unfortunately this setup does not work at all. No messages show up in Graylog (of course I've activated the corresponding input). Using Wireshark I can see that a lot of TCP packets are sent to Graylog, but none of them contain readable messages.
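On the GELF side, a sketch worth comparing against, assuming your xm_gelf version supports it: a Graylog GELF TCP input expects null-byte-delimited frames, which is what the GELF_TCP output type produces (plain GELF is meant for UDP):

<Output out_graylog>
    Module      om_tcp
    Host        graylog
    Port        12201
    OutputType  GELF_TCP
</Output>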
Can anybody help me with either setup?
Thanks and Regards,
Carsten
input file does not exist
skawt created
hi,
I'm working on monitoring a log file using nxlog. I have File set to "C:\Program Files\test1.log", but it's saying that the "input file does not exist". I tried running a Python script to check the file using the os module:
import os
test = os.listdir('C:\Program Files\test1.log')
print(test)
This returns an error: "FileNotFoundError: The system cannot find the path specified"
I noticed that this error has been encountered before, but none of the solutions I tried work.
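One detail worth checking (just a guess): in a double-quoted NXLog string, "\t" is an escape sequence, so "C:\Program Files\test1.log" ends up containing a tab instead of "\test1.log". Doubling the backslashes (or using forward slashes) avoids that; the Python check above has the same problem, since '\t' is a tab there too.

<Input testlog>
    Module  im_file
    # Backslashes must be doubled inside the quoted path
    File    "C:\\Program Files\\test1.log"
</Input>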
any help is much appreciated.
Thanks,
skawt
How to set a custom om_http timeout?
wisnu.sudarmadi created
Hello,
Is there any way to set a custom timeout in om_http? Or a custom retry mechanism?
Thanks
Windows event ID not forwarded and problem with control characters
ryssland created
Hi.
I am having an issue with forwarding event logs from a centralized server to an rsyslog and indexed in splunk.
The logs are forwarded, but the Event ID (the most important part) is missing. I am also having an issue with control characters; this could perhaps be blamed on rsyslog, but as I understand it the control-character issue could be solved in the nxlog config.
Anyone care to give me a nudge in the right direction here?
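A common pattern for keeping the Event ID (and every other field) across the syslog conversion, sketched here with placeholder host/port: pack the whole record into $Message with to_json() before calling to_syslog_ietf(), since the syslog output otherwise only carries $Message, and strip the control characters from the finished frame:

<Extension _syslog>
    Module  xm_syslog
</Extension>

<Extension _json>
    Module  xm_json
</Extension>

<Input eventlog>
    Module  im_msvistalog
    # Keep $EventID and friends by packing all fields into $Message
    Exec    $Message = to_json();
</Input>

<Output to_rsyslog>
    Module  om_tcp
    # Placeholder address of the rsyslog relay
    Host    10.0.0.20
    Port    514
    # Build the frame, then remove control characters that upset parsers
    Exec    to_syslog_ietf(); \
            $raw_event = replace($raw_event, "\t", " "); \
            $raw_event = replace($raw_event, "\r", " "); \
            $raw_event = replace($raw_event, "\n", " ");
</Output>

<Route r>
    Path    eventlog => to_rsyslog
</Route>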
//Thx
Convert a field containing XML to JSON
w.schmitt@evidos.nl created
I am getting data from a database, and one of the fields contains XML. Is it possible to convert this single field to JSON?
sample data
{
"id": 27101,
"ResponseStatus": "SUCCESS",
"RequestTime": "2018-09-19 14:21:48",
"ResponseXml": "<?xml version="1.0" encoding="UTF-8"?>\r\n<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/"><Header /><Body><from>Jani</from></Body></Envelope>\r\n",
"RequestMode": "DSS",
"ErrorCode": null,
}
I want the ResponseXml field to be converted to JSON as well, and I also want to keep the other fields.
Or any other solution to parse the XML, so I have access to the data inside it.
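A sketch of one possible approach, assuming the xm_xml extension is available and that parse_xml() accepts a string argument; the table, columns and connection details below are hypothetical (borrowed from the im_dbi example elsewhere on this page): expand the XML held in $ResponseXml into fields and then serialize the whole record, original fields included, with to_json():

<Extension _xml>
Module xm_xml
</Extension>

<Extension _json>
Module xm_json
</Extension>

<Input db>
Module im_dbi
Driver mysql
Option host 127.0.0.1
Option username mysql
Option password mysql
Option dbname logdb
SQL SELECT id, ResponseStatus, RequestTime, ResponseXml, RequestMode, ErrorCode FROM responses
# Expand the XML string into fields, then keep everything as JSON
Exec if defined($ResponseXml) parse_xml($ResponseXml); to_json();
</Input>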
thx!