Ask questions. Get answers. Find technical product solutions from passionate experts in the NXLog community.

nxlog crashes continuously
nxlog is crashing with the following error:

Faulting application name: nxlog.exe, version: 0.0.0.0, time stamp: 0x5666d55e
Faulting module name: ntdll.dll, version: 6.3.9600.18202, time stamp: 0x569e72c5
Exception code: 0xc0000005
Fault offset: 0x000192cb
Faulting process id: 0x1b60
Faulting application start time: 0x01d18540c8297bd3
Faulting application path: C:\Program Files (x86)\nxlog\nxlog.exe
Faulting module path: C:\Windows\SYSTEM32\ntdll.dll
Report Id: 06d89363-f134-11e5-80dd-005056a619fb
Faulting package full name:
Faulting package-relative application ID:

Config file:

define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log

<Extension json>
    Module xm_json
</Extension>

<Input eventlog>
    # Use 'im_mseventlog' for Windows XP and 2003
    Module im_msvistalog
    SavePos FALSE
    ReadFromLast FALSE
    Query <QueryList>\
              <Query Id="0">\
                  <Select Path="Pool2PdfCreator.Produce">*</Select>\
              </Query>\
          </QueryList>
</Input>

<Output out>
    Module om_tcp
    Host 10.36.52.62
    Port 12201
    Exec $EventTime = integer($EventTime) / 1000000; to_json();
    Exec log_info("RecordNumber: " + $RecordNumber);
</Output>

<Route r>
    Path eventlog => out
</Route>

(During troubleshooting I narrowed the query down to one event source and added Exec log_info("RecordNumber: " + $RecordNumber); to be able to pinpoint the exact entry that causes the issue. I was able to locate the entry that causes the crash. The strange thing is that it sometimes goes through, but most of the time it causes a crash. I am not comfortable with sharing the entry here, but I can send it via e-mail. This definitely looks like a bug.

nxlog version: nxlog-ce-2.9.1504

achechen created
Mimic rsyslog output on Ubuntu
Using nxlog in front of logstash on a server. On the same server, I want to use nxlog to replace rsyslog. Seems pretty simple. The only issue is the file format is slightly different from what rsyslog outputs. I see:

<78>May 6 13:50:01 CRON[19454]: (root) CMD ( /opt/observium/discovery.php -h new >> /dev/null 2>&1)

vs:

Jul 16 18:00:01 monitor01 CRON[6871]: (root) CMD ( /opt/observium/poller-wrapper.py 16 >> /dev/null 2>&1)

The main differences are the <NN> at the beginning of the line and the missing hostname (monitor01). Here is my .conf:

<Input in_uds>
    Module im_uds
    UDS /dev/log
</Input>

<Input in_kernel>
    Module im_kernel
</Input>

<Output out>
    Module om_file
    File "/var/log/syslog"
</Output>

<Route local_route>
    Path in_uds, in_kernel => out
</Route>

Is there a simple change I can make to get the desired format? Thanks.

-- Bud
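A sketch of one way this might be done (untested, and the regexp substitution is my assumption): parse the BSD syslog frame arriving on /dev/log with xm_syslog, fill in the hostname explicitly (local /dev/log frames don't carry one), then re-serialize and strip the leading priority that to_syslog_bsd() emits.

```
<Extension syslog>
    Module xm_syslog
</Extension>

<Input in_uds>
    Module im_uds
    UDS /dev/log
    # parse the <PRI> header and fields out of the incoming frame
    Exec parse_syslog_bsd();
    # local frames carry no hostname, so set it ourselves
    Exec $Hostname = hostname();
</Input>

<Output out>
    Module om_file
    File "/var/log/syslog"
    # re-serialize, then drop the "<NN>" prefix that to_syslog_bsd() adds
    Exec to_syslog_bsd(); $raw_event =~ s/^<\d+>//;
</Output>
```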

bbach created
Replies: 1
Nxlog not handling winevent TimeCreated
I'm using NXLog to ship Windows event logs to an ELK stack. I need to preserve the datetime of when the event happened (<TimeCreated SystemTime=>) that is stored in the event log. However, what NXLog ships doesn't preserve <TimeCreated SystemTime>, which I assume is because it's invalid JSON. How can I preserve this in my nxlog.conf? Otherwise I'm stuck with EventTime, which appears to be the datetime of when nxlog processed the event, not when the event happened. How do I handle this?
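For what it's worth, my understanding (worth verifying against your data) is that im_msvistalog populates $EventTime from the event's TimeCreated/SystemTime, while $EventReceivedTime is when NXLog picked the record up. A sketch that emits both side by side so they can be compared; the $IngestTime field name is my invention:

```
<Extension json>
    Module xm_json
</Extension>

<Input eventlog>
    Module im_msvistalog
    # $EventTime should already carry TimeCreated/SystemTime; copy the
    # processing time into a separately named field for comparison
    Exec $IngestTime = $EventReceivedTime;
    Exec to_json();
</Input>
```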

cybergoof created
Replies: 1
Issues with data upload to ElasticSearch
Hello everyone, this is my first time posting in this community forum, so any help would be greatly appreciated. I've been working with NxLog and ElasticSearch for a few months now and had mostly no issues until very recently, when a new ElasticSearch index was created to accommodate the new structure of our logs. With that, we also had to update our existing nxlog.conf file. We have about 3 different ElasticSearch endpoints with the same setup, and at one point during the week we ran out of storage space. After increasing the storage size for all 3 endpoints, two environments continued sending new information up to ElasticSearch with no further problems. However, the third environment's NxLog services appear to be stuck, in both existing AWS instances and newly created instances, repeating the following NxLog log entries over and over:

2016-04-29 15:33:12 INFO connecting to search-stage-logging-udf7h4lq2bsm245ciawp2stcvu.us-east-1.es.amazonaws.com:80
2016-04-29 15:33:12 INFO reconnecting in 1 seconds
2016-04-29 15:33:12 ERROR ### PANIC at line 2456 in module.c/nx_module_pollset_add_socket(): "failed to add descriptor to pollset: Not enough space ; [cannot dump backtrace on this platform]" ###

This was the log entry that initially alerted us that we had run out of space in ElasticSearch. However, the ElasticSearch dashboard no longer shows a lack of space, so it's a bit confusing why NxLog would continue to output these log entries. Basically, I have two questions:

1. Is this a scenario where the NxLog service is stuck, unable to see there is space available? Or does the fault lie with ElasticSearch not reporting storage space correctly?
2. If the NxLog service has been stuck in this state, is there a configuration or some other automated procedure to get NxLog to restart itself after multiple failures?
If anyone has gone through a similar experience, any tips would be greatly appreciated. Thank you for your time.
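On question 2: I'm not aware of a built-in "restart after N failures" directive, but an external watchdog is one workaround. A sketch (log path, service name, and threshold are all assumptions to adjust) that restarts nxlog when the pollset PANIC keeps repeating in the log tail:

```shell
#!/bin/sh
# Hypothetical watchdog, e.g. run from cron every minute.
LOG=${LOG:-/var/log/nxlog/nxlog.log}   # assumed log location
THRESHOLD=${THRESHOLD:-5}              # restart after this many PANICs in the tail

# count recent occurrences of the pollset panic
count=$(tail -n 200 "$LOG" 2>/dev/null | grep -c 'failed to add descriptor to pollset' || true)

if [ "$count" -ge "$THRESHOLD" ]; then
  # service name assumed; use systemctl/initctl as appropriate
  service nxlog restart
fi
```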

jppacheco created
Replies: 1
Sysmon Parsing Problem
The article on structured logging (https://nxlog.co/why-use-structured-logging) shows how you should use structured logging so that the impact of changes in log format is minimized. The example of the sysmon process-create event shows what I think is a bug in NXLog. The ProcessID in the "Message" is the ProcessID (25848) of the new process that sysmon sees created. However, in the structured NXLog key/values, the ProcessID is that of sysmon itself (1680). The only way to get the ProcessID of the process sysmon observed being created is to use regular expressions. Can you verify that this is a bug in NXLog?
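Until that's confirmed, here is a sketch of the regex workaround the post alludes to: pull the PID of the created process out of the rendered Message text and store it under a separate field. The $SysmonProcessId field name is my invention, and the "ProcessId:" label is assumed from sysmon's message layout.

```
<Input sysmon>
    Module im_msvistalog
    # $ProcessID on the structured record is the logging process (sysmon
    # itself), so recover the created process's PID from the message text
    Exec if $Message =~ /ProcessId:\s*(\d+)/ $SysmonProcessId = $1;
</Input>
```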

cybergoof created
Replies: 1
IIS7 W3C log parsing fails
Hello guys, I have a question about NXLog and IIS7 W3C logs. I set NXLog up and it basically works, but the NXLog logs are full of error messages like this:

2016-04-26 09:46:36 ERROR if-else failed at line 64, character 257 in C:\Program Files (x86)\nxlog\conf\nxlog.conf. statement execution has been aborted; procedure 'parse_csv' failed at line 64, character 113 in C:\Program Files (x86)\nxlog\conf\nxlog.conf. statement execution has been aborted; couldn't parse integer: language=UK&region=802&idfwbonavigation=180173.2

It looks like the IIS logs contain an "=" sign in the $cs-uri-query field, and NXLog wants an integer after the "=", although the field is set to string in the config file. Have you ever seen anything like this? And if so, what could be the solution?

NXLog extension and input config:

<Extension exiis>
    Module xm_csv
    Fields $date $time $s-ip $cs-method $cs-uri-stem $cs-uri-query $s-port $cs-username $c-ip $cs(User-Agent) $cs(Cookie) $cs(Referer) $sc-status $sc-substatus $sc-win32-status $sc-bytes $cs-bytes $time-taken
    FieldTypes string, string, string, string, string, string, integer, string, string, string, string, string, integer, integer, integer, integer, integer, integer
    Delimiter ' '
    QuoteChar '"'
    EscapeControl FALSE
    UndefValue -
</Extension>

<Input IIS>
    Module im_file
    File "D:\\Logs\\IIS\\W3SVC300\\u_ex*"
    SavePos TRUE
    Recursive FALSE
    Exec if $raw_event =~ /^#/ drop();                                \
         else                                                         \
         {                                                            \
             exiis->parse_csv();                                      \
             $EventTime = parsedate($date + " " + $time);             \
             $EventTime = strftime($EventTime, "%Y-%m-%dT%H:%M:%SZ"); \
         }
</Input>
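One thing worth checking (this is my reading, not a confirmed diagnosis): an error like this often means the columns in the live log don't line up with the configured Fields list, so a string value such as the query string lands in an integer-typed column, e.g. when the site logs a different #Fields set than the extension expects. A sketch that logs the file's own #Fields header before dropping comment lines, so it can be compared against the configured list:

```
<Input IIS>
    Module im_file
    File "D:\\Logs\\IIS\\W3SVC300\\u_ex*"
    SavePos TRUE
    # surface the column layout IIS actually writes, then parse as before
    Exec if $raw_event =~ /^#Fields:/ log_info("IIS header: " + $raw_event); \
         if $raw_event =~ /^#/ drop();                                       \
         else exiis->parse_csv();
</Input>
```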

CSimpiFoN created
Replies: 2
How to collect only windows security logs
Hello, I'm kind of new to nxlog, but is it possible to collect only Windows Security logs?
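A minimal sketch, assuming im_msvistalog on Vista/Server 2008 or later: restrict the event log subscription with a Query that selects only the Security channel.

```
<Input security>
    Module im_msvistalog
    # subscribe to the Security channel only
    Query <QueryList>\
              <Query Id="0">\
                  <Select Path="Security">*</Select>\
              </Query>\
          </QueryList>
</Input>
```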

mulail created
Replies: 1
How to set up a Windows NXLog.conf
Hello, I do admit I am totally lost with NXLog.conf for Windows 2K12 R2 machines. The purpose is to filter some EventIDs from the Security event log. For that I tried the nxlog.conf below:

<Extension _syslog>
    Module xm_syslog
</Extension>

<Input>
    Module im_msvistalog
# For Windows 2003 and earlier use the following:
#   Module im_mseventlog
    Exec if ($EventID == 4634 or $EventID == 4624 or $EventID == 4672 or $EventID == 4801 or $EventID == 64 or $EventID == 7036 or $EventID == 7040) drop();\
    else\
    {\
        parse_syslog_ietf();\
        $Message = $FileName + ": " + $Message;\
        $SyslogFacility = syslog_facility_string(22);\
        $SyslogFacilityValue = syslog_facility_value("local6");\
        if ( $EventType == "INFO" ) $SyslogSeverityValue = 6;\
        if ( $EventType == "WARNING" ) $SyslogSeverityValue = 4;\
        if ( $EventType == "ERROR" ) $SyslogSeverityValue = 3;\
    }
</Input>

<Output out>
    Module om_udp
    Host 10.1.1.39
    Port 1514
    Exec to_syslog_snare();
</Output>

<Route 1>
    Path internal, eventlog => out
</Route>

Unfortunately, despite the host and port being well set, it doesn't work, and I also get these messages in nxlog.log:

xxxxxx WARNING no routes defined!
xxxxxx WARNING not starting unused module out

I would really appreciate any help.
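For comparison, a sketch of the same flow with the input block named: the route above references internal and eventlog, but neither is defined (the <Input> block has no name), which would explain the "no routes defined" warning. The module names here are my choice, and the EventID list is abbreviated:

```
<Input eventlog>
    Module im_msvistalog
    Exec if ($EventID == 4634 or $EventID == 4624 or $EventID == 4672) drop();
</Input>

<Output out>
    Module om_udp
    Host 10.1.1.39
    Port 1514
    Exec to_syslog_snare();
</Output>

<Route 1>
    # the Path must name defined modules
    Path eventlog => out
</Route>
```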

CBush created
Replies: 1
Is it possible to compile: im_msvistalog on Linux to ingest saved log files?
Am I missing something? I see the source code, but no configuration options to compile the module under Linux.

cbitterfield created
Replies: 1
Request a simple example for processing AWS Logs from the S3 Bucket.
I am trying to parse AWS S3 logs. They are in JSON format (one line, no CR/LF) and gzip'd. I need to ingest these into syslog TCP or UDP (testing with file output). I can't get a reliable working nxlog.conf that will process the JSON files.

nxlog.conf:

define ROOT /usr/local/libexec/nxlog/
define WORK /Users/cbitterfield/awslogs-project

Pidfile /var/run/nxlog.pid
LogFile ./nxlog.log
SpoolDir %WORK%/data
CacheDir %WORK%/data
LogLevel DEBUG

<Extension syslog>
    Module xm_syslog
</Extension>

<Extension json>
    Module xm_json
</Extension>

<Input in>
    Module im_file
    File "%WORK%/data19/*.json"
    SavePos FALSE
    ReadFromLast FALSE
    Exec parse_json();
    # Dump $raw_event
    Exec to_syslog_bsd();
</Input>

<Output out>
    Module om_file
    File "./output"
</Output>

<Route 1>
    Path in => out
</Route>

This yields the following errors and no output:

2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
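The DEBUG lines show binary bytes (^KE<F4>t...) being compared against the wildcard, which is consistent with im_file reading gzip-compressed content: the CE im_file module reads plain text. One approach (a sketch; all paths are assumptions) is to decompress the objects into a staging directory and point im_file there instead:

```shell
#!/bin/sh
# Decompress gzip'd S3 objects into a directory nxlog's im_file watches.
SRC=${SRC:-/Users/cbitterfield/awslogs-project/data19}   # gzip'd *.json objects (assumed)
DST=${DST:-${TMPDIR:-/tmp}/nxlog-decompressed}           # staging dir im_file should watch

mkdir -p "$DST"
for f in "$SRC"/*.json; do
  [ -e "$f" ] || continue
  # files carry a .json name but are gzip-compressed; if one isn't,
  # fall back to copying it verbatim
  gunzip -c "$f" > "$DST/$(basename "$f")" 2>/dev/null || cp "$f" "$DST/"
done
```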

cbitterfield created
Replies: 2
DB2 database support?
I am trying to read the contents of a DB2 table and send them as syslog. Is this possible with the im_dbi module?

ramajith created
Replies: 1
DBI input/output modules missing in windows installed location
I was trying to test the database input module of NXLog in a Windows environment. After installing, I checked the modules folder. I can see all the other modules, but im_dbi and om_dbi are missing in the 2.8.x and 2.9.x versions. Any ideas, please?

ramajith created
Replies: 1
How can I use nxlog with Kibana and Logstash
Dear all, I have the questions below:

1 - I want to get logs from my Server 2008 R2 and I have installed nxlog there. Besides that, I have configured a logstash config file with the following:

input {
  syslog {
    type => "WindowsEventLog"
    codec => json
    port => 3515   # --> I open this port on the Win Server
  }
}
filter {
  if [type] == "WindowsEventLog" {
    json {
      source => "message"
    }
    if [SourceModuleName] == "eventlog" {
      mutate {
        replace => [ "message", "%{Message}" ]
      }
      mutate {
        remove_field => [ "Message" ]
      }
    }
  }
}
output {
  elasticsearch {
    protocol {host => localhost}
    stdout {codec => rubydebug}
  }
}

And an nxlog config file:

#define ROOT C:\\Program Files\\nxlog
#define ROOT_STRING C:\\Program Files\\nxlog
define ROOT C:\\Program Files (x86)\\nxlog
define ROOT_STRING C:\\Program Files (x86)\\nxlog
define CERTDIR %ROOT%\\cert

Moduledir %ROOT%\\modules
CacheDir %ROOT%\\data
Pidfile %ROOT%\\data\\nxlog.pid
SpoolDir %ROOT%\\data
LogFile %ROOT%\\data\\nxlog.log

# Include fileop while debugging, also enable in the output module below
#<Extension fileop>
#    Module xm_fileop
#</Extension>

<Extension json>
    Module xm_json
</Extension>

<Extension syslog>
    Module xm_syslog
</Extension>

<Input internal>
    Module im_internal
    Exec $Message = to_json();
</Input>

<Input eventlog>
    Module im_msvistalog
# Uncomment if you want only specific logs
#    Query <QueryList>\
#              <Query Id="0">\
#                  <Select Path="Application">*</Select>\
#                  <Select Path="System">*</Select>\
#                  <Select Path="Security">*</Select>\
#              </Query>\
#          </QueryList>
</Input>

<Input file>
    Module im_file
    File "C:\\MyApp\\Logs\\mylog.json"
</Input>

<Input myapp>
    Module im_file
    File "C:\\MyApp\\Logs\\mylog.json"
    Exec parse_json();
    Exec $EventTime = parsedate($timestamp);
</Input>

<Input eventlog>
# Uncomment im_msvistalog for Windows 2008 and later
    Module im_msvistalog
#Uncomment im_mseventlog for Windows XP/Windows 7 and later
#Module im_mseventlog
    Exec $Message = to_json();
</Input>

<Output elasticsearch>
    Module om_http
    URL http://elasticsearch:9200
    ContentType application/json
    Exec set_http_request_path(strftime($EventTime, "/nxlog-%Y%m%d/" + $SourceModuleName)); rename_field("timestamp","@timestamp"); to_json();
</Output>

<Output out>
    Module om_tcp
    Host 10.151.130.114   # --> this is the address of Kibana and Logstash (I configured them on the same server)
    Port 3515
    Exec to_syslog_ietf(); $raw_event = replace($raw_event, 'NXLOG@14506', '6701e99f-8724-4388-b2ac-cce6fd0eb03f@41058 tag="windows"] [', 1);
#Use the following line for debugging (uncomment the fileop extension above as well)
#Exec file_write("C:\\Program Files (x86)\\nxlog\\data\\nxlog_output.log", $raw_event);
</Output>

<Route 1>
    Path internal, eventlog => out
</Route>

However, when I open Kibana at its address, 10.151.130.114, there is no data there; the result is "No Results Found". I don't know where my mistake is. Please support me.

2 - Following https://nxlog.co/docs/elasticsearch-kibana/using-nxlog-with-elasticsearch-and-kibana.html#idp54463840 — I don't know about the om_elasticsearch module or the om_http module. Where are they, and how can I configure them?

Thanks and regards
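When debugging a pipeline like this, it can help to shrink it to a minimal pair first. A sketch (addresses and ports are placeholders from the post, not verified): send plain JSON over TCP from NXLog and receive it with Logstash's tcp input and a json_lines codec, instead of the syslog input:

```
# nxlog side
<Extension json>
    Module xm_json
</Extension>

<Input eventlog>
    Module im_msvistalog
</Input>

<Output out>
    Module om_tcp
    Host 10.151.130.114
    Port 3515
    # one JSON object per line
    Exec to_json();
</Output>

<Route 1>
    Path eventlog => out
</Route>
```

```
# logstash side
input {
  tcp {
    port  => 3515
    codec => json_lines
  }
}
```

Once events flow end to end, the syslog framing and filters can be layered back on.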

khoipham created
Sending multi-line messages across om_tcp
Hello, I am a newbie to NXLog. I am attempting to send custom multi-line messages, read from a txt file using im_file, to a server using om_tcp. I understand that I can use xm_multiline to read the file, but sending NewLine characters across om_tcp results in a new message per line on the receiving end. I have tried replacing the NewLine character in the read block, but the replace doesn't seem to replace the NewLine character. Can someone help me find a way to send multi-line messages to a TCP listener using NXLog? This is a sample of the log file, with the message start being the @timestamp:

@12:02:23.7990 [ISCC] Party removed [ssp view]:
    @ c:007c02772ce2e0f0,03b23dd8 @ m:0000000000000000,00000000,0000000000000000 p:3 i:00001170 nw:00000000:000111705693da93 t:2
      p:041c88c8 @ c:007c02772ce2e0f0,03b23dd8 r:2 t:0 s:c n:233061
    - p:03d51b00 @ c:007c02772ce2e0f0,03b23dd8 r:1 t:1 s:0 n:T4464#1
      p:041ceeb0 @ c:007c02772ce2e0f0,03b23dd8 r:10 t:1 s:ac n:233624

This is the block that I have tried. The \t escape character does work and will replace tabs with the replacement value:

<Input IN>
    Module im_file
    File 'd:\logs\nxlog.txt'
    SavePos FALSE
    ReadFromLast FALSE
    InputType multiline
    Exec $raw_event = replace($raw_event, "\r\n", " ");
    Exec $raw_event = replace($raw_event, "\r", " ");
    Exec $raw_event = replace($raw_event, "\n", " ");
    Exec $raw_event = replace($raw_event, "0x0A", " ");
    Exec $raw_event = replace($raw_event, "0x0DA", " ");
    Exec $raw_event = replace($raw_event, "0x0D", " ");
</Input>

Thanks,
Brent
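For reference, a sketch of the shape I would expect this to take (untested; the header regex is a guess based on the sample's @timestamp lines, and the output host/port are placeholders). The xm_multiline extension instance named by InputType has to exist, and the "0x0A"-style replaces can be dropped since they match literal text, not byte values:

```
<Extension multiline>
    Module xm_multiline
    # a new message starts at an @hh:mm:ss.ffff timestamp
    HeaderLine /^@\d{2}:\d{2}:\d{2}\.\d+/
</Extension>

<Input IN>
    Module im_file
    File 'd:\logs\nxlog.txt'
    SavePos FALSE
    ReadFromLast FALSE
    InputType multiline
    # flatten the joined multi-line event so the receiver sees one line
    Exec $raw_event = replace($raw_event, "\r", " "); \
         $raw_event = replace($raw_event, "\n", " ");
</Input>

<Output OUT>
    Module om_tcp
    Host 10.0.0.1   # placeholder
    Port 514        # placeholder
</Output>

<Route r>
    Path IN => OUT
</Route>
```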

bpedersen created
Replies: 4
Parsing Windows Logs (from FILE)
I am having no luck with simple parsing of EVT log files. Is there an easy way to read in EVT (binary log) files and output them in syslog format? This is the config file I am using. (I used python-evtx to extract into text XML; however, that yields XML attributes which apparently are not parse-able.) Problem set: given 3 files (System.evt, Application.evt, and Security.evt), parse the EVT format into syslog BSD (or IETF) format.

<Extension multiline>
    Module xm_multiline
    HeaderLine /^<event>/
    EndLine /^</event>/
</Extension>

<Extension syslog>
    Module xm_syslog
</Extension>

<Extension xmlparser>
    Module xm_xml
</Extension>

<Extension json>
    Module xm_json
</Extension>

<Extension fileop>
    Module xm_fileop
</Extension>

<Input in>
    Module im_file
    File "%ROOT%/test.xml"
#    File "/tmp/cab.xml"
    SavePos FALSE
    ReadFromLast FALSE
    InputType multiline
    <Exec>
        # Discard everything that doesn't seem to be an xml event
        if $raw_event !~ /^<event>/ drop();
        # Parse the xml event
        parse_xml(); to_syslog_ietf();
        # Rewrite some fields
        $EventTime = parsedate($timestamp);
        delete($timestamp);
        delete($EventReceivedTime);
        # Convert to JSON
        to_json();
    </Exec>
</Input>

<Output out>
    Module om_file
    File "%ROOT%/out.log"
    Exec parse_xml();
    Exec log_info("FIELD" + to_json());
</Output>

<Route 1>
    Path in => out
</Route>
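One structural observation (my reading of the config, not verified): the input Exec serializes twice (to_syslog_ietf(), then to_json()), and the output then calls parse_xml() on an event that is no longer XML. A sketch that parses once in the input and serializes once at the output might look like this:

```
<Input in>
    Module im_file
    File "%ROOT%/test.xml"
    SavePos FALSE
    ReadFromLast FALSE
    InputType multiline
    <Exec>
        # keep only multiline chunks that look like an xml event
        if $raw_event !~ /^<event>/ drop();
        parse_xml();
        $EventTime = parsedate($timestamp);
        delete($timestamp);
    </Exec>
</Input>

<Output out>
    Module om_file
    File "%ROOT%/out.log"
    # serialize exactly once, at the output
    Exec to_syslog_ietf();
</Output>
```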

Colin.Bitterfield created
Replies: 1
Sending DHCP and Windows event logs from same server
Hi, I'm a newbie testing nxlog, and I'm after sending Windows DHCP and event logs from the same server to a SIEM. Does anyone have a working nxlog configuration that you don't mind posting here? Thanks in advance. Regards, Milton
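A starting-point sketch, not a tested production config: one input for the event log, one im_file input for the DHCP server audit logs (the path below is the usual default, worth verifying on your host), and a single syslog-over-UDP output; the SIEM address is a placeholder.

```
<Extension syslog>
    Module xm_syslog
</Extension>

<Input eventlog>
    Module im_msvistalog
</Input>

<Input dhcp>
    Module im_file
    # default DHCP server audit log location (verify on your server)
    File "C:\\Windows\\System32\\dhcp\\DhcpSrvLog-*.log"
    SavePos TRUE
    # skip the blank lines in the audit log header
    Exec if size($raw_event) == 0 drop();
</Input>

<Output siem>
    Module om_udp
    Host 10.0.0.5   # placeholder SIEM address
    Port 514
    Exec to_syslog_bsd();
</Output>

<Route r>
    Path eventlog, dhcp => siem
</Route>
```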

milton_jose created
Replies: 1
Unable to file_remove on Linux Setup
Hi everyone, new to nxlog, so apologies in advance! I am currently deploying nxlog on a Linux server (Red Hat Enterprise release 6.6) and am trying to remove a file after nxlog has finished processing it. From the documentation, I am using file_remove, which is not working. Here is my config, which does not throw any syntax errors when starting nxlog. In the debug log, I do not see any attempt to match files for removal:

<Extension fileop>
    Module xm_fileop
    <Schedule>
        Every 1 min
        Exec file_remove('/eventarchive/processed/*.raw', (now()));
    </Schedule>
</Extension>

I used this same syntax on a Windows setup to test it, and there it worked: it successfully removed files. Does anyone know of any limitations on Linux that would stop this from working? Is there something I'm doing wrong? As a side question, does anyone know the best way to configure nxlog to remove a file after processing, as opposed to setting a time interval like the above? Thanks in advance!

tav1 created
Replies: 1
Remove Duplicates Help Needed
Hello all, I am trying to use the pm_norepeat module to remove duplicate log messages that sometimes flood my logs. I am apparently not grasping how this works, as the duplicate records still flood through when I use pm_norepeat. Can anyone advise on what I am doing wrong? Is there a different way to accomplish de-duplication of messages with im_msvistalog and to_syslog_bsd()? Here are my configuration file statements:

define ROOT C:\Program Files (x86)\nxlog
define ROOT_STRING C:\Program Files (x86)\\nxlog

Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log

<Extension syslog>
    Module xm_syslog
</Extension>

<Input in>
    Module im_msvistalog
    Exec to_syslog_bsd();
    ReadFromLast TRUE
    SavePos TRUE
    Query <QueryList>\
              <Query Id="0">\
                  <Select Path="Application">*</Select>\
                  <Select Path="System">*</Select>\
                  <Select Path="Security">*</Select>\
              </Query>\
          </QueryList>
</Input>

<Processor norepeat>
    Module pm_norepeat
    CheckFields Hostname, SourceName, Message
</Processor>

<Output out>
    Module om_udp
    Host xxxxx.xxxxxxxxxxapp.com
    Port 12345
</Output>

<Route 1>
    Path in => norepeat => out
</Route>
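Two hedged observations, sketched below: serializing with to_syslog_bsd() in the input rewrites $raw_event before pm_norepeat runs, so it is cleaner to serialize at the output after de-duplication; and, as I understand it, pm_norepeat only suppresses consecutive duplicates (like syslog's "last message repeated n times"), so interleaved messages will still pass through either way.

```
<Input in>
    Module im_msvistalog
    ReadFromLast TRUE
    SavePos TRUE
</Input>

<Processor norepeat>
    Module pm_norepeat
    # compared against the previous event only
    CheckFields Hostname, SourceName, Message
</Processor>

<Output out>
    Module om_udp
    Host xxxxx.xxxxxxxxxxapp.com
    Port 12345
    # serialize only after de-duplication
    Exec to_syslog_bsd();
</Output>

<Route 1>
    Path in => norepeat => out
</Route>
```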

Zendog created
Replies: 1
om_tcp - reconnect(); on a schedule
I have a series of file inputs and tcp outputs. The output targets are geo-balanced across data centers, and traffic is directed based on the originating request. If we have a situation where we need to take one of the two collectors down, all the agents would point at one side. Because of this, I want NXLog to reconnect to the output target at a particular interval. How do you properly use the 'reconnect();' procedure? We have a series of inputs using the same outputs.

<Output Tomcat_App_Logs_Out>
    Module om_tcp
    Host NP-NC-TOMCAT.Contoso.com
    Port 4112
    <Schedule>
        Every 1 min
        Exec reconnect();
    </Schedule>
</Output>

Jakauppila created
Replies: 1
ERROR maximum number of fields reached, limit is 50
Hello guys, I downloaded and am using nxlog-ce-2.9.1504.msi and everything works well, but I have to parse logs from Hadoop and I hit the error about the maximum number of fields; my Hadoop logs have 80 fields. Here is my module to parse this:

<Extension hp>
    Module xm_csv
    Fields $date,$jobname,$jobid,$username,$jobpriority,$jobstatus,$totalmaps,$totalreduces,$failedmaps,$failedreduces,$submittime,$launchtime,$finishtime,$mapavgtime,$reduceavgtime,$mapmaprfsbytesread,$reducemaprfsbytesread,$mapmaprfsbyteswritten,$reducemaprfsbyteswritten,$mapfilebyteswritten,$reducefilebyteswritten,$mapinputrecords,$mapoutputbytes,$mapspilledrecords,$reduceshufflebytes,$reducespilledrecords,$mapcpumilliseconds,$reducecpumilliseconds,$combineinputrecords,$combineoutputrecords,$reduceinputrecords,$reduceinputgroups,$reduceoutputrecords,$mapgctimeelapsedmilliseconds,$reducegctimeelapsedmilliseconds,$mapphysicalmemorybytes,$reducephysicalmemorybytes,$mapvirtualmemorybytes,$reducevirtualmemorybytes,$maptaskmaxtime,$successattemptmaxtime_maptaskmaxtime,$allattemptmaxtime_maptaskmaxtime,$server_successattemptmaxtime_maptaskmaxtime,$server_allattemptmaxtime_maptaskmaxtime,$maptaskmintime,$maptaskmaxinput,$maptaskmininput,$maptaskinputformaxtime,$maptaskinputformintime,$reducetaskmaxtime,$successattemptmaxtime_reducetaskmaxtime,$allattemptmaxtime_reducetaskmaxtime,$server_successattemptmaxtime_reducetaskmaxtime,$server_allattemptmaxtime_reducetaskmaxtime,$reducetaskmintime,$reducetaskmaxinput,$reducetaskmininput,$reducetaskinputformaxtime,$reducetaskinputformintime,$jobpool,$io_sort_spill_percent,$shuffle_input_buffer_percent$,$io_sort_mb,$io_sort_factor,$map_class,$reduce_class,$inputformat_class,$output_compress,$output_compression_codec,$compress_map_output,$map_output_compression_codec,$input_dir,$output_dir,$map_jvm,$reduce_jvm,$working_dir,$java_command,$job_submithost,$reduce_parallel_copies,$racklocalmaps,$datalocalmaps,$totallaunchedmaps,$fallowreduces,$fallowmaps,$mapoutputrecords,$dummy
    FieldTypes text,text,text,text,text,text,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,integer,integer,integer,text,text,integer,text
    Delimiter \t
</Extension>

<Input hadoop>
    Module im_file
    File 'E:\\Hadoop\\analytics.sp_hadoop_stats.txt'
    SavePos TRUE
    Recursive TRUE
    Exec if ( $raw_event =~ /^#/ or size($raw_event) == 0 ) drop(); \
         else \
         { \
             hp->parse_csv();                            \
             $EventTime = parsedate($date); \
             $EventTime = strftime($EventTime, "%Y-%m-%d"); \
             $SourceName = "Hadoop"; \
             $hostname = hostname(); \
             to_json(); \
         }
</Input>

Is it possible to increase the field limit in some file, or to add an option for it in nxlog.conf? Thank you.

Juan Andrés.Ramirez created
Replies: 1