Sysmon Parsing Problem
cybergoof created
The article on structured logging (https://nxlog.co/why-use-structured-logging) shows how you should use structured logging so that changes in log format are minimized. The example Sysmon event, process creation, shows what I think is a bug in NXLog.
The ProcessID in the "Message" is the ProcessID (25848) of the new process that Sysmon saw being created. However, in the structured NXLog key/values, the ProcessID is that of Sysmon itself (1680).
The only way to get the ProcessID of the process Sysmon observed being created is to use regular expressions. Can you verify that this is a bug in NXLog?
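For reference, a rough sketch of the regex workaround mentioned above (the $SysmonProcessId field name is just a placeholder, and the sketch assumes the Sysmon message body contains a "ProcessId: <number>" line):
<Input sysmon>
Module im_msvistalog
# Copy the ProcessId reported inside the Sysmon message body into its own field,
# since the structured $ProcessID refers to the logging process (Sysmon itself).
Exec if $Message =~ /ProcessId:\s*(\d+)/ $SysmonProcessId = $1;
</Input>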
IIS7 W3C log parsing fails
CSimpiFoN created
Hello Guys,
I have a question about NXLog and IIS7 W3C logs. I set NXLog up and it basically works, but the NXLog logs are full of error messages like this:
2016-04-26 09:46:36 ERROR if-else failed at line 64, character 257 in C:\Program Files (x86)\nxlog\conf\nxlog.conf. statement execution has been aborted; procedure 'parse_csv' failed at line 64, character 113 in C:\Program Files (x86)\nxlog\conf\nxlog.conf. statement execution has been aborted; couldn't parse integer: language=UK®ion=802&idfwbonavigation=180173.2
It looks like the IIS logs contain an "=" sign in the $cs-uri-query field and NXLog expects an integer after the "=", even though that field is set to string in the config file.
Have you ever seen anything like this? If so, what could be the solution?
NXLog extension and input config:
<Extension exiis>
Module xm_csv
Fields $date $time $s-ip $cs-method $cs-uri-stem $cs-uri-query $s-port $cs-username $c-ip $cs(User-Agent) $cs(Cookie) $cs(Referer) $sc-status $sc-substatus $sc-win32-status $sc-bytes $cs-bytes $time-taken
FieldTypes string, string, string, string, string, string, integer, string, string, string, string, string, integer, integer, integer, integer, integer, integer
Delimiter ' '
QuoteChar '"'
EscapeControl FALSE
UndefValue -
</Extension>
<Input IIS>
Module im_file
File "D:\\Logs\\IIS\\W3SVC300\\u_ex*"
SavePos TRUE
Recursive FALSE
Exec if $raw_event =~ /^#/ drop(); \
else \
{ \
exiis->parse_csv(); \
$EventTime = parsedate($date + " " + $time); \
$EventTime = strftime($EventTime, "%Y-%m-%dT%H:%M:%SZ"); \
}
</Input>
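For what it's worth, the error suggests the query-string value is landing in a column that xm_csv expects to be an integer, i.e. that the columns in the log do not line up with the Fields list (comparing the #Fields header in the log file against the config is worth doing). A hedged stopgap sketch, not a confirmed fix: declare every column as string so parse_csv() never aborts, and convert the numeric columns afterwards only where needed:
<Extension exiis>
Module xm_csv
Fields $date $time $s-ip $cs-method $cs-uri-stem $cs-uri-query $s-port $cs-username $c-ip $cs(User-Agent) $cs(Cookie) $cs(Referer) $sc-status $sc-substatus $sc-win32-status $sc-bytes $cs-bytes $time-taken
# all string: a mis-aligned column then yields a wrong value instead of aborting the statement
FieldTypes string, string, string, string, string, string, string, string, string, string, string, string, string, string, string, string, string, string
Delimiter ' '
QuoteChar '"'
EscapeControl FALSE
UndefValue -
</Extension>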
How to collect only Windows security logs
mulail created
Hello,
I'm kind of new to NXLog, but is it possible to collect only Windows Security logs?
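A minimal sketch of one way this could be done with im_msvistalog, using a query that selects only the Security channel (the output and route blocks are omitted here):
<Input security>
Module im_msvistalog
Query <QueryList> \
        <Query Id="0"> \
            <Select Path="Security">*</Select> \
        </Query> \
    </QueryList>
</Input>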
How to set up nxlog.conf on Windows
CBush created
Hello,
I do admit I am totally lost about NXLog.conf for Windows 2K12 R2 machines.
The purpose is to filter out some EventIDs from the Security event log; for that I tried the nxlog.conf below:
<Extension _syslog>
Module xm_syslog
</Extension>
<Input>
Module im_msvistalog
# For windows 2003 and earlier use the following:
# Module im_mseventlog
Exec if ($EventID == 4634 or $EventID == 4624 or $EventID == 4672 or $EventID == 4801 or $EventID == 64 or $EventID == 7036 or $EventID == 7040) drop();\
else\
{\
parse_syslog_ietf();\
$Message = $FileName + ": " + $Message;\
$SyslogFacility = syslog_facility_string(22);\
$SyslogFacilityValue = syslog_facility_value("local6");\
if ( $EventType == "INFO" ) $SyslogSeverityValue = 6;\
if ( $EventType == "WARNING" ) $SyslogSeverityValue = 4;\
if ( $EventType == "ERROR" ) $SyslogSeverityValue = 3;\
}
</Input>
<Output out>
Module om_udp
Host 10.1.1.39
Port 1514
Exec to_syslog_snare();
</Output>
<Route 1>
Path internal, eventlog => out
</Route>
Unfortunately, even though the host and port are set correctly, it doesn't work, and I also get these messages in nxlog.log:
xxxxxx WARNING no routes defined!
xxxxxx WARNING not starting unused module out
I would really appreciate any help
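One observation, offered as a guess rather than a confirmed diagnosis: the route's Path references internal and eventlog, but the only input block above has no instance name, so there is no module called eventlog (or internal) for the route to use, which would explain the "no routes defined" warning. A minimal sketch showing only the renaming:
<Input eventlog>
Module im_msvistalog
# ... same Exec block as above ...
</Input>
<Route 1>
Path eventlog => out
</Route>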
Is it possible to compile im_msvistalog on Linux to ingest saved log files?
cbitterfield created
Am I missing something?
I see the source code, but no configuration options to compile the module under Linux.
Request a simple example for processing AWS Logs from the S3 Bucket.
cbitterfield created
I am trying to parse AWS S3 logs. They are in JSON format (one line, no CR/LF) and gzip'd.
I need to ingest these into syslog over TCP or UDP (I am testing with file output).
I can't get a reliable working nxlog.conf that will process the JSON files.
nxlog.conf:
define ROOT /usr/local/libexec/nxlog/
define WORK /Users/cbitterfield/awslogs-project
Pidfile /var/run/nxlog.pid
LogFile ./nxlog.log
SpoolDir %WORK%/data
CacheDir %WORK%/data
LogLevel DEBUG
<Extension syslog>
Module xm_syslog
</Extension>
<Extension json>
Module xm_json
</Extension>
<Input in>
Module im_file
File "%WORK%/data19/*.json"
Exec parse_json();
# Dump $raw_event
Exec to_syslog_bsd();
SavePos FALSE
ReadFromLast FALSE
</Input>
<Output out>
Module om_file
File "./output"
</Output>
<Route 1>
Path in => out
</Route>
This yields the following errors and no output:
2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
2016-04-10 22:13:00 DEBUG '^KE<F4>t^G<C7>C^D' does not match wildcard '859121128579_CloudTrail_ap-northeast-.json'
2016-04-10 22:13:00 DEBUG checking '^KE<F4>t^G<C7>C^D' against wildcard '859121128579_CloudTrail_ap-northeast-.json':
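A guess based on the debug output above: the strings being checked look like raw gzip bytes, which suggests the compressed content is being read directly; as far as I know, im_file in the community edition does not decompress gzip on the fly. A sketch assuming the objects are gunzip'd into a local directory first (directory name, instance names, and the TCP destination are placeholders):
<Extension json>
Module xm_json
</Extension>
<Extension syslog>
Module xm_syslog
</Extension>
<Input cloudtrail>
Module im_file
# assumes the .json.gz objects were already decompressed into this directory
File "%WORK%/decompressed/*.json"
Exec parse_json(); to_syslog_bsd();
</Input>
<Output syslog_tcp>
Module om_tcp
Host 127.0.0.1
Port 514
</Output>
<Route 1>
Path cloudtrail => syslog_tcp
</Route>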
DB2 database support?
ramajith created
I am trying to read the contents of a DB2 table and send them as syslog. Is this possible with the im_dbi module?
DBI input/output modules missing in windows installed location
ramajith created
I was trying to test the database input module of NXLog in a Windows environment. After installing, I checked the modules folder. I can see all the other modules, but im_dbi and om_dbi are missing in the 2.8.x and 2.9.x versions. Any ideas, please?
How can I use NXLog with Kibana and Logstash?
khoipham created
Dear all,
I have the questions below:
1 - I want to get logs from my Server 2008 R2 machine, and I have installed NXLog there. Besides that, I have configured a Logstash config file with the following:
input {
syslog {
type => "WindowsEventLog"
codec => json
port => 3515 # I open this port on the Windows Server
}
}
filter {
if [type] == "WindowsEventLog" {
json {
source => "message"
}
if [SourceModuleName] == "eventlog" {
mutate {
replace => [ "message", "%{Message}" ]
}
mutate {
remove_field => [ "Message" ]
}
}
}
}
output {
elasticsearch {
protocol {host => localhost}
stdout {codec => rubydebug}
}
}
And I have the following nxlog.conf:
#define ROOT C:\\Program Files\\nxlog
#define ROOT_STRING C:\\Program Files\\nxlog
define ROOT C:\\Program Files (x86)\\nxlog
define ROOT_STRING C:\\Program Files (x86)\\nxlog
define CERTDIR %ROOT%\\cert
Moduledir %ROOT%\\modules
CacheDir %ROOT%\\data
Pidfile %ROOT%\\data\\nxlog.pid
SpoolDir %ROOT%\\data
LogFile %ROOT%\\data\\nxlog.log
# Include fileop while debugging, also enable in the output module below
#<Extension fileop>
# Module xm_fileop
#</Extension>
<Extension json>
Module xm_json
</Extension>
<Extension syslog>
Module xm_syslog
</Extension>
<Input internal>
Module im_internal
Exec $Message = to_json();
</Input>
<Input eventlog>
Module im_msvistalog
# Uncomment if you want only specific logs
# Query <QueryList>\
# <Query Id="0">\
# <Select Path="Application">*</Select>\
# <Select Path="System">*</Select>\
# <Select Path="Security">*</Select>\
# </Query>\
# </QueryList>
</Input>
<Input file>
Module im_file
File "C:\\MyApp\\Logs\\mylog.json"
</Input>
<Input myapp>
Module im_file
File "C:\\MyApp\\Logs\\mylog.json"
Exec parse_json();
Exec $EventTime = parsedate($timestamp);
</Input>
<Input eventlog>
# Uncomment im_msvistalog for Windows 2008 and later
Module im_msvistalog
#Uncomment im_mseventlog for Windows XP/Windows 7 and later
#Module im_mseventlog
Exec $Message = to_json();
</Input>
<Output elasticsearch>
Module om_http
URL http://elasticsearch:9200
ContentType application/json
Exec set_http_request_path(strftime($EventTime, "/nxlog-%Y%m%d/" + $SourceModuleName)); rename_field("timestamp","@timestamp"); to_json();
</Output>
<Output out>
Module om_tcp
Host 10.151.130.114 # this is the address of Kibana and Logstash (I configured them on the same server)
Port 3515
Exec to_syslog_ietf(); $raw_event = replace($raw_event, 'NXLOG@14506', '6701e99f-8724-4388-b2ac-cce6fd0eb03f@41058 tag="windows"] [', 1);
#Use the following line for debugging (uncomment the fileop extension above as well)
#Exec file_write("C:\\Program Files (x86)\\nxlog\\data\\nxlog_output.log", $raw_event);
</Output>
<Route 1>
Path internal, eventlog => out
</Route>
However, when I open Kibana at its address, 10.151.130.114, there is no data there; the result is "No Results Found".
I don't know where my mistake is. Please support me.
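One mismatch that stands out in the configs above (offered as a guess, not a confirmed diagnosis): the out block sends syslog IETF, while the Logstash syslog input is set to codec => json, and the route only forwards internal and eventlog. A minimal sketch that sends the event log as JSON over TCP to port 3515 (instance names are placeholders):
<Input eventlog>
Module im_msvistalog
Exec $Message = to_json();
</Input>
<Output logstash>
Module om_tcp
Host 10.151.130.114
Port 3515
# serialize the whole event as one JSON line per record
Exec to_json();
</Output>
<Route 1>
Path eventlog => logstash
</Route>
On the Logstash side, a plain tcp input with the json codec (rather than the syslog input) may be a closer match for this, but that depends on the rest of the setup.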
2 - Following this page, https://nxlog.co/docs/elasticsearch-kibana/using-nxlog-with-elasticsearch-and-kibana.html#idp54463840,
I don't know about the om_elasticsearch module or the om_http module. Where are they, and how can I configure them?
Please support me.
Thanks and regards
Sending multi-line messages across om_tcp
bpedersen created
Hello
I am a newbie to NXLog. I am attempting to send custom multi-line messages, read from a txt file using im_file, to a server using om_tcp. I understand that I can use xm_multiline to read the file, but sending newline characters across om_tcp will result in new messages per line on the receiving end.
I have tried replacing the newline characters in the input block, but the replace doesn't seem to remove them.
Can someone help me find a way to send multi-line messages to a TCP listener using NXLog?
This is a sample of the log file; each message starts with the @timestamp:
@12:02:23.7990 [ISCC] Party removed [ssp view]:
@ c:007c02772ce2e0f0,03b23dd8 @ m:0000000000000000,00000000,0000000000000000 p:3 i:00001170 nw:00000000:000111705693da93 t:2
p:041c88c8 @ c:007c02772ce2e0f0,03b23dd8 r:2 t:0 s:c n:233061
- p:03d51b00 @ c:007c02772ce2e0f0,03b23dd8 r:1 t:1 s:0 n:T4464#1
p:041ceeb0 @ c:007c02772ce2e0f0,03b23dd8 r:10 t:1 s:ac n:233624
This is the block I have tried. The \t escape character does work and replaces tabs with the replacement value:
<Input IN>
Module im_file
File 'd:\logs\nxlog.txt'
SavePos FALSE
ReadFromLast FALSE
InputType multiline
Exec $raw_event = replace($raw_event, "\r\n", " ");
Exec $raw_event = replace($raw_event, "\r", " ");
Exec $raw_event = replace($raw_event, "\n", " ");
Exec $raw_event = replace($raw_event, "0x0A", " ");
Exec $raw_event = replace($raw_event, "0x0DA", " ");
Exec $raw_event = replace($raw_event, "0x0D", " ");
</Input>
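A sketch of how the pieces might fit together, with the caveat that the xm_multiline extension behind InputType multiline isn't shown above, so the HeaderLine regex is only an assumption based on the sample (@hh:mm:ss at the start of each message); the replace calls use "\r"/"\n" escape sequences rather than the literal text "0x0D"/"0x0A":
<Extension multiline>
Module xm_multiline
HeaderLine /^@\d{2}:\d{2}:\d{2}/
</Extension>
<Input IN>
Module im_file
File 'd:\logs\nxlog.txt'
SavePos FALSE
ReadFromLast FALSE
InputType multiline
# fold the multi-line record onto one line before it goes out over TCP
Exec $raw_event = replace($raw_event, "\r", " "); $raw_event = replace($raw_event, "\n", " ");
</Input>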
Thanks
Brent
Parsing Windows Logs (from FILE)
Colin.Bitterfield created
I am having no luck with simple parsing of EVT log files.
Is there an easy way to read in EVT (binary log) files and output them in syslog format?
This is the config file I am using. (I used python-evtx to extract them into text XML; however, that yields XML attributes which apparently are not parseable.)
Problem set:
Given 3 files (System.evt, Application.evt, and Security.evt), parse the EVT format into syslog BSD (or IETF) format.
<Extension multiline>
Module xm_multiline
HeaderLine /^<event>/
EndLine /^</event>/
</Extension>
<Extension syslog>
Module xm_syslog
</Extension>
<Extension xmlparser>
Module xm_xml
</Extension>
<Extension json>
Module xm_json
</Extension>
<Extension fileop>
Module xm_fileop
</Extension>
<Input in>
Module im_file
File "%ROOT%/test.xml"
# File "/tmp/cab.xml"
SavePos FALSE
ReadFromLast FALSE
InputType multiline
<Exec>
# Discard everything that doesn't seem to be an xml event
if $raw_event !~ /^<event>/ drop();
# Parse the xml event
parse_xml(); to_syslog_ietf();
# Rewrite some fields
$EventTime = parsedate($timestamp);
delete($timestamp);
delete($EventReceivedTime);
# Convert to JSON
to_json();
</Exec>
</Input>
<Output out>
Module om_file
File "%ROOT%/out.log"
Exec parse_xml();
Exec log_info("FIELD" + to_json());
</Output>
<Route 1>
Path in => out
</Route>
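For anyone comparing notes: in the config above the event is serialized twice in the input (to_syslog_ietf() and then to_json()), and the output's parse_xml() then runs on the already-converted $raw_event. A sketch with a single conversion path (XML in, syslog BSD out), assuming the python-evtx output wraps each record in <event>...</event> as in the multiline extension above:
<Input in>
Module im_file
File "%ROOT%/test.xml"
SavePos FALSE
ReadFromLast FALSE
InputType multiline
<Exec>
# Discard anything that doesn't look like an XML event
if $raw_event !~ /^<event>/ drop();
# Populate fields from the XML, then render once as syslog BSD
parse_xml();
to_syslog_bsd();
</Exec>
</Input>
<Output out>
Module om_file
File "%ROOT%/out.log"
</Output>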
Sending DHCP and Windows event logs from same server
milton_jose created
Hi,
I'm a newbie testing NXLog, and I'm after sending Windows DHCP and event logs from the same server to a SIEM. Does anyone have a working NXLog configuration that you don't mind posting here?
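A rough sketch of one way this could look (the DHCP audit log path, SIEM address, and port are placeholders to adjust):
<Extension syslog>
Module xm_syslog
</Extension>
<Input eventlog>
Module im_msvistalog
</Input>
<Input dhcp>
Module im_file
# default DHCP audit log location is an assumption; adjust to the actual path
File "C:\\Windows\\System32\\dhcp\\DhcpSrvLog-*.log"
# keep only the CSV event lines (ID,Date,Time,...); the header paragraph is skipped
Exec if $raw_event !~ /^\d+,/ drop();
</Input>
<Output siem>
Module om_udp
Host 10.0.0.1
Port 514
Exec to_syslog_bsd();
</Output>
<Route 1>
Path eventlog, dhcp => siem
</Route>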
Thanks in advance.
Regards
Milton
Unable to file_remove on Linux Setup
tav1 created
Hi Everyone,
New to NXLog, so apologies in advance! I am deploying NXLog on a Linux server (Red Hat Enterprise Linux 6.6) and am trying to remove a file after NXLog has finished processing it. Following the documentation, I am using file_remove, which is not working. Here is my config, which does not throw any syntax errors when starting NXLog; in the debug log, I do not see any attempt to match files for removal:
<Extension fileop>
Module xm_fileop
<Schedule>
Every 1 min
Exec file_remove('/eventarchive/processed/*.raw', (now()));
</Schedule>
</Extension>
I used this same syntax on a Windows setup to test it, and it worked: it successfully removed files. Does anyone know if there are any limitations on Linux that would stop this from working? Is there something I'm doing wrong?
As a side question, does anyone know the best way to configure nxlog to remove a file after processing, as opposed to setting a time interval like above?
Thanks in advance!
tav1 created
Remove Duplicates Help Needed
Zendog created
Hello All,
I am trying to use the pm_norepeat module to remove duplicate log messages that sometimes flood my logs. I am apparently not grasping how this works, as the duplicate records still flood through when I use pm_norepeat.
Can anyone advise on what I am doing wrong? Is there a different way to accomplish de-duplication of messages with im_msvistalog and Exec to_syslog_bsd()?
Here are my configuration file statements:
define ROOT C:\Program Files (x86)\nxlog
define ROOT_STRING C:\Program Files (x86)\\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension syslog>
Module xm_syslog
</Extension>
<Input in>
Module im_msvistalog
Exec to_syslog_bsd();
ReadFromLast TRUE
SavePos TRUE
Query <QueryList>\
<Query Id="0">\
<Select Path="Application">*</Select>\
<Select Path="System">*</Select>\
<Select Path="Security">*</Select>\
</Query>\
</QueryList>
</Input>
<Processor norepeat>
Module pm_norepeat
CheckFields Hostname, SourceName, Message
</Processor>
<Output out>
Module om_udp
Host xxxxx.xxxxxxxxxxapp.com
Port 12345
</Output>
<Route 1>
Path in => norepeat => out
</Route>
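One observation on the config above, offered as a sketch rather than a confirmed diagnosis: to_syslog_bsd() rewrites $raw_event inside the input, before pm_norepeat runs. Keeping the event structured until after de-duplication and converting in the output would look like this:
<Input in>
Module im_msvistalog
ReadFromLast TRUE
SavePos TRUE
# Query block as above, omitted here for brevity
</Input>
<Processor norepeat>
Module pm_norepeat
CheckFields Hostname, SourceName, Message
</Processor>
<Output out>
Module om_udp
Host xxxxx.xxxxxxxxxxapp.com
Port 12345
# convert to syslog BSD only after pm_norepeat has compared the fields
Exec to_syslog_bsd();
</Output>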
om_tcp - reconnect(); on a schedule
Jakauppila created
I have a series of file inputs and TCP outputs. The output targets are geo-balanced across data centers, with traffic directed based on the originating request. If we need to take one of the two collectors down, all the agents would point at one side. Because of this, I want NXLog to reconnect to the output target at a particular interval. How do you properly use the reconnect(); procedure? We have a series of inputs using the same outputs.
<Output Tomcat_App_Logs_Out>
Module om_tcp
Host NP-NC-TOMCAT.Contoso.com
Port 4112
<Schedule>
Every 1 min
Exec reconnect();
</Schedule>
</Output>
ERROR maximum number of fields reached, limit is 50
Juan Andrés.Ramirez created
Hello guys,
I downloaded and am using nxlog-ce-2.9.1504.msi and everything is working well, but I have to parse logs from Hadoop, and I hit the maximum-number-of-fields error; my Hadoop logs have 80 fields.
Here is my module for parsing them:
<Extension hp>
Module xm_csv
Fields $date,$jobname,$jobid,$username,$jobpriority,$jobstatus,$totalmaps,$totalreduces,$failedmaps,$failedreduces,$submittime,$launchtime,$finishtime,$mapavgtime,$reduceavgtime,$mapmaprfsbytesread,$reducemaprfsbytesread,$mapmaprfsbyteswritten,$reducemaprfsbyteswritten,$mapfilebyteswritten,$reducefilebyteswritten,$mapinputrecords,$mapoutputbytes,$mapspilledrecords,$reduceshufflebytes,$reducespilledrecords,$mapcpumilliseconds,$reducecpumilliseconds,$combineinputrecords,$combineoutputrecords,$reduceinputrecords,$reduceinputgroups,$reduceoutputrecords,$mapgctimeelapsedmilliseconds,$reducegctimeelapsedmilliseconds,$mapphysicalmemorybytes,$reducephysicalmemorybytes,$mapvirtualmemorybytes,$reducevirtualmemorybytes,$maptaskmaxtime,$successattemptmaxtime_maptaskmaxtime,$allattemptmaxtime_maptaskmaxtime,$server_successattemptmaxtime_maptaskmaxtime,$server_allattemptmaxtime_maptaskmaxtime,$maptaskmintime,$maptaskmaxinput,$maptaskmininput,$maptaskinputformaxtime,$maptaskinputformintime,$reducetaskmaxtime,$successattemptmaxtime_reducetaskmaxtime,$allattemptmaxtime_reducetaskmaxtime,$server_successattemptmaxtime_reducetaskmaxtime,$server_allattemptmaxtime_reducetaskmaxtime,$reducetaskmintime,$reducetaskmaxinput,$reducetaskmininput,$reducetaskinputformaxtime,$reducetaskinputformintime,$jobpool,$io_sort_spill_percent,$shuffle_input_buffer_percent$,$io_sort_mb,$io_sort_factor,$map_class,$reduce_class,$inputformat_class,$output_compress,$output_compression_codec,$compress_map_output,$map_output_compression_codec,$input_dir,$output_dir,$map_jvm,$reduce_jvm,$working_dir,$java_command,$job_submithost,$reduce_parallel_copies,$racklocalmaps,$datalocalmaps,$totallaunchedmaps,$fallowreduces,$fallowmaps,$mapoutputrecords,$dummy
FieldTypes text,text,text,text,text,text,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,integer,integer,integer,text,text,integer,integer,integer,integer,integer,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,text,integer,integer,integer,text,text,integer,text
Delimiter \t
</Extension>
#
<Input hadoop>
Module im_file
File 'E:\\Hadoop\\analytics.sp_hadoop_stats.txt'
SavePos TRUE
Recursive TRUE
Exec if ( $raw_event =~ /^#/ or size($raw_event) == 0 ) drop(); \
else \
{ \
hp->parse_csv(); \
$EventTime = parsedate($date); \
$EventTime = strftime($EventTime, "%Y-%m-%d"); \
$SourceName = "Hadoop"; \
$hostname = hostname(); \
to_json(); \
}
</Input>
Is it possible to increase the field limit in the source, or to add an option for it in nxlog.conf?
Thank you.
Trying to post an existing JSON file to a remote web API
gregB created
We use log4net to produce log files and have the JSON extension for log4net, so the file output is as follows:
{
"date":"2016-03-18T13:49:36.9504697-04:00","level":
"ERROR",
"appname":"log4net_json.vshost.exe",
"logger":"log4net_json.Program",
"thread":"9",
"ndc":"(null)",
"message":"System.DivideByZeroException: Attempted to divide by zero.\r\n at log4net_json.Program.Main() in c:\\temp\\tryMeOut\\log4net.Ext.J
son.Example-master\\log4net_json\\log4net_json\\Program.cs:line 20"
}
We want to use NXLog to consume the file and post the results to a remote web API that will insert the JSON values into our database. I have tried loads of variations of config changes and spent hours on the internet and reading the documentation with no luck. Below is one of the config files I have used; however, each time I send it out I only get the first character of the text, the "{". The rest is missing and will not post.
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
LogLevel INFO
<Extension _syslog>
Module xm_syslog
</Extension>
<Input in>
Module im_file
File "C:/temp/WOWLog.xml"
InputType LineBased
SavePos FALSE
ReadFromLast FALSE
Exec if $raw_event =~ /^--/ drop();
Exec $raw_event = replace($raw_event, "\r\n", ";");
</Input>
<Output out>
Module om_http
URL http://localhost:51990/api/LogFile/
# HTTPSAllowUntrusted TRUE
</Output>
<Route 1>
Path in => out
</Route>
I'm looking for the correct settings for the configuration file.
For sanity, I have been able to send the same file to the remote web API via a .NET console application with a web client.
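One likely reason for the single "{" (a guess, since it depends on exactly how log4net writes the file): the JSON shown above is pretty-printed across several lines, so with InputType LineBased the first event is just the opening brace. A sketch that groups the whole record with xm_multiline and re-serializes it as single-line JSON before posting (the brace-matching regexes assume records look like the sample above):
<Extension json>
Module xm_json
</Extension>
<Extension multiline>
Module xm_multiline
HeaderLine /^\{/
EndLine /^\}/
</Extension>
<Input in>
Module im_file
File "C:/temp/WOWLog.xml"
InputType multiline
SavePos FALSE
ReadFromLast FALSE
# parse the multi-line JSON record, then re-serialize it on one line
Exec parse_json(); to_json();
</Input>
<Output out>
Module om_http
URL http://localhost:51990/api/LogFile/
ContentType application/json
</Output>
<Route 1>
Path in => out
</Route>
Alternatively, configuring the log4net JSON layout to emit each event on a single line would make the xm_multiline extension unnecessary.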
Help with connecting NXLog to Symantec MSS
Alex.Gregor created
Hi NXLog Helpers,
I am looking for some help getting NXLog connected to Symantec MSS (Managed Security Services), and I'm kind of at the end of my rope with this. Right now I am getting the error shown in the attached log and was wondering what I am missing. I am using this as my conf file:
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension syslog>
Module xm_syslog
</Extension>
<Input internal>
Module im_internal
</Input>
<Input in>
Module im_msvistalog
# For windows 2003 and earlier use the following:
# Module im_mseventlog
</Input>
<Output out>
Module om_udp
Host (IP Address has been removed)
Port 514
Exec to_syslog_snare();
</Output>
<Route 1>
Path eventlog, in => out
</Route>
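One thing that stands out in the config above (a guess, since the attached log isn't reproduced here): the route's Path references an eventlog instance, but the inputs are named internal and in, so the route may not match anything. A corrected route sketch:
<Route 1>
Path internal, in => out
</Route>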
I have also attached my log file; if you need any more information, let me know.
Any help on this would be amazing and help me out a ton.
Thank you NXLog helpers, you guys/gals will save my day and be amazing.
DateTime format conversion
Ascendo created
Hi all,
I'm trying to forward logs to my Graylog server using NXLog, and it's working fine except for one minor problem which I've been unable to fix:
The date/time format in the log is as follows:
2016/03/17 07:06:27 AM Message
I have been able to extract the date into $1 and the time into $2 with a regex (and the message into $3) without an issue. However, I'm unable to parse the combination of the two as a date and get it into 24-hour format using parsedate or strptime.
Any ideas how I can populate $EventTime with the date plus 24-hour time from the above? Everything I try seems to result in the field being undefined.
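A sketch of one way this might work, assuming the regex below matches the sample line above ($1 = date, $2 = time with AM/PM, $3 = message); whether %I and %p are honoured depends on the platform's strptime:
Exec if $raw_event =~ /^(\d{4}\/\d{2}\/\d{2}) (\d{2}:\d{2}:\d{2} [AP]M) (.+)$/ \
     { \
         $EventTime = strptime($1 + " " + $2, "%Y/%m/%d %I:%M:%S %p"); \
         $Message = $3; \
     }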
Thanks
Recursive file_remove
Jakauppila created
Is there any way to recursively delete files with file_remove?
I have applications logging in the following structure:
D:\Applogs\App1\Access-3172016.log
D:\Applogs\App2\Access-3162016.log
We're able to define an input and collect the logs no problem with the following definition:
<Input Access_Logs>
Module im_file
File "D:\\AppLogs\\Access*.log"
Recursive TRUE
...
</Input>
The number and variety of apps is large, so ideally I would want to schedule log deletion with the following:
<Extension fileop>
Module xm_fileop
<Schedule>
Every 1 min
Exec file_remove('D:\\AppLogs\\Access-*.log', (now() - 259200 ));
</Schedule>
</Extension>
But this looks only in the directory specified, and it seems you cannot recursively search the directory hierarchy for files like you can in the input.
Is there any way I can get this functionality?