File-based logs are where it all began. These logs can yield information of
great value to security analysts and administrators alike. Armed with this
information, IT professionals are better equipped to troubleshoot issues,
evaluate system performance, identify bugs, and even detect security breaches.
In today’s world, we tend to focus on the modern, integrated logging facilities,
such as the Windows Event Log on Windows or the Unified Logging System (ULS) on
macOS.
However, all the major operating systems still generate log files that may or
may not be integrated into these logging facilities. File-based logs, it seems,
are still alive and well, and will likely remain so for the foreseeable future.
Whether on Windows platforms, which log high-frequency events in the form of
binary ETL files, or on Linux systems, which store log files under directories
such as /var/log, organizations rely more than ever on this conventional
logging method for capturing information about unique OS operations.
Beyond the IT departments of large corporations, manufacturing plants also use
log files extensively. In these industries, log files are the primary data
source for supervisory control and data acquisition (SCADA) systems, which
monitor and control industrial processes while gathering real-time data.
Excerpt from a SCADA system log file:
2020-05-12 12:59:38.103 +03:00 *************************************
2020-05-12 12:59:38.112 +03:00 *** Citect process is starting up ***
2020-05-12 12:59:38.135 +03:00 Running in 32 bit process architecture
2020-05-12 12:59:38.136 +03:00 Running in Console Mode
2020-05-12 12:59:38.164 +03:00 Data Execution Prevention (DEP), has been enabled successfully
Gleaning useful information from log files may sound easy, but there are some
challenges. Log files often lack structure and standardization. This is
mainly due to proprietary platforms or applications whose vendors did not adopt
any standard for their log file format. Having to read and translate various
incompatible log formats can hinder the aggregation of logged data and,
consequently, the ability to detect security breaches.
Another consideration is the "signal to noise ratio" of log files.
Although log files can contain extremely valuable data, these gems are often
buried under an enormous heap of rubble.
Therefore, today’s businesses need a solution that can effectively meet these
challenges that log files present.
This solution should be capable enough—and powerful enough—to read any file
format and any data structure on any platform, aggregate the data, and forward
it to the SIEM or analytics platform of your choice. It should also provide a
flexible, powerful facility for filtering messages so that each organization
can collect exactly what it needs.
NXLog can collect your log files from any source and route them to virtually
any destination. Simply use its dedicated im_file module, specify the path to
the log files, and configure their destination.
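As a sketch of what this looks like in practice, the following minimal NXLog configuration reads a log file with im_file and forwards it over TCP with om_tcp. The file path, block names, host, and port here are illustrative placeholders, not values from any particular deployment.

```
# Collect lines from a local application log file
<Input applog>
    Module  im_file
    File    '/var/log/myapp/app.log'    # placeholder path
</Input>

# Forward the events to a remote SIEM or collector over TCP
<Output collector>
    Module  om_tcp
    Host    siem.example.com            # placeholder host
    Port    514                         # placeholder port
</Output>

# Wire the input to the output
<Route file_to_siem>
    Path    applog => collector
</Route>
```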
Due to business requirements, your log data’s cumulative storage may need to be
kept within prescribed limits. With NXLog, you can easily filter your logs
based on field values so that any event matching certain criteria is dropped.
You may also decide to trim your logs, choosing a list of fields to discard or
specifying a list of fields to keep, thus reducing the overall size of log
files. Another approach
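Both techniques can be expressed with a few Exec statements in the NXLog configuration language. The fragment below is a sketch: the match pattern and the field names ($EventTime, $Hostname) are illustrative assumptions, and you would substitute the fields actually present in your own events.

```
<Input applog>
    Module  im_file
    File    '/var/log/myapp/app.log'    # placeholder path

    # Filter: drop any event whose raw text matches a noisy pattern
    Exec    if $raw_event =~ /DEBUG/ drop();

    # Trim: discard fields that downstream systems do not need
    Exec    delete($EventTime); delete($Hostname);
</Input>
```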
NXLog can use to mitigate file size is log file rotation, for situations in
which log records cannot be dropped or altered due to policy restrictions.
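One way to set up rotation is with the xm_fileop extension module, whose file_cycle() procedure rotates a file and keeps a fixed number of old copies. The path, interval, and retention count below are placeholder assumptions, not prescribed values.

```
# Load the file-operations extension, which provides file_cycle()
<Extension fileop>
    Module  xm_fileop

    # Once a day, rotate app.log -> app.log.1 -> app.log.2 ...,
    # keeping at most 7 old copies
    <Schedule>
        Every   24 hour
        Exec    file_cycle('/var/log/myapp/app.log', 7);
    </Schedule>
</Extension>
```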
Regardless of your organization’s choice of software solutions and their
underlying platforms, you can add value to your current log collection strategy
by complementing it with file-based logs. All you need is the right log
collection tool that can read any file, in any format, from any location. With
such a flexible solution, you can rest assured that you will reap the
additional benefits of processing file-based logs without the challenges they
normally present.