Trying to set up a macOS laptop to send log data to Azure OMS. Any feedback on whether this is possible?

I have installed the NXLog macOS package, added the azure-oms module, and set up the oms-pipe.py file, specifying my Azure OMS workspace customer_id and shared_key (I'm unsure whether I need to change the log_type).
Modified the nxlog.conf file:
- Added an Input block using im_file and the messages file.
- Added an Output block using the om_exec module to launch the oms-pipe.py script.
- Added a Route, to_LogAnalytics, to send input => output.
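
In outline, the pieces described above amount to something like the following in nxlog.conf (paths and names here are illustrative, not necessarily the exact file):

```
<Input in>
    Module im_file
    File   "/var/log/messages"
</Input>

<Output out>
    Module  om_exec
    Command /opt/nxlog/etc/oms-pipe.py
</Output>

<Route to_LogAnalytics>
    Path in => out
</Route>
```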

Verified the configuration syntax was correct and restarted the service.

But I am not seeing any logs in my Azure OMS environment.
Any help, information, or guidance on whether this is possible is appreciated.

Asked April 5, 2019 - 7:44pm

Answer (1)

The first step would be to verify whether data is actually being collected, e.g. set up om_file as your output and check whether data is written. You can then try sending to OMS with the script. The script can also be tested standalone to verify that piece.
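
Concretely, a temporary test output for this could look something like the following (the file path is just an example):

```
<Output out_file>
    Module om_file
    File   "/opt/nxlog/var/log/nxlog/output.log"
</Output>

<Route R1>
    Path in => out_file
</Route>
```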

Comments (7)

  • mwurz

    I tracked the issue, via the nxlog.log file, to a permissions problem. I resolved that and set the output module to om_file. This created the output.log file, and data is now being written. The one remaining error in the nxlog.log file is:

    ERROR SSL error, failed to load ca cert from '/opt/nxlog/var/lib/nxlog/cert/agent-ca.pem', reason: no such file or directory

    When I changed the output module back to om_exec with Command oms-pipe.py, verified the syntax was correct, and restarted the service, nothing was sent to my Azure OMS environment.
    There are now two errors in the nxlog.log file:

    1.) SSL error
    2.) ERROR apr_file_write failed in om_exec;Broken pipe

    Thank you for pointing me in the right direction; any help with these errors would be appreciated as well.

    The oms-pipe.py script is below; I removed my customer ID and key for obvious reasons.

    #!/usr/bin/env python

    # This is a PoF script that can be used with 'om_exec' NXLog module to
    # ship logs to Microsoft Azure Cloud (Log Analytics / OMS) via REST API.

    # NXLog configuration:
    # -------------------
    # <Output out>
    #     Module  om_exec
    #     Command /tmp/samplepy
    # </Output>
    # -------------------

    import requests
    import datetime
    import hashlib
    import hmac
    import base64
    import fileinput

    # Update the customer ID to your Operations Management Suite workspace ID
    customer_id = 'MY-CUSTOMER-ID'

    # For the shared key, use either the primary or the secondary Connected
    # Sources client authentication key
    shared_key = "MY-SHARED-ID"

    # The log type is the name of the event that is being submitted
    log_type = 'STDIN_PY'

    # Build the API signature
    def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
        x_headers = 'x-ms-date:' + date
        string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
        bytes_to_hash = bytes(string_to_hash).encode('utf-8')
        decoded_key = base64.b64decode(shared_key)
        encoded_hash = base64.b64encode(hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest())
        authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
        return authorization

    # Build and send a request to the POST API
    def post_data(customer_id, shared_key, body, log_type):
        method = 'POST'
        content_type = 'application/json'
        resource = '/api/logs'
        rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
        content_length = len(body)
        signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
        uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'

        headers = {
            'content-type': content_type,
            'Authorization': signature,
            'Log-Type': log_type,
            'x-ms-date': rfc1123date
        }

        response = requests.post(uri, data=body, headers=headers)
        if (response.status_code >= 200 and response.status_code <= 299):
            print 'Accepted'
        print "Response code: {}".format(response.status_code)

    for body in fileinput.input():
        post_data(customer_id, shared_key, body, log_type)
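
As an aside, the script above is Python 2; for anyone on Python 3, the signature-building step can be sanity-checked standalone with a sketch like this (the workspace ID and key below are dummies, not real credentials — the key only needs to be valid base64 for the HMAC math to run):

```python
import base64
import hashlib
import hmac

def build_signature(customer_id, shared_key, date, content_length,
                    method, content_type, resource):
    # String-to-sign per the Log Analytics Data Collector API:
    # METHOD \n content-length \n content-type \n x-ms-date:<date> \n resource
    x_headers = 'x-ms-date:' + date
    string_to_hash = '\n'.join([method, str(content_length), content_type,
                                x_headers, resource])
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode('utf-8'),
                 digestmod=hashlib.sha256).digest()).decode('ascii')
    return 'SharedKey {}:{}'.format(customer_id, encoded_hash)

# Dummy values for a standalone test
customer_id = '00000000-0000-0000-0000-000000000000'
shared_key = base64.b64encode(b'not-a-real-key').decode('ascii')
sig = build_signature(customer_id, shared_key,
                      'Fri, 05 Apr 2019 19:44:00 GMT',
                      42, 'POST', 'application/json', '/api/logs')
print(sig)
```

Note that in Python 3 the base64 digest must be decoded back to `str` before formatting into the header, which the Python 2 script gets away with implicitly.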

  • b0ti

    failed to load ca cert from '/opt/nxlog/var/lib/nxlog/cert/agent-ca.pem'

    You'll need to disable xm_soapadmin, which is declared in the included log4ensics.conf file by default, if you are not using NXLog Manager.

    The oms-pipe.py script sends the data that it receives via STDIN, so you should be able to test it with something like the following, I think:

    $ cat testfile.log | ./oms-pipe.py

  • mwurz

    Hello, I now only have the "apr_file_write failed in om_exec; Broken pipe" error message.
    Piping a log file to the Python script did not seem to work; it errored out on the first import:

    $ cat output.log | /opt/nxlog/etc/oms-pipe.py
    Traceback (most recent call last):
      File "/opt/nxlog/etc/oms-pipe.py", line 14, in <module>
        import requests
    ImportError: No module named requests

    When I commented out the include %CONFDIR%/log4ensics.conf line and used the LogFile line instead, the SSL error seems to have gone away. Here is my nxlog.conf file:
    User nxlog
    Group nxlog
    Panic Soft

    # default values:
    # PidFile /opt/nxlog/var/run/nxlog/nxlog.pid
    # CacheDir /opt/nxlog/var/spool/nxlog
    # ModuleDir /opt/nxlog/lib/nxlog/modules
    # SpoolDir /opt/nxlog/var/spool/nxlog

    define CERTDIR /opt/nxlog/var/lib/nxlog/cert
    define CONFDIR /opt/nxlog/var/lib/nxlog

    # Note that these two lines define constants only; the log file location
    # is ultimately set by the `LogFile` directive (see below). The
    # `MYLOGFILE` define is also used to rotate the log file automatically
    # (see the `_fileop` block).
    define LOGDIR /opt/nxlog/var/log/nxlog
    define MYLOGFILE %LOGDIR%/nxlog.log

    # By default, `LogFile %MYLOGFILE%` is set in log4ensics.conf. This
    # allows the log file location to be modified via NXLog Manager. If you
    # are not using NXLog Manager, you can instead set `LogFile` below and
    # disable the `include` line.
    LogFile %MYLOGFILE%
    #include %CONFDIR%/log4ensics.conf

    <Extension _syslog>
        Module xm_syslog
    </Extension>

    # This block rotates `%MYLOGFILE%` on a schedule. Note that if `LogFile`
    # is changed in log4ensics.conf via NXLog Manager, rotation of the new
    # file should also be configured there.
    <Extension _fileop>
        Module xm_fileop

        # Check the size of our log file hourly, rotate if larger than 5MB
        <Schedule>
            Every 1 hour
            Exec  if ( file_exists('%MYLOGFILE%') and \
                       (file_size('%MYLOGFILE%') >= 5M) ) \
                      file_cycle('%MYLOGFILE%', 8);
        </Schedule>

        # Rotate our log file every week on Sunday at midnight
        <Schedule>
            When @weekly
            Exec if file_exists('%MYLOGFILE%') file_cycle('%MYLOGFILE%', 8);
        </Schedule>
    </Extension>

    # INPUTS #

    <Input in>
        Module im_file
        File   "/var/log/system.log"
        Exec   parse_syslog();
    </Input>

    # Output #

    <Output out>
        Module  om_exec
        Command /opt/nxlog/etc/oms-pipe.py
    </Output>

    <Route R1>
        Path in => out
    </Route>
  • mwurz

    I installed the requests module and now receive the following error message after running the cat command: Response code: 400

  • mwurz

    I noticed the content_type in the Python script was set to application/json. I tested changing it to text/plain but still receive the 400 response code.
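
For what it's worth, the Data Collector API expects the request body to be JSON (typically an array of records), so a bare syslog line will be rejected regardless of the content-type header. A minimal illustration of wrapping a raw line (the field name `message` is just an example, not anything the API mandates):

```python
import json

# A raw syslog line, as om_exec would pipe it to the script (illustrative)
line = 'Apr  5 19:44:00 mylaptop syslogd[42]: hello world'

# Wrap the line in a JSON array of records before POSTing
body = json.dumps([{'message': line}])
print(body)
```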

  • mwurz

    After reviewing this article, https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api, I changed the format type to JSON and changed the modules in nxlog.conf: xm_json, plus xm_syslog for parsing the macOS system.log file. The nxlog.log file no longer shows any errors, and data is now being sent to the OMS environment.
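
    The final working file isn't posted, but based on the modules named above, the input side presumably ended up along these lines (a sketch under that assumption, not the poster's exact config):

```
<Extension _json>
    Module xm_json
</Extension>

<Extension _syslog>
    Module xm_syslog
</Extension>

<Input in>
    Module im_file
    File   "/var/log/system.log"
    # Parse the syslog line into fields, then re-serialize as JSON
    # so oms-pipe.py receives JSON records on STDIN
    Exec   parse_syslog(); to_json();
</Input>
```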

    Thank you for the information B0ti!