Commit 76af41c9 authored by Botond Botyanszki

sync azure-oms and amazon-s3

parent 5bf85fcb
#!/bin/sh
# Generate HTML format
asciidoctor amazon-s3.adoc
# Generate PDF format
asciidoctor-pdf amazon-s3.adoc
<Input s3>
    Module        im_python
    PythonCode    s3_read.py
</Input>

<Output file>
    Module    om_file
    File      "output.log"
</Output>

<Route s3_to_file>
    Path    s3 => file
</Route>
<Input file>
    Module    im_file
    File      "input.log"

    # These may be helpful for testing
    SavePos         FALSE
    ReadFromLast    FALSE
</Input>

<Output s3>
    Module        om_python
    PythonCode    s3_write.py
</Output>

<Route file_to_s3>
    Path    file => s3
</Route>
<Input in>
    Module          im_file
    File            "input.log"
    SavePos         FALSE
    ReadFromLast    FALSE
</Input>

<Output out>
    Module        om_python
    PythonCode    s3_write.py
</Output>

<Route exec_to_file>
    Path    in => out
</Route>
:addon-name: azure-oms
[id="addon-{addon-name}"]
[desc="Send logs to Azure Cloud OMS Log Analytics via the REST API"]
= Azure OMS
include::../../_asciidoctor/public.adoc[]
The Azure OMS add-on supports connecting to the Microsoft Azure Cloud
Operational Management Suite (OMS) Log Analytics system via its REST API and
sending or receiving log data. See the
link:https://docs.microsoft.com/en-us/azure/operations-management-suite/operations-management-suite-overview[Azure
OMS] and
link:https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-overview[Log
Analytics] documentation for more information about configuring and using
Azure OMS and its log management service.
== Forwarding Data to Log Analytics
The `oms-pipe.py` script performs REST API calls to send log data to the Log
Analytics service. To configure NXLog, complete the following steps.
. Log in to the Azure portal and go to the *Log Analytics* service (for
instance by typing the service name into the search bar).
. Select an existing *OMS Workspace* or create a new one by clicking the *Add*
button.
. From the *Management* section in the main workspace screen, click *OMS
Portal*.
+
image::azure-oms_1.png["Shipping data to Log Analytics, screen 1", pdfwidth=604px]
. In the *Microsoft Operations Management Suite*, click the settings icon in
the top right corner, navigate to *Settings > Connected Sources > Linux
Servers*, and copy the *WORKSPACE ID* and *PRIMARY KEY* values. These are
needed for API access.
+
image::azure-oms_2.png["Shipping data to Log Analytics, screen 2", pdfwidth=512px]
. Enable *Custom Logs*. As of this writing it is a preview feature, available
under *Settings > Preview Features > Custom Logs*.
+
image::azure-oms_3.png["Shipping data to Log Analytics, screen 3", pdfwidth=511px]
. Place the `oms-pipe.py` script in a location accessible by NXLog and make
sure it is executable by NXLog.
. Set the customer ID, shared key, and log type values in the script.
. Configure NXLog to execute the script with the <<om_exec,om_exec>>
module. The contents of the `$raw_event` field will be forwarded.
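
The core of such a pipe script is signing every POST to the Log Analytics HTTP Data Collector API with the workspace's primary key. The following minimal sketch is illustrative only (the function name and its wiring are assumptions, not the shipped `oms-pipe.py`); it builds the `Authorization` and `x-ms-date` headers for a request body of a given length:

```python
import base64
import datetime
import hashlib
import hmac

def build_signature(workspace_id, shared_key, content_length):
    """Build the Authorization and x-ms-date header values for a POST to
    the Log Analytics HTTP Data Collector API (illustrative helper)."""
    date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    string_to_sign = 'POST\n{}\napplication/json\nx-ms-date:{}\n/api/logs'.format(
        content_length, date)
    # The shared key from the OMS portal is base64 encoded; decode it before signing
    signed = hmac.new(base64.b64decode(shared_key),
                      string_to_sign.encode('utf-8'),
                      hashlib.sha256).digest()
    signature = 'SharedKey {}:{}'.format(
        workspace_id, base64.b64encode(signed).decode('utf-8'))
    return signature, date
```

The signed request is then POSTed to `https://<WORKSPACE_ID>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01`, with a `Log-Type` header naming the custom log type.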
.Sending Raw Syslog Events
====
This configuration reads raw events from file and forwards them to Azure OMS.
.nxlog.conf
[source,config]
----
include::snippets/oms-pipe-raw-syslog.conf[tag=doc_include]
----
====
.Sending JSON Log Data
====
With this configuration, NXLog Enterprise Edition reads W3C records from
file with <<im_file,im_file>>, parses the records with <<xm_w3c,xm_w3c>>,
converts the internal event fields to JSON format with _xm_json_
<<xm_json_proc_to_json,to_json()>>, and forwards the result to Azure OMS with
<<om_exec,om_exec>>.
.nxlog.conf
[source,config]
----
include::snippets/oms-pipe-json.conf[tag=doc_include]
----
====
== Downloading Data From Log Analytics
The `oms-download.py` Python script implements the OMS API and performs a REST
API call for downloading data from Log Analytics. To set it up with NXLog,
follow these steps:
. Register an application in *Azure Active Directory* and generate an access
key for the application.
. Under your *Subscription*, go to *Access control (IAM)* and assign the *Log
Analytics Reader* role to this application.
. Place the `oms-download.py` script in a location accessible by NXLog.
. Set the resource group, workspace, subscription ID, tenant ID, application
ID, and application key values in the script. Adjust the query details as
required.
+
NOTE: The Tenant ID can be found as *Directory ID* under the Azure Active
Directory *Properties* tab.
. Configure NXLog to execute the script with the <<im_python,im_python>>
module.
Detailed instructions on this topic can be found in the
link:https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal[Azure
documentation].
.Collecting Logs From OMS
====
This configuration uses the <<im_python,im_python>> module and the
`oms-download.py` script to periodically collect log data from the Log
Analytics service.
.nxlog.conf
[source,config]
----
include::snippets/oms-download.conf[tag=doc_include]
----
====
include common.conf

# tag::doc_include[]
<Input oms>
    Module        im_python
    PythonCode    oms-download.py
</Input>
# end::doc_include[]

<Output out>
    Module    om_null
</Output>
include common.conf

# tag::doc_include[]
<Extension _json>
    Module    xm_json
</Extension>

<Extension w3c_parser>
    Module    xm_w3c
</Extension>

<Input messages>
    Module       im_file
    File         '/var/log/httpd-log'
    InputType    w3c_parser
</Input>

<Output azure_oms>
    Module     om_exec
    Command    oms-pipe.py
    Exec       to_json();
</Output>
# end::doc_include[]
include common.conf

# tag::doc_include[]
<Input messages>
    Module    im_file
    File      '/var/log/messages'
</Input>

<Output azure_oms>
    Module     om_exec
    Command    oms-pipe.py
</Output>
# end::doc_include[]
import datetime
import json

import adal
import requests

import nxlog


class LogReader:
    def __init__(self, time_interval):
        # Details of workspace. Fill in details for your workspace.
        resource_group = '<YOUR_RESOURCE_GROUP>'
        workspace = '<YOUR_WORKSPACE>'

        # Details of query. Modify these to your requirements.
        query = "Type=*"
        end_time = datetime.datetime.utcnow()
        start_time = end_time - datetime.timedelta(seconds=time_interval)
        num_results = 100000  # If not provided, a default of 10 results will be used.

        # IDs for authentication. Fill in values for your service principal.
        subscription_id = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
        tenant_id = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
        application_id = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
        application_key = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

        # URLs for authentication
        authentication_endpoint = 'https://login.microsoftonline.com/'
        resource = 'https://management.core.windows.net/'

        # Get access token
        context = adal.AuthenticationContext(authentication_endpoint + tenant_id)
        token_response = context.acquire_token_with_client_credentials(
            resource, application_id, application_key)
        access_token = token_response.get('accessToken')

        # Add token to header
        headers = {
            "Authorization": 'Bearer ' + access_token,
            "Content-Type": 'application/json'
        }

        # URLs for retrieving data
        uri_base = 'https://management.azure.com'
        uri_api = 'api-version=2015-11-01-preview'
        uri_subscription = uri_base + '/subscriptions/' + subscription_id
        uri_resourcegroup = uri_subscription + '/resourcegroups/' + resource_group
        uri_workspace = (uri_resourcegroup +
                         '/providers/Microsoft.OperationalInsights/workspaces/' +
                         workspace)
        uri_search = uri_workspace + '/search'

        # Store log data for NXLog here
        self.lines = ""

        # Build search parameters from query details
        search_params = {
            "query": query,
            "top": num_results,
            "start": start_time.strftime('%Y-%m-%dT%H:%M:%S'),
            "end": end_time.strftime('%Y-%m-%dT%H:%M:%S')
        }

        # Build URL and send post request
        uri = uri_search + '?' + uri_api
        response = requests.post(uri, json=search_params, headers=headers)

        # Response of 200 if successful
        if response.status_code == 200:
            # Parse the response to get the search ID and status
            data = response.json()
            id = data["id"].split("/")[-1]
            status = data["__metadata"]["Status"]

            # Build the URL for polling the search by its ID once, outside
            # the loop, so the ID is not appended repeatedly
            uri = uri_search + '/' + id + '?' + uri_api

            # If status is pending, then keep checking until complete
            while status == "Pending":
                response = requests.get(uri, headers=headers)
                # Parse the response to get the status
                data = response.json()
                status = data["__metadata"]["Status"]

            print("Total records: " + str(data["__metadata"]["total"]))
            print("Returned top: " + str(data["__metadata"]["top"]))

            # Write a JSON dump of all events
            for event in data['value']:
                self.lines += json.dumps(event) + '\n'
        else:
            # Request failed
            print(response.status_code)
            response.raise_for_status()

    def getlogs(self):
        if not self.lines:
            return None
        return self.lines


def read_data(module):
    # Log pull time interval in seconds
    time_interval = 300
    module['reader'] = LogReader(time_interval)
    reader = module['reader']
    logdata = module.logdata_new()
    line = reader.getlogs()
    if line:
        logdata.set_field('raw_event', line)
        logdata.post()
        nxlog.log_debug("Data posted")
    module.set_read_timer(time_interval)


nxlog.log_info("INIT SCRIPT")