Daily Azure / Sentinel Backup (and Reporting) with GitHub
Microsoft Sentinel is inherently designed as a portal application, providing Security Operations staff with the ability to swiftly create new detections using the data sets they manage.
Two distinct functions need to be addressed for Security Operations teams when we talk about logging and alerting.
At this point in time, Microsoft Sentinel is undisputedly the preferred Security Alert management product. It provides broad capabilities for correlating real-time alerts and integrating alerting with SOAR tooling. There are many positive things to call out about Sentinel as a SIEM. The other aspect of modern security operations is hunting, and that was never going to be a core capability of Sentinel.
The primary method for adding new data streams to Azure Data Explorer (ADX) is through Event Hubs / Kafka topics. We can write directly to ADX tables if needed, but using Event Hubs provides scalable, resilient message ingestion that ensures messages aren't lost if the ADX cluster is offline or unable to process them.
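As a rough sketch of wiring that up with the Az.Kusto PowerShell module (every resource name below is a placeholder), a data connection binds an Event Hub to a target table and ingestion mapping:

```powershell
# Sketch: bind an Event Hub to an ADX table with the Az.Kusto module.
# All names are placeholders; the target table and its JSON ingestion
# mapping must already exist in the database.
New-AzKustoDataConnection `
    -ResourceGroupName 'rg-security-data' `
    -ClusterName 'adxsecurity' `
    -DatabaseName 'SecurityEvents' `
    -Name 'dc-rawevents' `
    -Location 'Australia East' `
    -Kind 'EventHub' `
    -EventHubResourceId '/subscriptions/<sub-id>/resourceGroups/rg-security-data/providers/Microsoft.EventHub/namespaces/ehns-security/eventhubs/raw-events' `
    -ConsumerGroup '$Default' `
    -DataFormat 'MULTIJSON' `
    -TableName 'RawEvents' `
    -MappingRuleName 'RawEventsMapping'
```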
The diagram below represents the components used in receiving raw data from an Event Hub and transforming that data into a structured form that can be used by our Technology teams.
I've been working extensively with Azure Monitor table schemas recently. In preparation for the deprecation of the legacy API data ingestion method for Log Analytics workspaces, I needed a simple method to recreate custom log tables as Data Collection Rule (DCR)-based tables while also migrating them to a new workspace.
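A minimal sketch of that recreation step, using the Tables REST API (custom tables created with api-version 2022-10-01 are DCR-based by default); the subscription, workspace, and column names are placeholders:

```powershell
# Sketch: recreate a custom log table as a DCR-based table in the target workspace.
$path = '/subscriptions/<sub-id>/resourceGroups/rg-logs/providers/' +
        'Microsoft.OperationalInsights/workspaces/law-target/tables/MyApp_CL?api-version=2022-10-01'

$payload = @{
    properties = @{
        schema = @{
            name    = 'MyApp_CL'
            columns = @(
                @{ name = 'TimeGenerated'; type = 'datetime' }
                @{ name = 'Computer';      type = 'string' }
                @{ name = 'RawData';       type = 'string' }
            )
        }
    }
} | ConvertTo-Json -Depth 6

Invoke-AzRestMethod -Path $path -Method PUT -Payload $payload
```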
Azure Function Apps can leverage OpenID Connect to exchange Microsoft-issued tokens for AWS tokens. By utilizing the AWS Security Token Service (AWS STS), you can eliminate the need to manage expiring keys, streamlining automation solutions that interact with AWS.
In this blog, we’ll walk through the process of integrating AWS STS with an Azure App Registration, an Enterprise Application, and a User-Defined Managed Identity to enable Function App automation.
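A minimal sketch of that exchange from inside a Function App follows; the audience URI, role ARN, and session name are placeholders you would replace with your own configuration.

```powershell
# Sketch: exchange a managed identity token for temporary AWS credentials.
# The resource (audience) must match what the AWS identity provider trusts.
$resource = 'api://aws-sts-federation'    # placeholder audience from the App Registration
$tokenUri = "$($env:IDENTITY_ENDPOINT)?resource=$resource&api-version=2019-08-01"
# Add &client_id=<mi-client-id> to the URI when using a user-defined managed identity.
$azureToken = (Invoke-RestMethod -Uri $tokenUri `
    -Headers @{ 'X-IDENTITY-HEADER' = $env:IDENTITY_HEADER }).access_token

# Trade the Microsoft-issued token for AWS credentials via AssumeRoleWithWebIdentity.
$stsUri = 'https://sts.amazonaws.com/?Action=AssumeRoleWithWebIdentity' +
          '&Version=2011-06-15' +
          '&RoleSessionName=functionapp' +
          '&RoleArn=arn:aws:iam::111122223333:role/AzureFederatedRole' +
          "&WebIdentityToken=$azureToken"
$response = Invoke-RestMethod -Uri $stsUri -Headers @{ Accept = 'application/json' }
$creds = $response.AssumeRoleWithWebIdentityResponse.AssumeRoleWithWebIdentityResult.Credentials
# $creds.AccessKeyId / $creds.SecretAccessKey / $creds.SessionToken then sign AWS calls.
```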
Microsoft's Azure Monitor Agent allows events to be directly written to certain Sentinel tables. In a previous blog piece 'Writing data to Sentinel's tables with REST and Data Collection Rules', I described how custom Data Collection Rules are written to allow Log Analytics / Sentinel tables to be written to using REST.
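As a quick refresher, a bare-bones sketch of that REST pattern follows; the endpoint, DCR immutable ID, and stream name are placeholders taken from your own Data Collection Rule:

```powershell
# Sketch: push events through a Data Collection Endpoint/Rule into a table.
# $token must be issued for the scope https://monitor.azure.com/.default.
$uri = 'https://my-dce.australiaeast-1.ingest.monitor.azure.com' +
       '/dataCollectionRules/dcr-00000000000000000000000000000000' +
       '/streams/Custom-MyApp_CL?api-version=2023-01-01'

# The API expects a JSON array of records matching the DCR's stream declaration.
$body = ConvertTo-Json @(
    @{ TimeGenerated = (Get-Date).ToUniversalTime().ToString('o'); Computer = 'host01'; RawData = 'example event' }
)

Invoke-RestMethod -Uri $uri -Method Post -Body $body `
    -Headers @{ Authorization = "Bearer $token"; 'Content-Type' = 'application/json' }
```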
A frustration in dealing with table schemas for Log Analytics and Azure Data Explorer is inaccuracies in Microsoft's published documentation for the schema. Another surprise is that the schema returned by a GET against the workspace (https://learn.microsoft.com/en-us/rest/api/loganalytics/schema/get?view=rest-loganalytics-2023-09-01&tabs=HTTP) has inaccuracies too!
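To see this for yourself, here is a rough sketch of pulling the live schema for comparison (subscription and workspace names are placeholders; note the 'Schema - Get' operation in the linked reference is an action endpoint, so it is invoked with POST despite its name):

```powershell
# Sketch: dump the workspace schema to a file for comparison with the docs.
$path = '/subscriptions/<sub-id>/resourceGroups/rg-logs/providers/' +
        'Microsoft.OperationalInsights/workspaces/law-prod/schema?api-version=2023-09-01'

$response = Invoke-AzRestMethod -Path $path -Method POST
$response.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File 'workspace-schema.json'
```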
The following script provides an example of using PowerShell to directly query Azure Data Explorer records using REST.
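In the sketch below, the cluster, database, and App Registration identifiers are placeholders for your own values, and the token is obtained with the client-credentials flow:

```powershell
# Sketch: run a KQL query against ADX over REST using an App Registration.
$cluster  = 'https://adxsecurity.australiaeast.kusto.windows.net'
$tenantId = '<tenant-id>'

# Client-credentials token scoped to the cluster.
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        grant_type    = 'client_credentials'
        client_id     = '<app-id>'
        client_secret = '<app-secret>'
        scope         = "$cluster/.default"
    }

# POST the query to the v1 REST query endpoint.
$query  = @{ db = 'SecurityEvents'; csl = 'RawEvents | take 10' } | ConvertTo-Json
$result = Invoke-RestMethod -Method Post -Uri "$cluster/v1/rest/query" -Body $query `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)"; 'Content-Type' = 'application/json' }
$result.Tables[0].Rows
```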
You will also need to ensure that the Application you use is granted the Database Viewer role permission. This role can be added through the portal on the database Overview -> Permissions -> Add. Note that ADX does not support table level viewer permissions.
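As a scripted alternative to the portal steps, a sketch of the same assignment with the Az.Kusto module (all names are placeholders):

```powershell
# Sketch: grant the application's service principal the Viewer role on the database.
New-AzKustoDatabasePrincipalAssignment `
    -ResourceGroupName 'rg-security-data' `
    -ClusterName 'adxsecurity' `
    -DatabaseName 'SecurityEvents' `
    -PrincipalAssignmentName 'app-database-viewer' `
    -PrincipalId '<app-client-id>' `
    -PrincipalType 'App' `
    -Role 'Viewer'
```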
The following script provides an example of directly writing to an Azure Data Explorer table using PowerShell. For resiliency, the preference for ADX data ingestion remains a data connection against an Event Hub (Kafka), but there are situations where ad-hoc writing is needed.
Streaming ingestion must be enabled at the cluster level in ADX. This can be done through the portal under Settings -> Configurations.
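In the sketch below, the cluster, database, table, and mapping names are placeholders, and an access token for the cluster is assumed to already be in $token:

```powershell
# Sketch: ad-hoc streaming ingest of newline-delimited JSON rows into an ADX table.
$cluster = 'https://adxsecurity.australiaeast.kusto.windows.net'

# One compact JSON object per line, matching the table's ingestion mapping.
$rows = @(
    @{ Timestamp = (Get-Date).ToUniversalTime().ToString('o'); Computer = 'host01'; Message = 'example event' }
) | ForEach-Object { $_ | ConvertTo-Json -Compress }

Invoke-RestMethod -Method Post `
    -Uri "$cluster/v1/rest/ingest/SecurityEvents/RawEvents?streamFormat=Json&mappingName=RawEventsMapping" `
    -Body ($rows -join "`n") `
    -Headers @{ Authorization = "Bearer $token"; 'Content-Type' = 'application/json' }
```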
With Azure Data Explorer acting as my large data repository for security data, I need to plan for disaster recovery and long-term archive of event data. This is achieved by establishing a Continuous Export of ingested data to a nominated Storage Account.
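A rough sketch of that setup follows, issuing the external table and continuous export management commands over the cluster's REST endpoint; the storage URI, table schema, and token are placeholders:

```powershell
# Sketch: create an external table over the archive container, then a
# continuous export that pushes new RawEvents rows to it every ten minutes.
$cluster  = 'https://adxsecurity.australiaeast.kusto.windows.net'
$commands = @(
    (".create external table ArchivedRawEvents (Timestamp: datetime, Computer: string, Message: string) " +
     "kind=storage dataformat=parquet (h@'https://<storage-account>.blob.core.windows.net/archive;<account-key>')")
    (".create-or-alter continuous-export ExportRawEvents to table ArchivedRawEvents " +
     "with (intervalBetweenRuns=10m) <| RawEvents")
)

foreach ($csl in $commands) {
    Invoke-RestMethod -Method Post -Uri "$cluster/v1/rest/mgmt" `
        -Body (@{ db = 'SecurityEvents'; csl = $csl } | ConvertTo-Json) `
        -Headers @{ Authorization = "Bearer $token"; 'Content-Type' = 'application/json' }
}
```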