Collecting Defender Vulnerability Data - a PowerShell Core Durable Function
![M-21-31 Vulnerability data](/sites/default/files/inline-images/image_151.png)
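Collecting this data mostly amounts to paging through the Defender vulnerability API until its OData continuation link is exhausted, which is exactly the kind of work a durable function orchestrates well. Here is a minimal sketch in Python rather than PowerShell; the endpoint URL and the bearer-token plumbing are assumptions to verify against your own tenant, not a definitive implementation:

```python
import json
from urllib.request import Request, urlopen

# Assumed Defender Vulnerability Management endpoint - confirm for your tenant.
VULN_URL = "https://api.securitycenter.microsoft.com/api/vulnerabilities"

def make_fetch_page(token):
    """Return a page fetcher that performs an authenticated GET.
    `token` is an Azure AD bearer token obtained elsewhere (assumption)."""
    def fetch_page(url):
        req = Request(url, headers={"Authorization": f"Bearer {token}"})
        with urlopen(req) as resp:
            return json.load(resp)
    return fetch_page

def fetch_all(url, fetch_page):
    """Follow OData @odata.nextLink paging until the feed is exhausted."""
    results = []
    while url:
        page = fetch_page(url)
        results.extend(page.get("value", []))
        url = page.get("@odata.nextLink")
    return results
```

Separating `fetch_all` from the HTTP call makes the paging loop trivial to test, and mirrors the orchestrator/activity split a durable function would use.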
M-21-31 represents the current expectation for Federal Government Agencies in the United States and Australia.
With my current Security-related projects I'm doing a lot of work with Azure's native automation capabilities. It's been a major surprise to realise that the prevailing wisdom among Security Providers is to argue for purchasing XSOAR licenses to give Microsoft Sentinel an automation capability, when all the tooling for automation existed in Azure well before Sentinel was a product.
I'm convinced that Security teams within every major organisation will be running Azure Data Explorer (ADX) clusters in the near future.
Very quietly, the last two weeks have seen the general availability of AI capability take another big step forward. Anthropic's announcement of Model Context Protocol support in Claude Desktop gives Artificial Intelligence broad access to tools when asked to perform tasks.
This article is intended to show how Logic Apps can be used with Azure Data Explorer (ADX). Normally, I would try to ensure that all data ingested into ADX came through Event Hubs, as they provide resiliency and the ability to support regional redundancy with clusters.
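When events do flow through an Event Hub into ADX, each message body is typically a JSON document that the table's ingestion mapping picks apart. A minimal sketch of preparing such a payload follows; the field names are illustrative, not a fixed schema, and the point is to stamp every record with its own event time rather than relying on ingestion time:

```python
import json
from datetime import datetime, timezone

def to_event_payload(records):
    """Serialise records as newline-delimited JSON for Event Hubs ingestion,
    stamping each with an explicit event-time field if one isn't present."""
    lines = []
    for record in records:
        body = dict(record)
        # Keep the caller's event time; only fall back to "now" when absent.
        body.setdefault("Timestamp", datetime.now(timezone.utc).isoformat())
        lines.append(json.dumps(body, sort_keys=True))
    return "\n".join(lines)
```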
In the early days of Azure, well before the arrival of Bicep, most engineers grappled with deployment automation. ARM templates were tough going and using PowerShell scripts seemed to be a useful alternative approach. We all learned that supporting a production environment based on hundreds of different functions that were continually changing with every version update was impossible. New Azure services required the newest versions of modules which broke existing deployment and support scripts. This scenario was a modern equivalent of "DLL hell".
One of the benefits of Azure Data Explorer is its web UI for users. Staff members don't need Infrastructure permissions to an Azure subscription; the portal URL is enough to run queries against any data in a database.
When opening Azure Data Explorer (Web UI) from a shared URI, staff members may be prompted with a dialog stating: "This link uses a service that isn't in your connection list. Do you trust the link source and want to add the service to your connections?"
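The shareable URL is just the Web UI address with the cluster and database in the path. A small helper to construct one, sketched here in Python; the path format follows the convention used by the public `help` cluster's `Samples` database:

```python
def adx_deep_link(cluster: str, database: str) -> str:
    """Build a shareable Azure Data Explorer Web UI link for a cluster/database."""
    return f"https://dataexplorer.azure.com/clusters/{cluster}/databases/{database}"
```

Anyone following the link authenticates with their own identity; data access is still governed by the database permissions granted to them.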
Azure Event Hubs are a core component of large-scale log collection for Security teams. They have a Kafka-compatible interface, which provides broad interoperability with other systems. Most of Microsoft's major systems natively support writing log data to Event Hubs.
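Because the endpoint speaks Kafka, a standard Kafka client can write to it; only the connection settings differ from a stock Kafka cluster. A sketch of those producer settings as a plain dictionary (the namespace name and connection string are placeholders you would supply):

```python
def event_hubs_kafka_config(namespace: str, connection_string: str) -> dict:
    """Kafka producer settings for an Event Hubs namespace:
    SASL PLAIN over TLS on port 9093, with the literal
    username '$ConnectionString' and the namespace connection
    string as the password."""
    return {
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
    }
```

The same dictionary can be passed to librdkafka-based clients such as `confluent-kafka`; other client libraries use the same values under slightly different key names.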
Microsoft Sentinel is inherently designed as a portal application, providing Security Operations staff with the ability to swiftly create new detections using the data sets they manage.
In my previous blog posts on Azure Data Explorer, I suggested the need to alter the Azure Monitor schema in ADX to include a Timestamp field.
Being able to accurately correlate logs between different systems based on event times is critical for Security Operations. Anyone in a SOC team will have learned that the standard TimeGenerated field used with Log Analytics doesn't represent the time of an event - it represents the time a message was received by Log Analytics.
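In practice that means correlating on the event's own timestamp field, not on when each platform happened to ingest the message. A toy illustration of matching records from two systems to within a tolerance; this is pure Python standing in for what a KQL join on the event-time column would do, and the tolerance value is an arbitrary assumption:

```python
from datetime import datetime, timedelta

def correlate(events_a, events_b, tolerance=timedelta(seconds=5)):
    """Pair events from two systems whose event times fall within `tolerance`.
    Inputs are lists of (event_time, payload) tuples using datetime values."""
    pairs = []
    for t_a, a in events_a:
        for t_b, b in events_b:
            # Compare event times, never ingestion times.
            if abs(t_a - t_b) <= tolerance:
                pairs.append((a, b))
    return pairs
```

Had the two feeds been joined on their received-at times instead, differing ingestion delays would silently break pairs like these.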