Rethinking the role of Azure PowerShell Modules

In the early days of Azure, well before the arrival of Bicep, most engineers grappled with deployment automation.  ARM templates were tough going, and PowerShell scripts seemed a useful alternative approach.  We all learned that supporting a production environment built on hundreds of different functions that changed with every version update was unsustainable.  New Azure services required the newest module versions, which broke existing deployment and support scripts.  This scenario was a modern equivalent of "DLL hell".


One of the benefits of Azure Data Explorer is its web UI for users.  Staff members don't need infrastructure permissions to an Azure subscription; the portal URL is enough to run queries against the data in a database.

When opening Azure Data Explorer (Web UI) from a shared URI, staff members may be prompted with a dialog stating:  "This link uses a service that isn't in your connection list.  Do you trust the link source and want to add the service to your connections?"

The problem of TimeGenerated... and an altered Azure Monitor Schema

In my previous blog posts about Azure Data Explorer, I suggested altering the Azure Monitor schema in ADX to include a Timestamp field.

Being able to accurately correlate logs between different systems based on event times is critical for Security Operations.  Anyone in a SOC team will have learned that the standard TimeGenerated field used with Log Analytics doesn't represent the time of an event - it represents the time a message was received by Log Analytics.  
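To make the distinction concrete, here's a minimal KQL sketch.  It assumes an ADX table mirroring the Syslog schema that has been extended with the Timestamp column proposed above (Timestamp carrying the original event time from the source, while TimeGenerated records when Log Analytics received the message):

```kusto
// Compare the original event time against the receipt time.
// Timestamp is the hypothetical added event-time column; TimeGenerated,
// Computer and SyslogMessage are standard Syslog schema columns.
Syslog
| extend ReceiptDelay = TimeGenerated - Timestamp
| where ReceiptDelay > 5m        // surface events that arrived noticeably late
| project Timestamp, TimeGenerated, ReceiptDelay, Computer, SyslogMessage
```

A delayed agent or a network outage can make ReceiptDelay run to hours, which is exactly why correlating on TimeGenerated alone can mislead an investigation.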

ADX's Role in Large-Scale Data Retention for Security

Two distinct functions need to be addressed for Security Operations teams when we talk about logging and alerting.

At this point in time, Microsoft Sentinel is undisputedly the preferred security alert management product.  It provides broad capabilities for correlating real-time alerts and integrating alerting with SOAR tooling.  There are many positive things to call out about Sentinel as a SIEM.  The other aspect of modern security is hunting, and that was never going to be a core capability of Sentinel.

Adding data streams to Azure Data Explorer

The primary method for adding new data streams to Azure Data Explorer (ADX) is through Event Hubs or Kafka topics.  We can write directly to ADX tables if needed, but Event Hubs provide scalable, resilient message ingestion that ensures messages aren't lost if, for any reason, our ADX cluster is offline or unable to process them.
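The landing side of that pipeline is typically a raw table plus a JSON ingestion mapping that the Event Hub data connection references.  A minimal sketch, with illustrative names (RawSecurityEvents, RawMapping are assumptions, not names from this post):

```kusto
// 1. A raw table that accepts the unparsed JSON payload as a single dynamic column.
.create table RawSecurityEvents (Records: dynamic)

// 2. A JSON ingestion mapping so the Event Hub data connection knows how to
//    place each message body into the Records column.
.create table RawSecurityEvents ingestion json mapping 'RawMapping'
    '[{"column":"Records","path":"$","datatype":"dynamic"}]'
```

Keeping the raw payload intact in a dynamic column means a malformed message never blocks ingestion; parsing failures surface later, at query or transformation time.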

The diagram below represents the components used in receiving raw data from an Event Hub and transforming that data into a structured form that can be used by our Technology teams.
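The transformation step in that diagram is usually implemented as a parsing function driven by an update policy, so records are reshaped automatically as they land.  A sketch under the same assumed names (RawSecurityEvents as the raw table; SecurityEvents, ExpandSecurityEvents and the payload fields are illustrative):

```kusto
// A function that expands the raw payload into typed columns.
.create function ExpandSecurityEvents() {
    RawSecurityEvents
    | mv-expand Records
    | project
        Timestamp = todatetime(Records.timestamp),
        Computer  = tostring(Records.computer),
        EventData = Records
}

// The structured table our Technology teams query.
.create table SecurityEvents (Timestamp: datetime, Computer: string, EventData: dynamic)

// Run the function automatically whenever new records land in the raw table.
.alter table SecurityEvents policy update
    '[{"IsEnabled": true, "Source": "RawSecurityEvents", "Query": "ExpandSecurityEvents()", "IsTransactional": false}]'
```

With IsTransactional set to false, a failure in the transformation doesn't fail the raw ingestion, which suits a pipeline where the raw table is the system of record.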

Migrating Log Analytics tables between workspaces

I've been working extensively with Azure Monitor table schemas recently. In preparation for the deprecation of the legacy API data ingestion method for Log Analytics workspaces, I needed a simple method to recreate custom log tables as Data Collection Rule (DCR)-based tables while also migrating them to a new workspace.


Configuring AWS Token Exchange for an Azure Managed Identity

Azure Function Apps can leverage OpenID Connect to exchange Microsoft-issued tokens for AWS tokens. By utilizing the AWS Security Token Service (AWS STS), you can eliminate the need to manage expiring keys, streamlining automation solutions that interact with AWS.

In this blog, we’ll walk through the process of integrating AWS STS with an Azure App Registration, an Enterprise Application, and a User-Defined Managed Identity to enable Function App automation.
