Migrating Log Analytics tables between workspaces

I've been working extensively with Azure Monitor table schemas recently. In preparation for the deprecation of the legacy API data ingestion method for Log Analytics workspaces, I needed a simple method to recreate custom log tables as Data Collection Rule (DCR)-based tables while also migrating them to a new workspace.
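As a rough sketch of the approach (the resource IDs, table name and api-version below are placeholders and assumptions rather than values from the original post), the Tables REST API can be used to read an existing table's schema and recreate it as a DCR-based custom table in the target workspace:

```powershell
# Minimal sketch: copy a custom table's schema from a source workspace and
# recreate it as a DCR-based custom table in a target workspace.
# Resource IDs, table name and api-version are assumptions. Requires Az.Accounts
# (Connect-AzAccount) with access to both workspaces.
$apiVersion  = '2022-10-01'
$sourceTable = '/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<source-ws>/tables/MyCustom_CL'
$targetTable = '/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<target-ws>/tables/MyCustom_CL'

# Read the existing table definition (schema name and columns)
$source = (Invoke-AzRestMethod -Method GET -Path "${sourceTable}?api-version=$apiVersion").Content |
    ConvertFrom-Json

# Rebuild just the schema portion for the PUT body.
# NOTE: a custom table's schema needs a TimeGenerated (datetime) column; add it
# here if the source lists it under standardColumns instead.
$body = @{
    properties = @{
        schema = @{
            name    = $source.properties.schema.name
            columns = @($source.properties.schema.columns | ForEach-Object {
                @{ name = $_.name; type = $_.type }
            })
        }
    }
} | ConvertTo-Json -Depth 10

# Create the table in the target workspace as a DCR-based custom table
Invoke-AzRestMethod -Method PUT -Path "${targetTable}?api-version=$apiVersion" -Payload $body
```

From there a bulk migration is just a loop over the source workspace's custom tables.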


Configuring AWS Token Exchange for an Azure Managed Identity

Azure Function Apps can leverage OpenID Connect to exchange Microsoft-issued tokens for AWS tokens. By utilizing the AWS Security Token Service (AWS STS), you can eliminate the need to manage expiring keys, streamlining automation solutions that interact with AWS.

In this blog, we’ll walk through the process of integrating AWS STS with an Azure App Registration, an Enterprise Application, and a user-assigned managed identity to enable Function App automation.
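As a rough outline of what that exchange looks like inside the Function App (the audience URI, client ID, role ARN and session name below are placeholders, and the exact values depend on how the IAM OIDC identity provider is configured), the flow is: request a token for the managed identity from the app's local identity endpoint, then hand that token to AWS STS:

```powershell
# Minimal sketch of the token exchange inside an Azure Function App.
# The audience URI, client ID and role ARN are placeholders (assumptions).

# 1. Get a Microsoft Entra token for the user-assigned managed identity from the
#    Function App's local identity endpoint.
$audience = 'api://<app-registration-client-id>'        # must match the audience on the IAM OIDC provider
$clientId = '<user-assigned-managed-identity-client-id>'
$uri = "$($env:IDENTITY_ENDPOINT)?resource=$audience&client_id=$clientId&api-version=2019-08-01"
$msiToken = (Invoke-RestMethod -Uri $uri -Headers @{ 'X-IDENTITY-HEADER' = $env:IDENTITY_HEADER }).access_token

# 2. Exchange it with AWS STS for temporary AWS credentials.
$stsParams = @{
    Action           = 'AssumeRoleWithWebIdentity'
    Version          = '2011-06-15'
    RoleArn          = 'arn:aws:iam::<account-id>:role/<role-name>'
    RoleSessionName  = 'azure-function-app'
    WebIdentityToken = $msiToken
}
[xml]$stsResponse = Invoke-RestMethod -Uri 'https://sts.amazonaws.com/' -Method Post -Body $stsParams

# Temporary credentials for subsequent AWS API calls
$creds = $stsResponse.AssumeRoleWithWebIdentityResponse.AssumeRoleWithWebIdentityResult.Credentials
$creds.AccessKeyId
$creds.SecretAccessKey
$creds.SessionToken
```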


Writeable Sentinel tables (Update - now includes DCRs)

Microsoft's Azure Monitor Agent allows events to be written directly to certain Sentinel tables. In a previous blog piece, 'Writing data to Sentinel's tables with REST and Data Collection Rules', I described how to write custom Data Collection Rules that allow Log Analytics / Sentinel tables to be written to using REST.
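As a quick refresher on the shape of that call (the endpoint, DCR immutable ID, stream name and api-version here are placeholders rather than values from that post), a write through a DCR boils down to a single authenticated POST against the data collection endpoint:

```powershell
# Minimal sketch of a write through a DCR with the Logs Ingestion API.
# Endpoint, DCR immutable ID, stream name and api-version are placeholders
# (assumptions). Requires Az.Accounts; note that recent Az releases may return
# the token as a SecureString.
$dceUri     = 'https://<dce-name>.<region>.ingest.monitor.azure.com'
$dcrId      = 'dcr-<immutable-id>'
$streamName = 'Custom-MyTable_CL'

$token = (Get-AzAccessToken -ResourceUrl 'https://monitor.azure.com').Token

$records = ConvertTo-Json -Depth 5 -InputObject @(
    @{ TimeGenerated = (Get-Date).ToUniversalTime().ToString('o'); Message = 'hello from REST' }
)

Invoke-RestMethod -Method Post `
    -Uri "$dceUri/dataCollectionRules/$dcrId/streams/${streamName}?api-version=2023-01-01" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' -Body $records
```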

Deriving the Log Analytics table schema

A frustration in dealing with table schemas for Log Analytics and Azure Data Explorer is the inaccuracies in Microsoft's published documentation for the schemas. Another surprise is that the schema returned by using a GET against the workspace (https://learn.microsoft.com/en-us/rest/api/loganalytics/schema/get?view=rest-loganalytics-2023-09-01&tabs=HTTP) has inaccuracies too!
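One way to cross-check what a table really contains, rather than what the documentation or the schema endpoint claims, is to ask the table itself with KQL's getschema operator. A minimal sketch, assuming a workspace GUID and table name of your own:

```powershell
# Minimal sketch: derive a table's real column names and types from the data
# itself using the 'getschema' operator via the Log Analytics query API.
# Workspace ID and table name are placeholders (assumptions). Requires Az.Accounts.
$workspaceId = '<workspace-guid>'
$query       = 'MyCustom_CL | getschema | project ColumnName, ColumnType'

$token = (Get-AzAccessToken -ResourceUrl 'https://api.loganalytics.io').Token

$result = Invoke-RestMethod -Method Post `
    -Uri "https://api.loganalytics.io/v1/workspaces/$workspaceId/query" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body (@{ query = $query } | ConvertTo-Json)

# Rows come back as arrays, ordered by the column metadata in $result.tables[0].columns
$result.tables[0].rows | ForEach-Object { "{0}`t{1}" -f $_[0], $_[1] }
```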

Querying ADX with PowerShell and REST

The following script provides an example of using PowerShell to directly query Azure Data Explorer records using REST.
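In essence the call is a client-credentials token request followed by a POST to the cluster's /v2/rest/query endpoint. A minimal sketch along those lines (tenant, app credentials, cluster URI, database and query are placeholders):

```powershell
# Minimal sketch: query Azure Data Explorer over REST with a client-credentials
# token. Tenant, app ID/secret, cluster URI and database are placeholders (assumptions).
$clusterUri = 'https://<cluster>.<region>.kusto.windows.net'
$database   = '<database>'
$query      = 'MyTable | take 10'

# Client-credentials token for the cluster (audience = cluster URI)
$tokenResp = Invoke-RestMethod -Method Post `
    -Uri 'https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token' `
    -Body @{
        client_id     = '<app-id>'
        client_secret = '<app-secret>'
        scope         = "$clusterUri/.default"
        grant_type    = 'client_credentials'
    }

$body = @{ db = $database; csl = $query } | ConvertTo-Json

$result = Invoke-RestMethod -Method Post -Uri "$clusterUri/v2/rest/query" `
    -Headers @{ Authorization = "Bearer $($tokenResp.access_token)" } `
    -ContentType 'application/json' -Body $body

# The v2 response is an array of frames; the query results sit in the PrimaryResult table
$result | Where-Object { $_.FrameType -eq 'DataTable' -and $_.TableKind -eq 'PrimaryResult' } |
    Select-Object -ExpandProperty Rows
```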

Prerequisites

You will also need to ensure that the application you use is granted the Database Viewer role. This role can be added through the portal on the database Overview -> Permissions -> Add. Note that ADX does not support table-level viewer permissions.
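If you would rather script that role assignment than click through the portal, the same grant can be made with a database-level control command against the management endpoint. A minimal sketch (cluster URI, database, app ID and tenant ID are placeholders):

```powershell
# Minimal sketch: grant an App Registration the Database Viewer role with a
# Kusto control command instead of the portal. Cluster URI, database, app ID and
# tenant ID are placeholders (assumptions). Requires Az.Accounts for the token.
$clusterUri = 'https://<cluster>.<region>.kusto.windows.net'
$database   = '<database>'
$command    = ".add database ['$database'] viewers ('aadapp=<app-id>;<tenant-id>')"

$token = (Get-AzAccessToken -ResourceUrl $clusterUri).Token

Invoke-RestMethod -Method Post -Uri "$clusterUri/v1/rest/mgmt" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body (@{ db = $database; csl = $command } | ConvertTo-Json)
```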


PowerShell - Writing data directly to Azure Data Explorer with REST

The following script provides an example of directly writing to an Azure Data Explorer table using PowerShell. For resiliency, the preferred method of ADX data ingestion remains a data connection against an Event Hub (Kafka), but there are situations where ad-hoc writing is needed.
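In essence the write is a POST of the records to the cluster's streaming ingest endpoint. A minimal sketch along those lines, assuming the prerequisites below are in place (cluster URI, database and table are placeholders):

```powershell
# Minimal sketch: push a small batch of JSON records straight into an ADX table
# using the streaming ingest REST endpoint. Cluster URI, database and table are
# placeholders (assumptions). Requires Az.Accounts for the token.
$clusterUri = 'https://<cluster>.<region>.kusto.windows.net'
$database   = '<database>'
$table      = 'MyTable'

$token = (Get-AzAccessToken -ResourceUrl $clusterUri).Token

# One JSON object per line for streamFormat=json
$records = @(
    @{ Timestamp = (Get-Date).ToUniversalTime().ToString('o'); Message = 'ad-hoc write' }
) | ForEach-Object { $_ | ConvertTo-Json -Compress }
$payload = $records -join "`n"

Invoke-RestMethod -Method Post `
    -Uri "$clusterUri/v1/rest/ingest/$database/${table}?streamFormat=json" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' -Body $payload
```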


Prerequisites

Streaming ingestion must be enabled at the cluster level in ADX. This can be done through the portal under Settings -> Configurations.
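In addition to the cluster-level setting, the target table (or its database) typically also needs a streaming ingestion policy before streamed writes are accepted. A minimal sketch of setting it with a control command (cluster URI, database and table are placeholders):

```powershell
# Minimal sketch: enable the streaming ingestion policy on a table via the
# management endpoint. Cluster URI, database and table are placeholders (assumptions).
$clusterUri = 'https://<cluster>.<region>.kusto.windows.net'
$database   = '<database>'
$command    = '.alter table MyTable policy streamingingestion enable'

$token = (Get-AzAccessToken -ResourceUrl $clusterUri).Token

Invoke-RestMethod -Method Post -Uri "$clusterUri/v1/rest/mgmt" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body (@{ db = $database; csl = $command } | ConvertTo-Json)
```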

Fluent-bit on Windows - forwarding data to Event Hubs (Kafka) and ADX

Fluent-bit natively supports forwarding data to Event Hubs through its Kafka output, which is built into the Linux packages. On Windows, this module was left out of the standard package simply because no testing against the Apache Kafka redistributable had occurred.
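Once a Windows build with the Kafka output is available, pointing it at an Event Hub is largely a matter of the rdkafka SASL settings. A sketch of the output section, with the namespace, hub name and connection string as placeholders:

```
[OUTPUT]
    name                        kafka
    match                       *
    brokers                     <namespace>.servicebus.windows.net:9093
    topics                      <event-hub-name>
    rdkafka.security.protocol   SASL_SSL
    rdkafka.sasl.mechanism      PLAIN
    rdkafka.sasl.username       $ConnectionString
    rdkafka.sasl.password       Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>
```

Event Hubs exposes its Kafka endpoint on port 9093 with SASL PLAIN, using the literal username $ConnectionString and the namespace connection string as the password.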

Using Fluent-bit with Windows and ADX provides a cost-effective way of harvesting large amounts of security data from monitored systems.

Official installers for the latest fluent-bit package are available for download here:

Sentinel / Azure Monitor Query Packs

Log Analytics Query Packs allow commonly used queries to be saved and made accessible within Sentinel.

When a staff member with contributor permissions saves their first query from within Log Analytics to the default query pack, the default resource group and default query pack are created for the subscription.
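Query packs are ordinary ARM resources, so they can also be enumerated across a subscription with a single REST call. A minimal sketch (the api-version is an assumption):

```powershell
# Minimal sketch: list all query packs in a subscription, including the default
# one created on first save. The api-version is an assumption. Requires Az.Accounts.
$subscriptionId = '<subscription-id>'

$resp = Invoke-AzRestMethod -Method GET `
    -Path "/subscriptions/$subscriptionId/providers/Microsoft.OperationalInsights/queryPacks?api-version=2019-09-01"

($resp.Content | ConvertFrom-Json).value | Select-Object name, id
```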