
Deploying OCSF to Microsoft Sentinel: A Step-by-Step Implementation Guide (Part 2 of 2)


In Part 1, we covered what OCSF is and why it solves critical problems for Microsoft Sentinel deployments. This post walks through the technical implementation: creating custom tables for OCSF event classes, configuring Data Collection Rules (DCR) to transform logs at ingestion, and building analytics rules that query normalized data across multiple vendors.


Prerequisites

  • Completed Part 1 of this series

  • Microsoft Sentinel workspace deployed

  • Contributor or Sentinel Contributor role on the workspace

  • At least one data source emitting authentication or network logs (e.g., Azure AD, Okta, Palo Alto)

  • Azure CLI or Azure PowerShell installed

  • Familiarity with KQL and ARM templates


Step 1 – Create Custom Tables for OCSF Event Classes

Sentinel doesn't include native OCSF tables, so you'll create custom tables aligned with the OCSF schema. We'll start with the Authentication (3002) event class.

Log in to Azure Portal and navigate to your Sentinel workspace. Go to Settings > Workspace settings > Tables and select Create > New custom log (DCR-based).

Define the table schema for OCSF_Authentication_CL with a core set of OCSF Authentication fields:

{
  "properties": {
    "schema": {
      "name": "OCSF_Authentication_CL",
      "columns": [
        {"name": "TimeGenerated", "type": "datetime"},
        {"name": "class_uid", "type": "int"},
        {"name": "activity_id", "type": "int"},
        {"name": "severity_id", "type": "int"},
        {"name": "status_id", "type": "int"},
        {"name": "time", "type": "datetime"},
        {"name": "actor_user_name", "type": "string"},
        {"name": "actor_user_uid", "type": "string"},
        {"name": "src_endpoint_ip", "type": "string"},
        {"name": "dst_endpoint_ip", "type": "string"},
        {"name": "auth_protocol", "type": "string"},
        {"name": "logon_type", "type": "string"},
        {"name": "is_mfa", "type": "bool"},
        {"name": "message", "type": "string"},
        {"name": "metadata_product_vendor", "type": "string"},
        {"name": "metadata_product_name", "type": "string"}
      ]
    }
  }
}

Alternatively, create the table with the Azure CLI, passing the columns as name=type pairs:

az monitor log-analytics workspace table create \
  --resource-group <ResourceGroup> \
  --workspace-name <SentinelWorkspace> \
  --name OCSF_Authentication_CL \
  --columns TimeGenerated=datetime class_uid=int activity_id=int severity_id=int \
            status_id=int time=datetime actor_user_name=string actor_user_uid=string \
            src_endpoint_ip=string dst_endpoint_ip=string auth_protocol=string \
            logon_type=string is_mfa=boolean message=string \
            metadata_product_vendor=string metadata_product_name=string

Repeat this process for other event classes you need, such as OCSF_NetworkActivity_CL (4001) or OCSF_ProcessActivity_CL (1007).


Step 2 – Configure a Data Collection Endpoint

Create a Data Collection Endpoint (DCE) to receive logs from your sources.

az monitor data-collection endpoint create \
  --name "ocsf-dce-prod" \
  --resource-group <ResourceGroup> \
  --location eastus \
  --public-network-access Enabled

Note the Logs Ingestion URI from the output—you'll use this in your data source configuration.


Step 3 – Build a Data Collection Rule with KQL Transformation

Create a Data Collection Rule (DCR) that transforms incoming logs to OCSF schema. This example maps Azure AD Sign-In Logs to OCSF Authentication format.

Create a file named ocsf-auth-dcr.json:

{
  "location": "eastus",
  "properties": {
    "dataCollectionEndpointId": "/subscriptions/<SubscriptionID>/resourceGroups/<ResourceGroup>/providers/Microsoft.Insights/dataCollectionEndpoints/ocsf-dce-prod",
    "streamDeclarations": {
      "Custom-OCSF-Auth": {
        "columns": [
          {"name": "TimeGenerated", "type": "datetime"},
          {"name": "UserPrincipalName", "type": "string"},
          {"name": "IPAddress", "type": "string"},
          {"name": "ResultType", "type": "string"},
          {"name": "AuthenticationProtocol", "type": "string"}
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/<SubscriptionID>/resourceGroups/<ResourceGroup>/providers/Microsoft.OperationalInsights/workspaces/<SentinelWorkspace>",
          "name": "SentinelWorkspace"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": ["Custom-OCSF-Auth"],
        "destinations": ["SentinelWorkspace"],
        "transformKql": "source | extend class_uid = 3002 | extend activity_id = case(ResultType == '0', 1, 2) | extend severity_id = case(ResultType == '0', 1, 3) | extend status_id = case(ResultType == '0', 1, 2) | extend time = TimeGenerated | extend actor_user_name = UserPrincipalName | extend src_endpoint_ip = IPAddress | extend auth_protocol = AuthenticationProtocol | extend metadata_product_vendor = 'Microsoft' | extend metadata_product_name = 'Azure AD' | project-away UserPrincipalName, IPAddress, ResultType, AuthenticationProtocol",
        "outputStream": "Custom-OCSF_Authentication_CL"
      }
    ]
  }
}

Deploy the DCR:

az monitor data-collection rule create \
  --name "ocsf-auth-dcr" \
  --resource-group <ResourceGroup> \
  --location eastus \
  --rule-file ocsf-auth-dcr.json

This transformation maps Azure AD fields to OCSF attributes (a query to preview the mapping follows the list):

  • ResultType == '0' (success) → status_id = 1

  • UserPrincipalName → actor_user_name

  • IPAddress → src_endpoint_ip
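
Before deploying the DCR, you can sanity-check this mapping in the Log Analytics query editor. The sketch below assumes Azure AD sign-in data already flows to the built-in SigninLogs table, which stands in for the DCR input stream here:

// Preview of the transform logic against the built-in SigninLogs table
SigninLogs
| take 10
| extend class_uid = 3002
| extend activity_id = 1                             // OCSF Authentication: 1 = Logon
| extend status_id = iff(ResultType == "0", 1, 2)    // 1 = Success, 2 = Failure
| extend severity_id = iff(ResultType == "0", 1, 3)
| extend actor_user_name = UserPrincipalName
| extend src_endpoint_ip = IPAddress
| project TimeGenerated, class_uid, activity_id, status_id, severity_id, actor_user_name, src_endpoint_ip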


Step 4 – Associate the DCR with Your Data Source

For Azure AD Sign-In Logs, configure the diagnostic settings to send logs to the DCR.

Navigate to Azure AD > Monitoring > Diagnostic settings > Add diagnostic setting.

Select SignInLogs, choose Send to Log Analytics workspace, and select the DCR you created.

For third-party sources (Okta, CrowdStrike), configure them to send logs to the DCE Logs Ingestion URI using their native integration or a Logstash pipeline with the OCSF output plugin.


Step 5 – Validate OCSF Data Ingestion

Wait 5–10 minutes for data to populate, then query the custom table:

OCSF_Authentication_CL
| take 100
| project time, class_uid, actor_user_name, src_endpoint_ip, status_id, metadata_product_name

Verify that:

  • class_uid is 3002

  • Field names match OCSF specification

  • Multiple vendors populate the same table with a consistent schema (see the summary query below)
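
A quick way to confirm the last point is to break recent events down by vendor (a sketch using the Step 1 column names):

OCSF_Authentication_CL
| where TimeGenerated >= ago(1h)
| summarize Events = count(), DistinctUsers = dcount(actor_user_name)
    by metadata_product_vendor, metadata_product_name, class_uid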


Step 6 – Build a Unified Analytics Rule

Create a Scheduled Query Rule that detects brute-force attacks across all authentication sources.

OCSF_Authentication_CL
| where time >= ago(10m)
| where status_id == 2  // Failed authentication
| summarize FailureCount = count(), 
            FirstFailure = min(time), 
            LastFailure = max(time), 
            SourceIPs = make_set(src_endpoint_ip)
    by actor_user_name, metadata_product_name
| where FailureCount >= 10
| project actor_user_name, FailureCount, FirstFailure, LastFailure, SourceIPs, metadata_product_name

This rule works identically whether the logs come from Azure AD, Okta, AWS IAM, or any OCSF-compliant source.

Set Severity to Medium, Frequency to 10 minutes, and configure Incident creation.
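
Because the schema is vendor-neutral, the same table also supports detections that no single-source rule can express. As a sketch (the threshold and window are illustrative), this variant flags accounts failing authentication across more than one product:

OCSF_Authentication_CL
| where time >= ago(10m)
| where status_id == 2
| summarize FailureCount = count(),
            Products = make_set(metadata_product_name),
            SourceIPs = make_set(src_endpoint_ip)
    by actor_user_name
| where FailureCount >= 10 and array_length(Products) > 1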


Step 7 – Create OCSF-to-ASIM Mapping Functions (Optional)

If you have existing analytics that use ASIM, create a KQL function that maps OCSF to ASIM schema:

let OCSFtoASIM_Authentication = () {
    OCSF_Authentication_CL
    | extend EventType = "Logon"  // ASIM conveys failure via EventResult, not EventType
    | extend TargetUsername = actor_user_name
    | extend SrcIpAddr = src_endpoint_ip
    | extend EventResult = case(status_id == 1, "Success", "Failure")
    | extend EventProduct = metadata_product_name
    | project-rename EventStartTime = time
};
OCSFtoASIM_Authentication()

Save the body of this function as a Workspace Function named imAuthentication_OCSF so ASIM-based analytics can query OCSF tables.
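
Once saved, the function is queried like any other table. A minimal usage sketch:

imAuthentication_OCSF
| where EventResult == "Failure"
| summarize Failures = count() by TargetUsername, SrcIpAddr
| top 20 by Failures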


Troubleshooting

Logs not appearing in the custom table
Verify the DCR is associated with the data source and check DCE ingestion metrics in Azure Monitor. Ensure the transformation KQL is valid using the Log Analytics query editor.
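
To distinguish "nothing is arriving" from "rows are being dropped by the transform", check whether any ingestion volume is recorded for the table. A sketch using the standard Usage table:

Usage
| where TimeGenerated >= ago(24h)
| where DataType == "OCSF_Authentication_CL"
| summarize IngestedMB = sum(Quantity) by bin(TimeGenerated, 1h)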

Schema mismatch errors
Confirm that your transformation output exactly matches the custom table schema. Use getschema to compare:

OCSF_Authentication_CL | getschema

Performance degradation with large transformations
Avoid complex joins or lookups in DCR transformations. Perform enrichment in scheduled analytics instead, or use Watchlists for static mappings.
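
If you do need static enrichment, a Sentinel Watchlist joined at query time keeps the DCR lean. In this sketch the watchlist name ProductTiers and its ProductTier column are hypothetical:

OCSF_Authentication_CL
| lookup kind=leftouter (
    _GetWatchlist('ProductTiers')    // hypothetical watchlist mapping product names to tiers
    | project SearchKey, ProductTier
  ) on $left.metadata_product_name == $right.SearchKey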

Missing OCSF attributes
Not all vendors populate every OCSF field. Use extend with default values:

| extend is_mfa = coalesce(is_mfa, false)

Considerations

Restrict DCE Access — Configure Private Link for the Data Collection Endpoint to prevent public internet access.

Preserve OCSF Logs for Compliance — Log Analytics tables are append-only; configure table-level retention and archive policies so OCSF logs are kept for your required compliance period.

Validate Schema on Ingestion — Use DCR transformations to filter malformed events that don't meet OCSF requirements (a sketch follows this list).

Tag OCSF Tables — Apply Azure Tags to OCSF resources for cost tracking and governance.

Monitor Transformation Failures — Create alerts for DCR ingestion errors using Azure Monitor Metrics.
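
For the schema-validation point above, the filter can sit at the top of the DCR's transformKql, before the extend and project steps. A minimal sketch for the Step 3 transform:

// Drop rows missing the fields the OCSF Authentication mapping depends on
source
| where isnotempty(UserPrincipalName) and isnotempty(IPAddress) and isnotempty(ResultType)

Keeping this filter in the DCR means malformed events never reach the table, so the Step 5 validation and the Step 6 analytics only ever see schema-conformant rows.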
