End of Life Notice: For Trend Cloud One™ - Conformity customers, Conformity will reach its End of Sale on July 31, 2025 and its End of Life on July 31, 2026. The same capabilities, and much more, are available in Trend Vision One™ Cloud Risk Management. For details, please refer to Upgrade to Trend Vision One.

Configure Diagnostic Log Delivery for Azure Databricks

Trend Vision One™ provides continuous assurance that gives peace of mind for your cloud infrastructure, delivering over 1100 automated best practice checks.

Risk Level: High (not acceptable risk)

Ensure that diagnostic log delivery is configured for your Azure Databricks workspaces. Azure Databricks diagnostic logging provides insights into system operations, user activities, and security events within a Databricks workspace.

Security
Operational excellence

Enabling diagnostic logs helps organizations detect security threats by logging access, job executions, and cluster activities; ensure compliance with industry regulations; and monitor operational performance to troubleshoot issues proactively. Without diagnostic logging, organizations lack visibility into security and operational activity within Databricks workspaces, making it difficult to maintain an audit trail for forensic investigations and to meet regulatory compliance standards. Organizations need comprehensive logging to detect unauthorized access attempts, track job executions, monitor cluster state changes, and understand user account activity. The logs must be securely stored in approved locations: an Azure Log Analytics workspace for analysis and querying, an Azure Storage account for long-term retention, or Azure Event Hubs for integration with SIEM tools.
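Once delivered to a Log Analytics workspace, these logs can be queried from the command line as well as from the portal. As an illustrative sketch only (the workspace GUID below is a placeholder, and DatabricksAccounts is the table populated by the accounts log category):

```shell
# Sketch: query recent Databricks account events from a Log Analytics
# workspace (requires the log-analytics CLI extension and an
# authenticated session; the workspace GUID is a placeholder)
az monitor log-analytics query \
	--workspace "abcdabcd-1234-abcd-1234-abcdabcdabcd" \
	--analytics-query "DatabricksAccounts | where TimeGenerated > ago(1d) | take 20"
```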

To enable diagnostic logging features, your Microsoft Azure Databricks workspaces must be on the Premium pricing tier. Logs consume storage and may require additional monitoring tools, leading to increased operational overhead and costs. Incomplete log configurations may result in missing critical events, reducing monitoring effectiveness. Organizations should carefully plan log retention policies and monitor storage costs associated with diagnostic logging.
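Because the Premium tier is a prerequisite, it can be worth confirming a workspace's pricing tier before attempting to configure diagnostic settings. A minimal sketch (the resource group and workspace names below are the example names used elsewhere on this page):

```shell
# Check the pricing tier of a Databricks workspace; diagnostic settings
# require the "premium" SKU (requires the databricks CLI extension)
az databricks workspace show \
	--resource-group cloud-shell-storage-westeurope \
	--name cc-project9-data-workspace \
	--query sku.name \
	--output tsv
```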


Audit

To determine if diagnostic logging is configured for your Azure Databricks workspaces, perform the following operations:

Using Azure Console

01 Sign in to the Microsoft Azure Portal.

02 Navigate to Azure Databricks blade available at https://portal.azure.com/#browse/Microsoft.Databricks%2Fworkspaces.

03 Choose the Azure subscription that you want to access from the Subscription equals all filter box and choose Apply.

04 Click on the name (link) of the Azure Databricks workspace that you want to examine.

05 In the left-hand menu, under Monitoring, select Diagnostic settings to access the diagnostic logging configuration for the selected workspace.

06 Check whether at least one diagnostic setting is configured. If the diagnostic settings list is empty, diagnostic logging is not enabled for the selected Azure Databricks workspace.

07 If a diagnostic setting exists, click Edit setting next to the setting name to review its configuration and verify the following:

  1. Under Logs, ensure that logging is enabled for the following categories:
    • accounts: User account activities
    • dbfs: Databricks File System (DBFS) operations
    • clusters: Cluster state changes and errors
    • notebook: Notebook execution events
    • jobs: Job execution tracking
  2. Under Destination details, verify that logs are being sent to one or more of the following approved destinations:
    • Send to Log Analytics workspace: For analysis and querying
    • Archive to a storage account: For long-term retention
    • Stream to an event hub: For integration with SIEM tools

08 Repeat steps no. 4 - 7 for each Azure Databricks workspace available within the selected subscription.

09 Repeat steps no. 3 - 8 for each Azure subscription created in your Microsoft Azure cloud account.

Using Azure CLI

01 Run account list command (Windows/macOS/Linux) with custom output filters to list the IDs of the cloud subscriptions available in your Azure cloud account:

az account list \
	--query '[].{id:id, name:name}'

02 The command output should return the requested subscription identifiers (IDs) and names:

[
	{
		"id": "abcdabcd-1234-abcd-1234-abcdabcdabcd",
		"name": "Production Subscription"
	},
	{
		"id": "abcd1234-abcd-1234-abcd-abcd1234abcd",
		"name": "Development Subscription"
	}
]

03 Run account set command (Windows/macOS/Linux) with the ID of the Azure cloud subscription that you want to examine as the identifier parameter to set the selected subscription to be the current active subscription (the command does not produce an output):

az account set \
	--subscription abcdabcd-1234-abcd-1234-abcdabcdabcd

04 Run databricks workspace list command (Windows/macOS/Linux) with custom output filters to list the resource ID of each Azure Databricks workspace available in the selected Azure subscription:

az databricks workspace list \
	--query '[*].id'

05 The command output should return the requested Databricks workspace resource IDs:

[
	"/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace",
	"/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-prod-databricks-workspace"
]

06 Run monitor diagnostic-settings list command (Windows/macOS/Linux) with the resource ID of the Azure Databricks workspace that you want to examine as the identifier parameter to check if diagnostic logging is enabled:

az monitor diagnostic-settings list \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace"

07 The command output should return the diagnostic settings configuration for the selected workspace.

If no diagnostic settings exist:

[]

If the monitor diagnostic-settings list command output returns an empty array ([]), as shown above, no diagnostic settings are configured for the selected Azure Databricks workspace.

If diagnostic settings exist:
[
	{
		"id": "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourcegroups/cloud-shell-storage-westeurope/providers/microsoft.databricks/workspaces/cc-project9-data-workspace/providers/microsoft.insights/diagnosticSettings/DatabricksLogging",
		"logs": [
			{
				"category": "accounts",
				"enabled": true
			},
			{
				"category": "clusters",
				"enabled": true
			},
			{
				"category": "notebook",
				"enabled": true
			},
			{
				"category": "jobs",
				"enabled": true
			},
			{
				"category": "dbfs",
				"enabled": true
			}
		],
		"name": "DatabricksLogging",
		"resourceGroup": "cloud-shell-storage-westeurope",
		"storageAccountId": "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Storage/storageAccounts/cc-databricks-logs-storage",
		"workspaceId": "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.OperationalInsights/workspaces/cc-log-analytics-workspace"
	}
]

Review the output to verify that the required log categories (accounts, clusters, notebook, jobs, dbfs) have "enabled": true and that logs are sent to approved destinations (storageAccountId and/or workspaceId).
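To surface just the enabled log categories without reading through the full JSON, a JMESPath filter can be applied to the same command. A sketch, reusing the example resource ID from step no. 6:

```shell
# List only the log categories with "enabled": true across the
# workspace's diagnostic settings (resource ID is the example from step 6)
az monitor diagnostic-settings list \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace" \
	--query "[].logs[?enabled].category" \
	--output json
```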

08 Repeat steps no. 6 - 7 for each Azure Databricks workspace available in the selected Azure subscription.

09 Repeat steps no. 3 - 8 for each Azure subscription created in your Microsoft Azure cloud account.
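The audit steps above can also be sketched as a single loop over every workspace in the active subscription, flagging workspaces whose diagnostic settings list is empty. This is a convenience sketch, not part of the official procedure; it assumes the databricks CLI extension is installed and a session is authenticated:

```shell
# Flag Databricks workspaces with no diagnostic settings configured
# in the currently active subscription
for id in $(az databricks workspace list --query "[].id" --output tsv); do
	count=$(az monitor diagnostic-settings list --resource "$id" --query "length(@)" --output tsv)
	if [ "$count" -eq 0 ]; then
		echo "NON-COMPLIANT: $id"
	else
		echo "OK ($count setting(s)): $id"
	fi
done
```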

Remediation / Resolution

To enable diagnostic logging for your Azure Databricks workspaces, perform the following operations:

Using Azure Console

01 Sign in to the Microsoft Azure Portal.

02 Navigate to Azure Databricks blade available at https://portal.azure.com/#browse/Microsoft.Databricks%2Fworkspaces.

03 Choose the Azure subscription that you want to access from the Subscription equals all filter box and choose Apply.

04 Click on the name (link) of the Azure Databricks workspace that you want to configure.

05 In the left-hand menu, under Monitoring, select Diagnostic settings.

06 Click + Add diagnostic setting to create a new diagnostic setting.

07 In the Diagnostic setting name field, provide a name for the diagnostic setting (e.g., "DatabricksLogging").

08 Under Category details, select the log categories you wish to capture:

  • accounts: User account activities
  • dbfs: Databricks File System (DBFS) operations
  • clusters: Cluster state changes and errors
  • notebook: Notebook execution events
  • jobs: Job execution tracking

09 Under Destination details, choose one or more destinations for the logs:

  • Send to Log Analytics workspace: Select this option and choose the appropriate Log Analytics workspace for advanced querying and monitoring
  • Archive to a storage account: Select this option and choose the appropriate storage account for long-term retention
  • Stream to an event hub: Select this option and choose the appropriate event hub for integration with third-party systems or SIEM tools

10 Click Save to apply the diagnostic logging configuration.

11 Repeat steps no. 4 - 10 for each Azure Databricks workspace that you want to configure, available in the selected subscription.

12 Repeat steps no. 3 - 11 for each Azure subscription created in your Microsoft Azure cloud account.

Using Azure CLI

01 Run account list command (Windows/macOS/Linux) with custom output filters to list the IDs of the cloud subscriptions available in your Azure cloud account:

az account list \
	--query '[].{id:id, name:name}'

02 The command output should return the requested subscription identifiers (IDs) and names:

[
	{
		"id": "abcdabcd-1234-abcd-1234-abcdabcdabcd",
		"name": "Production Subscription"
	},
	{
		"id": "abcd1234-abcd-1234-abcd-abcd1234abcd",
		"name": "Development Subscription"
	}
]

03 Run account set command (Windows/macOS/Linux) with the ID of the Azure cloud subscription that you want to access as the identifier parameter to set the selected subscription to be the current active subscription (the command does not produce an output):

az account set \
	--subscription abcdabcd-1234-abcd-1234-abcdabcdabcd

04 Run databricks workspace list command (Windows/macOS/Linux) with custom output filters to list the resource ID of each Azure Databricks workspace available in the selected Azure subscription:

az databricks workspace list \
	--query '[*].id'

05 The command output should return the requested Databricks workspace resource IDs:

[
	"/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace",
	"/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-prod-databricks-workspace"
]

06 Run monitor diagnostic-settings create command (Windows/macOS/Linux) to enable diagnostic logging for your Azure Databricks workspace. Use the resource ID from step 5 and specify at least one destination (Log Analytics workspace, storage account, or event hub):

Option 1: Send logs to Log Analytics workspace:
az monitor diagnostic-settings create \
	--name "DatabricksLogging" \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace" \
	--logs '[{"category": "accounts", "enabled": true}, {"category": "clusters", "enabled": true}, {"category": "notebook", "enabled": true}, {"category": "jobs", "enabled": true}, {"category": "dbfs", "enabled": true}]' \
	--workspace "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.OperationalInsights/workspaces/cc-log-analytics-workspace"

Option 2: Archive logs to storage account:
az monitor diagnostic-settings create \
	--name "DatabricksLogging" \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace" \
	--logs '[{"category": "accounts", "enabled": true}, {"category": "clusters", "enabled": true}, {"category": "notebook", "enabled": true}, {"category": "jobs", "enabled": true}, {"category": "dbfs", "enabled": true}]' \
	--storage-account "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Storage/storageAccounts/cc-databricks-logs-storage"

Option 3: Stream logs to event hub:
az monitor diagnostic-settings create \
	--name "DatabricksLogging" \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace" \
	--logs '[{"category": "accounts", "enabled": true}, {"category": "clusters", "enabled": true}, {"category": "notebook", "enabled": true}, {"category": "jobs", "enabled": true}, {"category": "dbfs", "enabled": true}]' \
	--event-hub-rule "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.EventHub/namespaces/cc-event-hub-namespace/authorizationrules/RootManageSharedAccessKey"

Note: Replace the resource IDs with your actual Databricks workspace and destination resource IDs. You can combine multiple destinations by including multiple parameters (e.g., both --workspace and --storage-account).
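As an example of combining destinations, a single setting that both sends logs to Log Analytics and archives them to a storage account could look like the following sketch (the resource IDs are the same placeholders used in the options above):

```shell
# Combined destinations: send logs to Log Analytics AND archive to storage
az monitor diagnostic-settings create \
	--name "DatabricksLogging" \
	--resource "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Databricks/workspaces/cc-project9-data-workspace" \
	--logs '[{"category": "accounts", "enabled": true}, {"category": "clusters", "enabled": true}, {"category": "notebook", "enabled": true}, {"category": "jobs", "enabled": true}, {"category": "dbfs", "enabled": true}]' \
	--workspace "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.OperationalInsights/workspaces/cc-log-analytics-workspace" \
	--storage-account "/subscriptions/abcdabcd-1234-abcd-1234-abcdabcdabcd/resourceGroups/cloud-shell-storage-westeurope/providers/Microsoft.Storage/storageAccounts/cc-databricks-logs-storage"
```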

07 Repeat steps no. 4 - 6 for each Azure Databricks workspace that you want to configure, available in the selected subscription.

08 Repeat steps no. 3 - 7 for each Azure subscription created in your Microsoft Azure cloud account.

Publication date Jan 28, 2026