azure · logging · azure-data-factory · databricks

Logging in Azure Data Factory and Azure Databricks


I'm looking for feedback on how to professionally manage logs in Azure Data Factory and Azure Databricks, similar to how it's typically done in enterprises. On my end, I have an Azure Data Factory pipeline that executes Databricks notebooks. Thanks in advance.


Solution

  • Here are some of the ways to manage the logs of ADF pipelines:

    Export logs to a Log Analytics workspace or storage account:

    Go to ADF Monitor -> Diagnostic settings -> Add diagnostic setting.

    Create the diagnostic setting like below.

    [Screenshot: diagnostic setting configuration]

    You can send the logs to a Log Analytics workspace or to a storage account, as per your requirement. Provide the subscription and resource names as shown above and save it.
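
    If you prefer to script this step instead of using the portal, the sketch below shows one way to create the same diagnostic setting with the azure-mgmt-monitor package. Treat it as a rough outline, not a definitive implementation: the resource IDs, the setting name "adf-logs", and the selected log categories are assumptions to adapt to your environment.

        # A rough sketch: create the diagnostic setting from code using the
        # azure-mgmt-monitor and azure-identity packages. All resource IDs and
        # the setting name "adf-logs" below are placeholders.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.monitor import MonitorManagementClient
        from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

        subscription_id = "<sub_id>"
        adf_resource_id = (
            "/subscriptions/<sub_id>/resourceGroups/<RG_name>"
            "/providers/Microsoft.DataFactory/factories/<factory_name>"
        )
        workspace_resource_id = (
            "/subscriptions/<sub_id>/resourceGroups/<RG_name>"
            "/providers/Microsoft.OperationalInsights/workspaces/<workspace_name>"
        )
        storage_account_resource_id = (
            "/subscriptions/<sub_id>/resourceGroups/<RG_name>"
            "/providers/Microsoft.Storage/storageAccounts/<storage_account_name>"
        )

        client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

        # Send the pipeline, activity, and trigger run logs to both destinations.
        # log_analytics_destination_type="Dedicated" requests resource-specific
        # tables (e.g. ADFPipelineRun) instead of the shared AzureDiagnostics table.
        setting = DiagnosticSettingsResource(
            workspace_id=workspace_resource_id,
            storage_account_id=storage_account_resource_id,
            log_analytics_destination_type="Dedicated",
            logs=[
                LogSettings(category="PipelineRuns", enabled=True),
                LogSettings(category="ActivityRuns", enabled=True),
                LogSettings(category="TriggerRuns", enabled=True),
            ],
        )

        client.diagnostic_settings.create_or_update(
            resource_uri=adf_resource_id,
            name="adf-logs",
            parameters=setting,
        )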

    The logs will be saved in the storage account like below.

    [Screenshot: exported log JSON files in the storage account]

    The pipeline run logs are stored in this JSON file, and it is updated automatically as pipelines run in ADF. If any error arises, you can check the logs in this JSON.

    The same logs will be stored in the Log Analytics workspace as well. Go to the workspace and you can query the logs using KQL.
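
    For example, assuming the diagnostic setting sends logs to the resource-specific ADFPipelineRun table, the sketch below uses the azure-monitor-query package to list the failed runs of the last day. The workspace GUID is a placeholder; if you use the legacy AzureDiagnostics mode instead, query the AzureDiagnostics table filtered on Category == 'PipelineRuns'.

        # A minimal sketch using the azure-monitor-query and azure-identity packages.
        # <workspace_id> is the Log Analytics workspace GUID (a placeholder here).
        from datetime import timedelta

        from azure.identity import DefaultAzureCredential
        from azure.monitor.query import LogsQueryClient, LogsQueryStatus

        # List the failed pipeline runs of the last day from the ADFPipelineRun table.
        query = """
        ADFPipelineRun
        | where Status == 'Failed'
        | project TimeGenerated, PipelineName, RunId, Status, FailureType
        | order by TimeGenerated desc
        """

        client = LogsQueryClient(DefaultAzureCredential())
        response = client.query_workspace(
            workspace_id="<workspace_id>",
            query=query,
            timespan=timedelta(days=1),
        )

        if response.status == LogsQueryStatus.SUCCESS:
            for table in response.tables:
                for row in table.rows:
                    print(dict(zip(table.columns, row)))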

    If you want to receive alert emails based on the pipeline runs, you can create alert rules for that.

    [Screenshot: creating an alert rule for pipeline runs]

    Go through this SO answer to learn more about it.

    You can use the above approaches to manage the logs as per your requirement. Check this blog by @Marjory Shrader to learn about the remaining approaches.

    UPDATE:

    The JSON file holds the pipeline run information as an array of objects. This is one pipeline run from that array:

    {
        "EventName": "DiagnosticsLogs",
        "category": "PipelineRuns",
        "correlationId": "39ebbb88-9ccd-450d-a1f7-d80c16b774a0",
        "end": "2024-01-29T06:50:30.0000000Z",
        "env_dt_spanId": "6a64f8994ea499b4",
        "env_dt_traceId": "f3b88fa27eb81f538dd3d269e834dd4f",
        "env_name": "DiagnosticsLogsSource",
        "env_time": "2024-01-29T06:50:30.8557880Z",
        "env_ver": "4.0",
        "failureType": "UserError",
        "groupId": "39ebbb88-9ccd-450d-a1f7-d80c16b774a0",
        "level": "Error",
        "location": "eastus",
        "name": "DiagnosticsLogs",
        "operationName": "pipeline1 - Failed",
        "pipelineName": "pipeline1",
        "properties": {
            "Parameters": {},
            "SystemParameters": {
                "ExecutionStart": "2024-01-29T06:50:29.8345435Z",
                "TriggerId": "8863785bddc94a4ca3acdaebcacee774",
                "SubscriptionId": "<sub_id>",
                "PipelineRunRequestTime": "2024-01-29T06:50:28.8629073+00:00"
            },
            "Predecessors": [
                {
                    "Name": "Manual",
                    "Id": "8863785bddc94a4ca3acdaebcacee774",
                    "InvokedByType": "Manual"
                }
            ],
            "UserProperties": {},
            "Annotations": [],
            "Message": "Operation on target Set variable1 failed: The variable 'var1' of type 'String' cannot be initialized or updated with value of type 'Integer'. The variable 'var1' only supports values of types 'String'."
        },
        "resourceId": "/SUBSCRIPTIONS/<sub_id>/RESOURCEGROUPS/<RG_name>/PROVIDERS/MICROSOFT.DATAFACTORY/FACTORIES/RAKESHDFACTORY",
        "runId": "39ebbb88-9ccd-450d-a1f7-d80c16b774a0",
        "severityNumber": 9,
        "severityText": "Information",
        "start": "2024-01-29T06:50:28.0000000Z",
        "status": "Failed",
        "tags": "{\"Reason\":\"Repro\",\"CreatedDate\":\"1/29/2024 6:05:37 AM\",\"CreatedBy\":\"NA\",\"OwningTeam\":\"NA\"}",
        "timestamp": "2024-01-29T06:50:30.0000000Z",
        "time": "2024-01-29T06:50:30.8557880Z"
    }
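
    If you want to process the exported file itself, for example to surface only the failed runs and their error messages, a small sketch like the one below may help. The file name PT1H.json and the tolerant parsing (array or one record per line) are assumptions, since the exact layout of the exported blob can vary.

        # A rough sketch: pull the failed pipeline runs out of a log file downloaded
        # from the storage account. "PT1H.json" is a placeholder file name; the parser
        # accepts either a JSON array or one JSON object per line.
        import json

        def load_records(path):
            with open(path, "r", encoding="utf-8") as f:
                text = f.read().strip()
            try:
                data = json.loads(text)
                return data if isinstance(data, list) else [data]
            except json.JSONDecodeError:
                # Fall back to newline-delimited JSON (one record per line).
                return [json.loads(line) for line in text.splitlines() if line.strip()]

        for record in load_records("PT1H.json"):
            if record.get("category") == "PipelineRuns" and record.get("status") == "Failed":
                print(
                    record.get("pipelineName"),
                    record.get("runId"),
                    record.get("properties", {}).get("Message"),
                )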