I have developed a Python script to upload logs in JSON format into a custom table using the Azure Log Ingestion API. The script worked at least twice, and data rows appeared exactly as I want them. Now, however, although it appears to work and gets a Response [204] every time it runs, no new data appears in the custom table.
What troubleshooting steps can I carry out on the Azure side to see where the error might be?
I have followed this Microsoft article, and I have used two different Python methods to post the data:

- the requests.post command with the relevant parameters
- the client.upload command shown in the Python part of the linked article

Debugging in Python shows the data apparently formatted correctly, and the response seems good (depending on the method used). It is hard to provide details here as much of the data needs to be redacted, but I will try.
The first Python request looks like this:
import json
import requests as req

logCombined = <JSON data array>
payload = json.dumps(logCombined)

applicationId = <application (client) ID>
scope = "https://monitor.azure.com//.default"
appSecret = <application (client) secret>
tenantId = <tenant ID>
dceUri = <data collection endpoint (DCE) URI>
dcrImmutableId = <DCR immutable ID>
table = "MyTable_CL"

# Request a bearer token via the client-credentials flow
body = f"client_id={applicationId}&scope={scope}&client_secret={appSecret}&grant_type=client_credentials"
headers = {"Content-Type": "application/x-www-form-urlencoded"}
uri = f"https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token"
response = req.post(uri, data=body, headers=headers)
bearerToken = response.json().get("access_token")

# Post the payload to the logs ingestion endpoint
headers2 = {"Authorization": f"Bearer {bearerToken}", "Content-Type": "application/json"}
uri = f"{dceUri}/dataCollectionRules/{dcrImmutableId}/streams/Custom-{table}?api-version=2021-11-01-preview"
uploadResponse = req.post(uri, data=payload, headers=headers2)
print("Response: ", uploadResponse)
At this point the response is always successful: <Response [204]>. There is no indication in Azure that anything has happened at all: no new table rows and nothing in the operation log.
How can I look under the hood in Azure to see why this is being ignored?
Your issue may stem from a mismatch between the JSON data being posted and the schema defined in your Data Collection Rule (DCR) in Azure. Azure's Log Ingestion API requires that the structure of the JSON data being sent match the stream declaration in the DCR. If the two don't match, your data will not appear in the custom table even though the POST request succeeds.
Here's how you can troubleshoot:
Check your Data Collection Rule (DCR): Make sure that the DCR in Azure exactly matches the structure of the JSON data you are sending. Verify the keys and their associated value types.
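For example, you can pull the DCR definition through the Azure Resource Manager REST API and compare its stream declaration with your payload. This is a minimal sketch, assuming the same service principal also has read access to the DCR resource; the subscription, resource group and rule-name placeholders are hypothetical, and the portal's JSON View for the DCR shows the same information:

import requests as req

tenantId = "<tenant ID>"
applicationId = "<application (client) ID>"
appSecret = "<application (client) secret>"
subscriptionId = "<subscription ID>"       # hypothetical placeholder
resourceGroup = "<resource group name>"    # hypothetical placeholder
dcrName = "<data collection rule name>"    # hypothetical placeholder

# Token for ARM - note the management.azure.com scope, not monitor.azure.com
body = (f"client_id={applicationId}&scope=https://management.azure.com/.default"
        f"&client_secret={appSecret}&grant_type=client_credentials")
tokenUri = f"https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token"
armToken = req.post(tokenUri, data=body,
                    headers={"Content-Type": "application/x-www-form-urlencoded"}
                    ).json().get("access_token")

# Fetch the DCR and print the columns declared for the custom stream
dcrUri = (f"https://management.azure.com/subscriptions/{subscriptionId}"
          f"/resourceGroups/{resourceGroup}/providers/Microsoft.Insights"
          f"/dataCollectionRules/{dcrName}?api-version=2022-06-01")
dcr = req.get(dcrUri, headers={"Authorization": f"Bearer {armToken}"}).json()
print(dcr["properties"]["streamDeclarations"].get("Custom-MyTable_CL"))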
Check the JSON data being posted: Save the JSON being sent to a local file and examine its contents to ensure it matches the DCR.
Here is an example of how you can modify your existing Python code to write the JSON data to a local file:
import json
# Other parts of your code...
payload = json.dumps(logCombined)
# Save the JSON payload to a file
with open('payload.json', 'w') as outfile:
json.dump(logCombined, outfile)
# Continue with the rest of your code...
In the code snippet above, json.dump(logCombined, outfile) writes the logCombined JSON data to a file named payload.json. You can open this file and verify its contents.
Try ingesting sample data: You could try to ingest a small sample of data that you know for sure matches the structure defined in your DCR. If the sample data is successfully ingested, then the issue likely lies in the JSON data structure of your main dataset.
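For instance, reusing the uri and headers2 values from the upload code in the question, you could post a single hand-built row. The column names below are only placeholders; substitute the columns your DCR stream actually declares:

import json

# Hypothetical single row - replace the keys with the columns actually
# declared in your DCR stream (Custom-MyTable_CL)
sampleRow = [{"TimeGenerated": "2024-01-01T00:00:00Z", "blah": "1", "blah2": "2"}]
samplePayload = json.dumps(sampleRow)

# req, uri and headers2 are the same objects used for the full upload
sampleResponse = req.post(uri, data=samplePayload, headers=headers2)
print("Sample response: ", sampleResponse)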
As an example, if in your DCR you have defined
[
{
"blah": "1",
"blah2" : "2"
}
]
but you are instead sending something like
[
    {
        "Errors": {
            "blah": "1",
            "blah2": "2"
        }
    },
    {
        "blah": "1",
        "blah2": "2"
    }
]
then there is a mismatch, so I would recommend saving the output you are sending to a file and cross-checking that the content is what the DCR expects.
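If that turns out to be the problem, the fix on the Python side is usually a small reshape before serialising. This is a minimal sketch based on the hypothetical "Errors" wrapper from the example above:

import json

# Rows where the fields are accidentally nested under an "Errors" key
rawData = [{"Errors": {"blah": "1", "blah2": "2"}}, {"blah": "1", "blah2": "2"}]

# Flatten any wrapped rows so every element is a bare object with the
# columns the DCR stream declares
logCombined = [row.get("Errors", row) for row in rawData]
payload = json.dumps(logCombined)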