Log Analytics Workspace: Logs Ingestion API with a Python script hosted on a Linux server

I have a Python script that I want to run daily that produces a Pandas dataframe.

I want this dataframe to be added daily to my log analytics workspace.

I have a Linux server that I can use to run my Python script.
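For the "daily" part on the Linux server, a cron entry is the usual approach. The interpreter and script paths below are placeholders, not taken from the question:

```shell
# Edit the crontab with `crontab -e` and add a line like this to run
# the script every day at 06:00, appending output to a log file.
# /usr/bin/python3 and the paths are placeholders -- adjust to your setup.
0 6 * * * /usr/bin/python3 /home/user/scripts/send_to_law.py >> /home/user/logs/send_to_law.log 2>&1
```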

What do I need to do to make this work using the Logs Ingestion API?

Is there a minimal reproducible example available?


  • You can use the code below to send a data frame to a Log Analytics Workspace; I followed this SO thread:

    First, I created a function app locally in VS Code:

    import logging
    import requests
    import datetime
    import hashlib
    import hmac
    import base64
    import azure.functions as func
    import pandas as pd

    def rith_build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
        # Build the HMAC-SHA256 "SharedKey" authorization header required
        # by the HTTP Data Collector API.
        x_headers = 'x-ms-date:' + date
        string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
        bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
        decoded_key = base64.b64decode(shared_key)
        encoded_hash = base64.b64encode(hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode()
        authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
        return authorization

    def post_data1(customer_id, shared_key, body, log_type):
        # POST the JSON body to the workspace's data collector endpoint.
        method = 'POST'
        content_type = 'application/json'
        resource = '/api/logs'
        rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
        content_length = len(body)
        signature = rith_build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
        uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'
        headers = {
            'content-type': content_type,
            'Authorization': signature,
            'Log-Type': log_type,
            'x-ms-date': rfc1123date
        }
        response = requests.post(uri, data=body, headers=headers)
        if (response.status_code >= 200 and response.status_code <= 299):
            print('Rithwik Bojja Data Frame is sent, Go and check')
        else:
            print("Response code: {}".format(response.status_code))

    rithwik_data1 = pd.DataFrame({
        'Name': ['Bojja', 'Rithwik', 'Chotu'],
        'Age': [8, 8008, 24]
    })

    def main(req: func.HttpRequest) -> func.HttpResponse:
        logging.info('Python HTTP trigger function processed a request.')
        post_data1('Workspaceid', 'Primary key', rithwik_data1.to_json(orient="records"), 'logtype2')
        return func.HttpResponse(f"Hello Rithwik Bojja Sent Data Successfully.")
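If you want to sanity-check the signature logic offline before wiring up the function, the same construction can be exercised locally against a dummy key. All values below are made up:

```python
import base64
import hashlib
import hmac

def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    # Same HMAC-SHA256 "SharedKey" construction used by the function above.
    string_to_hash = "\n".join([method, str(content_length), content_type, "x-ms-date:" + date, resource])
    hashed = hmac.new(base64.b64decode(shared_key), string_to_hash.encode("utf-8"), digestmod=hashlib.sha256).digest()
    return "SharedKey {}:{}".format(customer_id, base64.b64encode(hashed).decode())

# Dummy values -- not a real workspace id or primary key.
workspace_id = "00000000-0000-0000-0000-000000000000"
dummy_key = base64.b64encode(b"not-a-real-primary-key").decode()

sig = build_signature(workspace_id, dummy_key,
                      "Mon, 01 Jan 2024 00:00:00 GMT",
                      42, "POST", "application/json", "/api/logs")
print(sig)  # "SharedKey <workspace_id>:<base64 digest>"
```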

    Here the data frame is hard-coded in the function; alternatively, you can send the data frame in the request body when calling the trigger (POST).


      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "con",
        "FUNCTIONS_WORKER_RUNTIME": "python"


    host.json:

    {
      "version": "2.0",
      "logging": {
        "applicationInsights": {
          "samplingSettings": {
            "isEnabled": true,
            "excludedTypes": "Request"
          }
        }
      },
      "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[3.*, 4.0.0)"
      }
    }

    function.json:

    {
      "scriptFile": "__init__.py",
      "bindings": [
        {
          "authLevel": "function",
          "type": "httpTrigger",
          "direction": "in",
          "name": "req",
          "methods": [
            "get",
            "post"
          ]
        },
        {
          "type": "http",
          "direction": "out",
          "name": "$return"
        }
      ]
    }


    Then I deployed the function app to Azure.
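For reference, deploying from the project folder with Azure Functions Core Tools typically looks like this (the app name is a placeholder):

```shell
# Publish the local project to an existing function app in Azure.
# <APP_NAME> is a placeholder for your function app's name.
func azure functionapp publish <APP_NAME>
```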


    Then I executed it:

    Log Analytics Workspace:

    (screenshots showing the function's response and the ingested records in the workspace's custom log table)