I have a table in an Azure Data Explorer database that's defined as follows:
.create table GetPrice (lpname: string, TimeUTC: datetime, base_volume: real, highest_bid: real, currency_pair: string, price: real, lowest_ask: real, change_percentage: real, quote_volume: real, high_24h: real, low_24h: real, EventProcessedUtcTime: datetime, current_time: datetime, ['guid']: string, epochtime: int, time_ms: real, channel: string, event: string)
I am fetching an object from an external API and parsing it to JSON, with the following result:
{
"Time": 0,
"Time_ms": 0,
"Channel": "external",
"event": "update",
"Result": {
"currency_pair": "BABA",
"last": "0",
"lowest_ask": null,
"highest_bid": "71.93",
"change_percentage": "0",
"base_volume": "97",
"quote_volume": null,
"high_24h": "0",
"low_24h": "0",
"GUID": "<GUID>"
},
"TimeUTC": "2024-04-10T10:13:13.1732158Z",
"LPName": "NAME",
"Error": {
"Code": 0,
"Message": null
}
}
Problem: in the GetPrice table, only TimeUTC, LPName, EventProcessedUtcTime, and current_time are stored.
I don't fully understand why: if there were an issue with the nested Result object, then Channel and event should still be populated, but they are not.
I have tried flattening the Result object before sending it to the Event Hub, but that did not work either. I also tried creating a new table and passing the data there, but that failed too.
I am using the Azure.Messaging.EventHubs library to connect and communicate with ADX.
Any leads or suggestions are welcome.
The reason that only TimeUTC, LPName, EventProcessedUtcTime, and current_time are stored in the GetPrice table is most likely a mismatch between the incoming JSON and the table's ingestion mapping: the remaining fields sit inside the nested Result object, and top-level properties such as Channel and Time differ in name or casing from the channel and epochtime columns, so they are not matched. One way to address this is to define an ingestion JSON mapping whose paths point at the nested properties explicitly.
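As a concrete sketch for the GetPrice table, assuming the payload shape shown in the question (column names are taken from your .create table command; the pairing of Result.last with the price column is a guess on my part):

```
// Sketch: mapping for GetPrice. The table must already exist.
// Paths use $.Result.* for the nested fields; verify each pairing against your data.
.create table GetPrice ingestion json mapping 'GetPrice_mapping' '[{"column":"lpname","Properties":{"Path":"$.LPName"}},{"column":"TimeUTC","Properties":{"Path":"$.TimeUTC"}},{"column":"channel","Properties":{"Path":"$.Channel"}},{"column":"event","Properties":{"Path":"$.event"}},{"column":"epochtime","Properties":{"Path":"$.Time"}},{"column":"time_ms","Properties":{"Path":"$.Time_ms"}},{"column":"currency_pair","Properties":{"Path":"$.Result.currency_pair"}},{"column":"price","Properties":{"Path":"$.Result.last"}},{"column":"lowest_ask","Properties":{"Path":"$.Result.lowest_ask"}},{"column":"highest_bid","Properties":{"Path":"$.Result.highest_bid"}},{"column":"change_percentage","Properties":{"Path":"$.Result.change_percentage"}},{"column":"base_volume","Properties":{"Path":"$.Result.base_volume"}},{"column":"quote_volume","Properties":{"Path":"$.Result.quote_volume"}},{"column":"high_24h","Properties":{"Path":"$.Result.high_24h"}},{"column":"low_24h","Properties":{"Path":"$.Result.low_24h"}},{"column":"guid","Properties":{"Path":"$.Result.GUID"}}]'
```

Reference this mapping name in the Event Hub data connection (or in the ingestion properties) so it is applied to incoming events.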
// Create table command (the table must exist before its mapping can be created)
.create table ['ravi'] (['topic']:string, ['subject']:string, ['eventType']:string, ['id']:guid, ['data']:dynamic, ['dataVersion']:string, ['metadataVersion']:long, ['eventTime']:datetime)
// Create mapping command
.create table ['ravi'] ingestion json mapping 'ravi_mapping_1' '[{"column":"topic", "Properties":{"Path":"$[\'topic\']"}},{"column":"subject", "Properties":{"Path":"$[\'subject\']"}},{"column":"eventType", "Properties":{"Path":"$[\'eventType\']"}},{"column":"id", "Properties":{"Path":"$[\'id\']"}},{"column":"data", "Properties":{"Path":"$[\'data\']"}},{"column":"dataVersion", "Properties":{"Path":"$[\'dataVersion\']"}},{"column":"metadataVersion", "Properties":{"Path":"$[\'metadataVersion\']"}},{"column":"eventTime", "Properties":{"Path":"$[\'eventTime\']"}}]'
Ingest data with Kusto Azure Data Explorer using .net
Send or receive events using .NET to Azure Event Hubs
Note: Azure.Messaging.EventHubs only connects to Event Hubs; Microsoft.Azure.Kusto.Ingest is the library for connecting to Azure Data Explorer. One way is to ingest data from Event Hubs into Azure Synapse Data Explorer via a data connection; another is to pull the data from Event Hubs using Azure Synapse Data Explorer, or to use Analyze data on the Event Hubs instance to analyze the event data with Azure Data Explorer.
I used Azure.Messaging.EventHubs with C# following the references above. On the Azure Data Explorer side, I created a table and mapping command for this sample data:
[
{
"currency_pair": "BABA",
"last": "0",
"lowest_ask": null,
"highest_bid": "89.93",
"change_percentage": "10",
"base_volume": "197",
"quote_volume": null,
"high_24h": "0",
"low_24h": "0",
"GUID": "<GUID>",
"nestedKey": {
"nestedKey1": "nestedValue1"
},
"arrayKey": [
"arrayValue1",
"arrayValue2"
]
}
]
// Create table command
.create table ['teja'] (['currency_pair']:string, ['last']:long, ['lowest_ask']:string, ['highest_bid']:real, ['change_percentage']:long, ['base_volume']:long, ['quote_volume']:string, ['high_24h']:long, ['low_24h']:long, ['GUID']:string, ['nestedKey']:dynamic, ['arrayKey']:dynamic)
// Create mapping command
.create table ['teja'] ingestion json mapping 'teja_mapping' '[{"column":"currency_pair", "Properties":{"Path":"$[\'currency_pair\']"}},{"column":"last", "Properties":{"Path":"$[\'last\']"}},{"column":"lowest_ask", "Properties":{"Path":"$[\'lowest_ask\']"}},{"column":"highest_bid", "Properties":{"Path":"$[\'highest_bid\']"}},{"column":"change_percentage", "Properties":{"Path":"$[\'change_percentage\']"}},{"column":"base_volume", "Properties":{"Path":"$[\'base_volume\']"}},{"column":"quote_volume", "Properties":{"Path":"$[\'quote_volume\']"}},{"column":"high_24h", "Properties":{"Path":"$[\'high_24h\']"}},{"column":"low_24h", "Properties":{"Path":"$[\'low_24h\']"}},{"column":"GUID", "Properties":{"Path":"$[\'GUID\']"}},{"column":"nestedKey", "Properties":{"Path":"$[\'nestedKey\']"}},{"column":"arrayKey", "Properties":{"Path":"$[\'arrayKey\']"}}]'
Make sure to add Channel, event, and the other top-level fields to both the table and the table mapping commands.
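If columns still come through empty after the mapping is in place, inspecting the mappings and the ingestion errors on the cluster usually points at the cause. Both commands below are standard ADX management commands; the `where` filter assumes your table is named GetPrice:

```
// List the JSON ingestion mappings defined on the table
.show table GetPrice ingestion json mappings
// Inspect recent ingestion failures for error details
.show ingestion failures
| where Table == "GetPrice"
| top 10 by FailedOn desc
```

The Details column of the failures output typically names the column or path that could not be mapped.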