I am trying to extract data from the following JSON and pivot the data in "rows" to columns.
{
"tables": [
{
"name": "PrimaryResult",
"columns": [
{
"name": "timestamp",
"type": "datetime"
},
{
"name": "message",
"type": "string"
},
{
"name": "severityLevel",
"type": "int"
},
{
"name": "FriendlySeverityLevel",
"type": "string"
},
{
"name": "CycleId",
"type": "dynamic"
}
],
"rows": [
[
"2024-07-25T12:00:00",
"Report Started - User: [email protected] MrmClient: 400600 CcID: 000a000b-0000-111c-0a0b-11111a1111a1",
1,
"Information",
"000a000b-0000-111c-0a0b-11111a1111a1"
],
[
"2024-07-24T12:00:00",
"Report Started - User: [email protected] MrmClient: 807231 CcID: 000c000d-0000-11c1-0c0d-11111b11111b",
1,
"Information",
"000c000d-0000-11c1-0c0d-11111b11111b"
]
]
}
]
}
This is what I'm trying to get for the final result: one result row per entry in "rows", with a column for each item defined in "columns". Using JSON_VALUE(RawJSON, '$.tables[0].name') gives me PrimaryResult, but I can't figure out how to pull the data from "rows". I've tried JSON_VALUE(RawJSON, '$.tables[0].rows[0]'), but it returns NULL. Any help would be greatly appreciated.
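First, the reason for the NULL result: JSON_VALUE() extracts scalar values only, and $.tables[0].rows[0] is an array, so in the default lax mode it returns NULL. JSON_QUERY() extracts objects and arrays, and OPENJSON() shreds an array into rows. A quick sketch, assuming the JSON is held in a variable @json:

SELECT JSON_VALUE(@json, '$.tables[0].rows[0]')  -- NULL, target is not a scalar
SELECT JSON_QUERY(@json, '$.tables[0].rows[0]')  -- the first row as a JSON array fragment
SELECT [key], [value] FROM OPENJSON(@json, '$.tables[0].rows')  -- one result row per JSON row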
If the provided JSON contains information about more than one table, a possible approach is a dynamic statement. The idea is to generate a statement that uses OPENJSON() with the appropriate column definitions for each table, built from the $.tables[*].columns parts of the input JSON.
Test JSON with information for two tables:
DECLARE @json nvarchar(max) = '{
"tables": [
{
"name": "PrimaryResult",
"columns": [
{"name": "timestamp", "type": "datetime"},
{"name": "message", "type": "string"},
{"name": "severityLevel", "type": "int"},
{"name": "FriendlySeverityLevel", "type": "string"},
{"name": "CycleId", "type": "dynamic"}
],
"rows": [
["2024-07-25T12:00:00", "Report Started - User: [email protected] MrmClient: 400600 CcID: 000a000b-0000-111c-0a0b-11111a1111a1", 1, "Information", "000a000b-0000-111c-0a0b-11111a1111a1"],
["2024-07-24T12:00:00", "Report Started - User: [email protected] MrmClient: 807231 CcID: 000c000d-0000-11c1-0c0d-11111b11111b", 1, "Information", "000c000d-0000-11c1-0c0d-11111b11111b"]
]
},
{
"name": "SecondaryResult",
"columns": [
{"name": "timestamp", "type": "datetime"}
],
"rows": [
["2024-07-25T12:00:00"]
]
}
]
}';
T-SQL:
-- Dynamic statement
DECLARE @stmt nvarchar(max)
; WITH TablesAndColumnsCTE AS (
SELECT t.[key] AS TableIndex, c.*
FROM OPENJSON(@json, '$.tables') t
CROSS APPLY (
SELECT
[key] AS ColumnIndex,
JSON_VALUE([value], '$.name') AS ColumnName,
JSON_VALUE([value], '$.type') AS ColumnType
FROM OPENJSON(t.[value], '$.columns')
) c
), StatementsCTE AS (
SELECT CONCAT(
N'SELECT * FROM OPENJSON(@json, ''$.tables[', TableIndex, N'].rows'') WITH (',
STRING_AGG(
CONCAT(N'[', ColumnName, N'] varchar(max) ''$[', ColumnIndex, N']'''),
N','
) WITHIN GROUP (ORDER BY ColumnIndex),
N')'
) AS Statement
FROM TablesAndColumnsCTE
GROUP BY TableIndex
)
SELECT @stmt = STRING_AGG(Statement, N'; ')
FROM StatementsCTE
-- Statement execution
DECLARE @err int
EXEC @err = sp_executesql @stmt, N'@json nvarchar(max)', @json
IF @err <> 0 PRINT 'Error'
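For troubleshooting, it may help to print the generated statement before executing it (an optional addition):

PRINT @stmt  -- inspect the generated SELECTs before running them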
The generated statement, based on the test data, is posted below. All columns are defined as varchar(max), but you can easily include an appropriate CASE expression to map the columns' data types (datetime, string, int and dynamic in the example) to suitable SQL Server data types.
SELECT *
FROM OPENJSON(@json, '$.tables[0].rows') WITH (
[timestamp] varchar(max) '$[0]',
[message] varchar(max) '$[1]',
[severityLevel] varchar(max) '$[2]',
[FriendlySeverityLevel] varchar(max) '$[3]',
[CycleId] varchar(max) '$[4]'
);
SELECT *
FROM OPENJSON(@json, '$.tables[1].rows') WITH (
[timestamp] varchar(max) '$[0]'
)
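As a sketch of the type mapping mentioned above, the hard-coded varchar(max) in the CONCAT() call of StatementsCTE could be replaced with a CASE over ColumnType (the mapping below is an assumption; string, dynamic and any unknown type simply fall back to nvarchar(max)):

CONCAT(
    N'[', ColumnName, N'] ',
    CASE ColumnType
        WHEN N'datetime' THEN N'datetime2'
        WHEN N'int'      THEN N'int'
        ELSE N'nvarchar(max)'  -- string, dynamic and anything else stay as text
    END,
    N' ''$[', ColumnIndex, N']'''
)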