I am trying to load a JSON file (output from Mongo's Export-MdbcData) into a SQL Server table using PowerShell. Example data from the JSON file is as follows:
{ "code" : "0088", "name" : "BUTTON", "detail" : { "quantity" : 1 } }
{ "code" : "0081", "name" : "MATTERHORN", "detail" : { "quantity" : 2 } }
{ "code" : "0159", "name" : "BANKSTON", "detail" : { "quantity" : 1 } }
In the PowerShell script below, the file is read into an array and the array is converted into a DataTable to load into a SQL Server table. Is there a better/faster way to read in the JSON file? With a small input file it only takes seconds to load the data, but with more than 4M records it is taking hours for the whole process.
$encoding = [System.Text.Encoding]::UTF8
$output = [System.Collections.ArrayList]::new()

# Stream the file line by line; each line is a separate JSON document.
foreach ($line in [System.IO.File]::ReadLines($pathToJsonFile, $encoding))
{
    $json = $line | ConvertFrom-Json
    foreach ($detail in $json.detail)
    {
        [void]$output.Add(
            [pscustomobject]@{
                code     = $json.code
                name     = $json.name
                quantity = $detail.quantity
            }
        )
    }
}

# Convert the collected objects to a DataTable for the SQL Server load.
$dataTable = $output | ConvertTo-DataTable
.
.
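For reference, ConvertTo-DataTable is not a built-in cmdlet; it comes from a helper function or module. A rough sketch of what such a helper is assumed to do (define one column per property of the first object, then add one row per object) looks like this:

# Hypothetical sketch of a ConvertTo-DataTable helper; the real one may differ.
function ConvertTo-DataTable {
    param(
        [Parameter(ValueFromPipeline = $true)]
        [psobject[]]$InputObject
    )
    begin {
        $table = [System.Data.DataTable]::new()
        $columnsInitialized = $false
    }
    process {
        foreach ($obj in $InputObject) {
            # Define one column per property of the first object seen.
            if (-not $columnsInitialized) {
                foreach ($prop in $obj.PSObject.Properties) {
                    [void]$table.Columns.Add($prop.Name)
                }
                $columnsInitialized = $true
            }
            # Copy the property values into a new row.
            $row = $table.NewRow()
            foreach ($prop in $obj.PSObject.Properties) {
                $row[$prop.Name] = $prop.Value
            }
            $table.Rows.Add($row)
        }
    }
    end { , $table }
}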
UPDATE:
I modified the script using @Charlieface's suggestion and removed the inner foreach statement to see if that would speed it up further. It loaded 4M+ records in about 17 minutes. I used batchsize = 80K, and each insert iteration took about 14 seconds. However, with a CSV input file of the same batch size and record count, each insert iteration takes only about 3 seconds, so I'm guessing that parsing the JSON takes longer than parsing a delimited file.
foreach ($line in [System.IO.File]::ReadLines($pathToJsonFile, $encoding))
{
    $json = $line | ConvertFrom-Json;
    [void]$dataTable.Rows.Add($json.code, $json.name, $json.detail.quantity);
    $i++;

    # Flush a full batch to SQL Server, then reuse the DataTable for the next one.
    if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($dataTable)
        Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $dataTable.Clear()
    }
}
# (Any rows still left in $dataTable after the loop need a final WriteToServer call.)
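The snippet above assumes $bulkcopy, $batchsize, $i, $elapsed, and $dataTable were set up earlier (that part is not shown in the post). A rough sketch of that setup, with a placeholder connection string and table name, could look like this:

# Hypothetical setup for the loop above; the connection string and destination
# table name are placeholders, not taken from the original post.
# System.Data.SqlClient is available out of the box in Windows PowerShell 5.1;
# on PowerShell 7 the Microsoft.Data.SqlClient package may be needed instead.
$connectionString = 'Server=localhost;Database=MyDb;Integrated Security=True'
$batchsize = 80000
$i = 0
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()

$bulkcopy = [System.Data.SqlClient.SqlBulkCopy]::new($connectionString)
$bulkcopy.DestinationTableName = 'dbo.MyTable'
$bulkcopy.BatchSize = $batchsize
$bulkcopy.BulkCopyTimeout = 0   # no timeout for large loads

$dataTable = [System.Data.DataTable]::new()
[void]$dataTable.Columns.Add('code', [string])
[void]$dataTable.Columns.Add('name', [string])
[void]$dataTable.Columns.Add('quantity', [int])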
It might be faster to just create the DataTable up front and add the data to it directly, rather than going through an ArrayList of pscustomobjects:
$dataTable = [System.Data.DataTable]::new();
[void]$dataTable.Columns.Add('code', [string]);
[void]$dataTable.Columns.Add('name', [string]);
[void]$dataTable.Columns.Add('quantity', [int]);

$encoding = [System.Text.Encoding]::UTF8;
foreach ($line in [System.IO.File]::ReadLines($pathToJsonFile, $encoding))
{
    $json = $line | ConvertFrom-Json;
    foreach ($detail in $json.detail)
    {
        [void]$dataTable.Rows.Add($json.code, $json.name, $detail.quantity);
    }
}
You may also want to pre-allocate the DataTable to a large enough capacity up front, to prevent repeated resizing of the underlying storage as rows are added:
$dataTable.MinimumCapacity = 4100000;
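If the per-line ConvertFrom-Json call is what makes the JSON load slower than the CSV one, another thing worth measuring (assuming PowerShell 7+, where the System.Text.Json types are available out of the box) is parsing each line with JsonDocument instead:

# Sketch only: parses each line with System.Text.Json rather than ConvertFrom-Json.
# Assumes PowerShell 7+ and the same $dataTable, $pathToJsonFile and $encoding as above.
foreach ($line in [System.IO.File]::ReadLines($pathToJsonFile, $encoding))
{
    $doc = [System.Text.Json.JsonDocument]::Parse($line)
    try {
        $root = $doc.RootElement
        [void]$dataTable.Rows.Add(
            $root.GetProperty('code').GetString(),
            $root.GetProperty('name').GetString(),
            $root.GetProperty('detail').GetProperty('quantity').GetInt32())
    }
    finally {
        $doc.Dispose()   # JsonDocument uses pooled memory; dispose each document
    }
}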