Tags: c#, json.net, deserialization, handsontable

How to convert Handsontable data back into C# class


I am using an AJAX POST to send the column headers and data from a Handsontable back to an .ashx handler.

   $.ajax({
       type: 'POST',
       url: "Scripts/SaveExcelData.ashx",
       contentType: "application/json; charset=utf-8",
       data: JSON.stringify({"columns": hot.getColHeader(), "rows": hot.getData()}),
       success: function (data) {}
   });

Currently I am using the following to deserialize the request, but I have not been able to come up with anything that converts the rows into an array of DataBaseRow class objects.

// Requires: using Newtonsoft.Json;
var jsonString = String.Empty;
context.Request.InputStream.Position = 0;
using (var inputStream = new StreamReader(context.Request.InputStream))
{
    jsonString = inputStream.ReadToEnd();
    var results = JsonConvert.DeserializeObject<dynamic>(jsonString);
    var columns = results.columns;
    var rows = results.rows;

    //use columns here to convert rows into DataBaseRow class
}

columns looks like: ["Col1","Col2","Col3"]
rows looks like: [["Val1","Val2","Val3"],["Val1","Val2","Val3"],["Val1","Val2","Val3"]]
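
For reference, the payload can also be bound to a typed shape instead of dynamic. A minimal sketch, assuming every cell arrives as a string; the HandsontablePayload class is hypothetical and not part of the original handler:

// Requires: using System.Collections.Generic; using Newtonsoft.Json;
// Hypothetical DTO mirroring the posted JSON: {"columns": [...], "rows": [[...], ...]}
public class HandsontablePayload
{
    public List<string> Columns { get; set; }
    public List<List<string>> Rows { get; set; }
}

// Json.NET matches the lowercase "columns"/"rows" keys case-insensitively.
var payload = JsonConvert.DeserializeObject<HandsontablePayload>(jsonString);
string header = payload.Columns[0]; // "Col1"
string cell = payload.Rows[0][0];   // "Val1"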

How can I do this?

UPDATE
Instead of trying to convert the dynamic object into the DataBaseRow class, I found I could simply loop through the array values manually and write them into new instances of the DataBaseRow class.

using (DBEntities edmx = new DBEntities())
{
    foreach (var row in rows)
    {
        // One DataBaseRow per Handsontable row; add it to the context, then fill its columns.
        DataBaseRow dbr = new DataBaseRow();
        edmx.DataBaseRow.Add(dbr);
        dbr.LoadedTime = DateTime.Now;

        for (int i = 0; i < row.Count; i++)
        {
            string colval = row[i].ToString();
            string colname = columns[i].ToString();
            switch (colname)
            {
                case "Col1":
                    dbr.DBCol1 = colval;
                    break;
                case "Col2":
                    dbr.DBCol2 = colval;
                    break;
                case "Col3":
                    dbr.DBCol3 = colval;
                    break;
            }
        }
    }
    edmx.SaveChanges();
}

This works, but is very slow (see comment for timings). Is there a faster/better way to process this data? (If it matters, I'm actually mapping 14 columns in the switch.)
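
One small cleanup, independent of the speed question: the 14-case switch can be collapsed into a column-name-to-setter lookup. A minimal sketch, assuming the same DataBaseRow properties as above:

// Requires: using System; using System.Collections.Generic;
// Column-name -> setter map; one entry per column replaces the switch cases.
var setters = new Dictionary<string, Action<DataBaseRow, string>>
{
    { "Col1", (r, v) => r.DBCol1 = v },
    { "Col2", (r, v) => r.DBCol2 = v },
    { "Col3", (r, v) => r.DBCol3 = v },
    // ...remaining columns
};

for (int i = 0; i < row.Count; i++)
{
    string colname = columns[i].ToString();
    Action<DataBaseRow, string> setter;
    if (setters.TryGetValue(colname, out setter))
    {
        setter(dbr, row[i].ToString());
    }
}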


Solution

  • So the technical answer to my question can be found in the Update I added (just reference the dynamic objects as arrays; don't try to convert them).

    However, it turns out that Entity Framework is very slow at saving large datasets. The saves can be sped up by grouping them into chunks and recreating the context for every chunk: https://stackoverflow.com/a/5942176/266592 (a sketch of that pattern follows).
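
    A minimal sketch of that chunked-save pattern, assuming an EF6-style DbContext; BuildRow is a hypothetical helper standing in for the column-mapping loop from the update above:

    const int chunkSize = 100;
    DBEntities edmx = new DBEntities();
    edmx.Configuration.AutoDetectChangesEnabled = false; // EF6 DbContext setting
    try
    {
        int count = 0;
        foreach (var row in rows)
        {
            DataBaseRow dbr = BuildRow(columns, row); // hypothetical helper wrapping the mapping above
            dbr.LoadedTime = DateTime.Now;
            edmx.DataBaseRow.Add(dbr);

            if (++count % chunkSize == 0)
            {
                edmx.SaveChanges();
                edmx.Dispose();
                edmx = new DBEntities(); // fresh context: an empty change tracker keeps Add/SaveChanges fast
                edmx.Configuration.AutoDetectChangesEnabled = false;
            }
        }
        edmx.SaveChanges(); // flush the final partial chunk
    }
    finally
    {
        edmx.Dispose();
    }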

    I ended up rewriting this to insert the values into a DataTable and then using SqlBulkCopy to save the records to the database.

    // Requires: using System.Data; using System.Data.SqlClient; using Newtonsoft.Json;
    var jsonString = String.Empty;
    context.Request.InputStream.Position = 0;
    using (var inputStream = new StreamReader(context.Request.InputStream))
    {
        jsonString = inputStream.ReadToEnd();
        var results = JsonConvert.DeserializeObject<dynamic>(jsonString);
        var columns = results.columns;
        var rows = results.rows;
    
        var dt = new DataTable();
        for (int i = 0; i < columns.Count; i++)
        {        
            dt.Columns.Add(columns[i].ToString());
        }
    
        foreach (var row in rows)
        {
            var datarow = dt.NewRow();
            for (int i = 0; i < row.Count; i++)
            {
                datarow[i] = row[i];
            }
            dt.Rows.Add(datarow);
        }
    
        using (var connection = new SqlConnection(ConnectionString))
        {
            SqlTransaction transaction = null;
            connection.Open();
            try
            {
                transaction = connection.BeginTransaction();
                using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, transaction))
                {
                    sqlBulkCopy.DestinationTableName = "TABLENAME";
                    sqlBulkCopy.BatchSize = 100000;
                    sqlBulkCopy.BulkCopyTimeout = 0;
                    foreach (DataColumn col in dt.Columns)
                    {
                        sqlBulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                    }
                    sqlBulkCopy.WriteToServer(dt);
                }
                transaction.Commit();
            }
        catch (Exception)
        {
            // BeginTransaction may have failed, so guard the rollback,
            // and rethrow instead of silently swallowing the error.
            if (transaction != null)
            {
                transaction.Rollback();
            }
            throw;
        }
        }
    }