I upload a text file into my system. The upload arrives as the file content in a byte array. This content is then parsed and converted into a C# class object with many relations.
I need to move this processing into a background task, so that after the upload the UI thread lets the user continue with anything else while the files table is populated in the background.
The background task lives in a separate Console Library project, in `public class UploadFilesService : BackgroundService`, so it has no access to the uploaded files. I think I need to store them in a database table; the background task will then fetch them and run the upload process.
But the input files are big objects for a database table. I considered a column `ObjectAsJson` where the parsed object plus the content byte array are serialized to JSON. However, `nvarchar` has a character limit: the byte array alone surpasses it, and the many-to-many relations add a lot of characters to the JSON on top of that.
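To make the size problem concrete, this is roughly the serialization I was attempting; a sketch assuming System.Text.Json, with placeholder objects:

```csharp
using System.IO;
using System.Text.Json;

byte[] fileBytes = File.ReadAllBytes("input.txt");            // the uploaded content
var parsedObject = new { /* parsed fields and relations */ }; // placeholder for the real parsed object

// Everything packed into one string for the ObjectAsJson column.
// System.Text.Json writes byte[] as Base64, which inflates it by roughly a third,
// and the serialized relations add to the character count on top of that.
string json = JsonSerializer.Serialize(new { Parsed = parsedObject, Content = fileBytes });
```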
Before the background task there was no problem, because everything was kept temporarily in memory. Now I am considering the following solution:
1. Use a SQL BLOB data type (`varbinary(max)`) for the file's content (see the sketch after this list).
2. Parse and convert the input file into the C# class again in the background task. This adds duplication, because during upload I already parse and convert the file to validate its fields; only if the file is valid is it added to a queue to be fetched by the background task.
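For the first point, a minimal sketch of the staging table I have in mind, assuming EF Core on SQL Server; all names here are placeholders:

```csharp
using System;
using Microsoft.EntityFrameworkCore;

public enum UploadStatus { Pending, Processing, Done, Failed }

// One staging row per uploaded file waiting for the background task.
public class PendingUpload
{
    public int Id { get; set; }
    public string FileName { get; set; } = "";
    // EF Core maps byte[] to varbinary(max) on SQL Server by default,
    // so the raw content is stored as a BLOB with no JSON round-trip.
    public byte[] Content { get; set; } = Array.Empty<byte>();
    public UploadStatus Status { get; set; } = UploadStatus.Pending;
    public DateTime CreatedUtc { get; set; } = DateTime.UtcNow;
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
    public DbSet<PendingUpload> PendingUploads => Set<PendingUpload>();
}
```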
There is also the alternative FILESTREAM option in SQL Server, but I think a plain BLOB column is better.
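And for the second point, a rough sketch of the worker side, assuming the `PendingUpload` entity above and an `IServiceScopeFactory` to resolve the scoped `DbContext`; `ParseAndImportAsync` is a placeholder for my existing parse/convert logic:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class UploadFilesService : BackgroundService
{
    private readonly IServiceScopeFactory _scopeFactory;

    public UploadFilesService(IServiceScopeFactory scopeFactory) => _scopeFactory = scopeFactory;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // BackgroundService is a singleton, so resolve the scoped DbContext per iteration.
            using var scope = _scopeFactory.CreateScope();
            var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();

            var next = await db.PendingUploads
                .Where(u => u.Status == UploadStatus.Pending)
                .OrderBy(u => u.Id)
                .FirstOrDefaultAsync(stoppingToken);

            if (next is null)
            {
                // Nothing queued; back off before polling again.
                await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
                continue;
            }

            next.Status = UploadStatus.Processing;
            await db.SaveChangesAsync(stoppingToken);

            try
            {
                // Re-parse the stored bytes; this duplicates the validation parse done at upload.
                await ParseAndImportAsync(next.Content, stoppingToken);
                next.Status = UploadStatus.Done;
            }
            catch
            {
                next.Status = UploadStatus.Failed;
            }
            await db.SaveChangesAsync(stoppingToken);
        }
    }

    private Task ParseAndImportAsync(byte[] content, CancellationToken ct)
        => Task.CompletedTask; // placeholder for the real parse/convert pipeline
}
```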
How would you approach this problem of providing the input files to your background task?
Update: EF maps a `string` property to `nvarchar(max)` by default. According to other answers, it can hold about 1 billion characters, which is enough to keep any object as JSON, even one with many relations.
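For reference, the default mapping in question; a sketch with a placeholder entity name:

```csharp
public class FileSnapshot
{
    public int Id { get; set; }
    // With no explicit length configured, EF Core maps string to nvarchar(max)
    // on SQL Server, which holds up to 2 GB, i.e. about 1 billion UTF-16 characters.
    public string ObjectAsJson { get; set; } = "";
}
```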