I've implemented back-end functionality where one can upload a CSV or Excel file and the server (Django API) will read and save all the data into the database, but I feel like it's not the best approach and would like to get advice about this.
I don't see why the server is required to read and validate the CSV file when the client (Angular app) could do that job. The client can read and validate the file and then send the data to the server.
Then the server only has to save the data into the database. But come to think of it, if the file contains a million entries, the server endpoint would be inserting a million items, which will cause performance issues. What is the best approach to handle this?
I implemented an Angular/Python app that processes large Excel files, even ones with multiple sheets, and the best approach is to let the client upload the file to the server and let the server do the rest of the job.
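For example, here is a minimal sketch of the upload endpoint, assuming Django REST Framework and Celery; the view and task names are illustrative, not from the question:

```python
# Minimal upload-endpoint sketch, assuming Django REST Framework.
# UploadView and process_upload are hypothetical names for illustration.
import uuid

from django.core.files.storage import default_storage
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView

from .tasks import process_upload  # hypothetical Celery task, sketched below


class UploadView(APIView):
    def post(self, request):
        upload = request.FILES["file"]
        # Save the raw file to a temporary location; no parsing happens here.
        job_id = str(uuid.uuid4())
        path = default_storage.save(f"uploads/{job_id}_{upload.name}", upload)
        # Hand the heavy work to a background worker and return immediately,
        # giving the client a job ID it can use to poll for progress.
        process_upload.delay(job_id, path)
        return Response({"job_id": job_id}, status=status.HTTP_202_ACCEPTED)
```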
The processing can take a long time, so you may need to save the file to a temporary location and process it asynchronously as a job. The job reads the file, processes each row, and saves the data to the database. Optionally, it can also record its progress and status in the database, so the client can track the processing progress given the job ID.
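A sketch of such a background job for the CSV case, assuming Celery and pandas; `ImportJob` and `Record` are hypothetical models used only for illustration. Reading in chunks keeps memory flat even for a million-row file, and `bulk_create` inserts each chunk in a single query instead of a million individual INSERTs:

```python
# Background-job sketch, assuming Celery and pandas are installed.
# ImportJob (job status/progress) and Record (the imported rows) are
# hypothetical models, not part of the original question.
import pandas as pd
from celery import shared_task
from django.core.files.storage import default_storage

from .models import ImportJob, Record


@shared_task
def process_upload(job_id, path):
    job = ImportJob.objects.create(id=job_id, status="processing", rows_done=0)
    try:
        # Stream the CSV in chunks so the whole file never sits in memory;
        # each chunk is persisted with one bulk INSERT.
        with default_storage.open(path) as f:
            for chunk in pd.read_csv(f, chunksize=5000):
                Record.objects.bulk_create(
                    Record(**row) for row in chunk.to_dict("records")
                )
                job.rows_done += len(chunk)
                job.save(update_fields=["rows_done"])  # progress for the client
        job.status = "done"
    except Exception as exc:
        job.status = f"failed: {exc}"
    finally:
        job.save(update_fields=["status"])
```

The client can then poll a status endpoint with the returned `job_id` and read `rows_done`/`status` to display progress while the import runs.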