I am building a Rails app deployed on Heroku, and I need to import and process large CSV files (5,000+ lines).
Doing it in the controller with the built-in Ruby CSV parser takes over 30 seconds, which exceeds Heroku's request timeout and kills the dyno's request.
I was thinking of storing the CSV in the database and then processing it with delayed_job, but that approach maxes out at just over 4,200 lines.
I am using MySQL with a LONGTEXT column for the file contents, so the database itself should be able to handle it.
Any ideas for this use case?
Below is sample code to import the file:
n = SmarterCSV.process(params[:file].path, chunk_size: 100) do |chunk|
  Resque.enqueue(ImportDataMethod, chunk)
end
After reading the file, it hands each chunk of records to Resque, which imports them in the background (if you are using Rails 4.2 or above, you can combine this with Active Job). Note the chunk_size option: SmarterCSV only yields chunks to the block when it is set, and keeping each Resque job to a small batch means no single job runs long.
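On the worker side, the job class referenced above needs a self.perform method. One gotcha worth knowing: Resque serializes job arguments as JSON, so the symbol keys SmarterCSV produces arrive in perform as strings. A minimal sketch of such a worker is below; the queue name and the key restoration are my assumptions, and the record-creation step is only indicated in a comment since the model depends on your app.

```ruby
require 'json'

# Hypothetical worker consumed by Resque.enqueue(ImportDataMethod, chunk).
class ImportDataMethod
  @queue = :csv_import  # assumed queue name

  # Resque round-trips arguments through JSON, so the hashes SmarterCSV
  # built with symbol keys come back with string keys; restore them here.
  def self.perform(chunk)
    chunk.map { |row| row.transform_keys(&:to_sym) }
    # ...then create a record from each row, e.g. YourModel.create!(row)
  end
end

# Simulate what Resque actually passes to perform: the enqueued chunk
# after a JSON serialize/deserialize round trip.
chunk = [{ name: "Ada", email: "ada@example.com" }]
roundtripped = JSON.parse(JSON.generate(chunk))
restored = ImportDataMethod.perform(roundtripped)
```

With chunks of 100 rows each, a 5,000-line file becomes roughly 50 short jobs, so no single job approaches the timeout and a failed chunk can be retried without re-importing the whole file.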