We are developing an API with Laravel.
My app needs to fetch large sets of data from another server in the background every 20 minutes or so, then process the data (basically compare records in for loops) and write all of it to the database (thousands of rows created or updated).
This operation needs to run indefinitely while my app serves this data to users; there are mobile and web apps that connect to this application.
This process will not be changed or stopped, and I don't know if there is a way to do this without creating a scheduled command (cron).
How would you implement this, considering these factors:
There are a lot of users for this app from the get-go.
The data being processed grows bigger over time.
This is a DB-intensive operation (lots of updates and writes).
Users dynamically add jobs such as image uploads or SMS sending, so this process needs to run at high priority, ahead of other jobs, without human intervention.
Create a job for each API call that receives the data (query jobs).
Store the data in multi-dimensional arrays and write it in chunks with the DB query builder.
Create jobs for processing the data, dispatched from inside the query jobs (I don't know if this is a good idea or not), or maybe use the chain method.
Queue these jobs on a separate queue, such as api_calls.
Create a command like server:update that dispatches the initial (query) jobs.
Schedule the query jobs to run every 30 minutes and have them manually dispatch the processing jobs.
Use Supervisor to monitor the queues.
Deploy and run server:update
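The first two steps of my plan could look roughly like this. This is a sketch only: the class names (FetchServerData, ProcessServerData), the endpoint URL, and the items table with its external_id/value columns are all placeholders, not real names from my app. Note that upsert() requires Laravel 8+; on older versions each row would need updateOrInsert().

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;

// Hypothetical query job: fetches one batch of remote data.
// In a real app the two classes below would live in separate files.
class FetchServerData implements ShouldQueue
{
    use Dispatchable, Queueable;

    public int $timeout = 300; // large payloads may take a while

    public function handle(): void
    {
        // Placeholder endpoint; replace with the real API.
        $rows = Http::get('https://example.com/api/data')->json();

        // Hand the raw data to a processing job on the same queue.
        ProcessServerData::dispatch($rows)->onQueue('api_calls');
    }
}

// Hypothetical processing job: compares/normalizes rows, then writes them.
class ProcessServerData implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(private array $rows) {}

    public function handle(): void
    {
        // Write in chunks so a single query never carries thousands of rows.
        foreach (array_chunk($this->rows, 500) as $chunk) {
            DB::table('items')->upsert(
                $chunk,
                ['external_id'],         // unique key to match on
                ['value', 'updated_at']  // columns to refresh on conflict
            );
        }
    }
}
```

Chunking the upsert keeps individual statements small, which matters once the data set grows.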
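For the "chain method" alternative, Laravel's Bus::chain runs jobs sequentially and stops the chain if an earlier job fails, which avoids dispatching processing jobs from inside the query jobs. A sketch, again with hypothetical FetchServerData and ProcessServerData job classes:

```php
use App\Jobs\FetchServerData;
use App\Jobs\ProcessServerData;
use Illuminate\Support\Facades\Bus;

// Each job runs only after the previous one succeeds; if the fetch
// fails, the processing job never runs at all.
Bus::chain([
    new FetchServerData(),
    new ProcessServerData(),
])->onQueue('api_calls')->dispatch();
```

One caveat: chained jobs do not pass return values to each other, so with this design the fetch job would have to persist its payload somewhere (for example a staging table) for the processing job to pick up.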
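The server:update command plus its schedule entry could be sketched like this (UpdateServer and FetchServerData are hypothetical names):

```php
<?php

// app/Console/Commands/UpdateServer.php (sketch)
namespace App\Console\Commands;

use App\Jobs\FetchServerData;
use Illuminate\Console\Command;

class UpdateServer extends Command
{
    protected $signature = 'server:update';
    protected $description = 'Dispatch the query jobs that refresh remote data';

    public function handle(): void
    {
        // Kick off the hypothetical query job on the dedicated queue.
        FetchServerData::dispatch()->onQueue('api_calls');
    }
}
```

```php
// app/Console/Kernel.php (sketch)
protected function schedule(Schedule $schedule): void
{
    $schedule->command('server:update')
        ->everyThirtyMinutes()
        ->withoutOverlapping(); // skip a run if the previous one is still going
}
```

The scheduler itself still needs the single standard cron entry (`* * * * * php artisan schedule:run`) on the server, so one cron line is unavoidable, but everything else stays in code.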
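For Supervisor and the priority requirement: passing queues to queue:work in order makes workers drain api_calls before default, so the background refresh jumps ahead of user-generated jobs without any human intervention. A sketch of a Supervisor program block; the app path and process count are assumptions:

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
; Queue order gives api_calls priority over default automatically.
command=php /var/www/app/artisan queue:work --queue=api_calls,default --sleep=3 --tries=3
numprocs=4
autostart=true
autorestart=true
stopwaitsecs=3600
```

Supervisor restarts crashed workers on its own, which covers the "runs indefinitely" requirement; numprocs can be raised as the data set and user load grow.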