In my Rails app, I use a worker process to scan ~45k database records every 6 hours and send out mails if certain conditions are met. This causes the server CPU/load to spike while the worker is processing, so other server requests take a performance hit. I tried using find_in_batches to retrieve 1,000 records at a time and do the processing, but CPU utilization still peaks; I saw no big difference. Is there any way to handle this so the CPU utilization doesn't hit the max limit?
Fiddling with the process priority level using nice is one way to do it, but another is to tell your app to chill out a little bit now and then using the sleep or select command:
while doing_stuff
  do_stuff
  # Take a break for 0.2 seconds
  select(nil, nil, nil, 0.2)
end
The select call will block for a brief period of time, allowing other tasks on the system to run freely. The higher you set this value, the slower your job will run, but the lower the impact on the CPU load.
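Combined with the batching you are already doing, the pattern looks like the sketch below. It uses each_slice over a plain array to stand in for find_in_batches (so it runs outside Rails); records, the batch size, and the 0.05-second pause are all placeholder values you would tune for your own workload:

    # Process records in batches, pausing between batches so the OS
    # scheduler can give CPU time to the web server's request handlers.
    records = (1..5_000).to_a   # stand-in for the 45k database rows

    processed = 0
    records.each_slice(1_000) do |batch|
      # In Rails, this inner loop would be the body of a
      # Model.find_in_batches(batch_size: 1_000) block.
      batch.each do |record|
        processed += 1          # stand-in for the mail-condition check
      end
      sleep(0.05)               # throttle: yield the CPU between batches
    end

    puts processed

Longer pauses (or smaller batches) smooth the load further at the cost of total run time. You can also combine this with nice by starting the worker with something like `nice -n 19 bundle exec ...`, so the kernel deprioritizes it whenever the web processes need CPU.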