Tags: laravel, queue, jobs, worker, rate-limiting

Laravel multiple workers and being rate limited


I have an application with a queue and multiple workers. The workers are scaled based on the number of jobs in the queue. One of the main purposes of the jobs is to make API calls to an external service to retrieve data. The external service rate-limits API calls to 2 calls/second using a leaky bucket algorithm (https://en.wikipedia.org/wiki/Leaky_bucket). My API calls are not made against the service as a whole but against individual accounts of the service, and the rate limit is per account: it only applies when I make multiple API calls towards the same account, and it is not shared among different accounts. The number of jobs that make API calls is in the thousands (for one account alone), so the bucket will always fill up and I will start to get rate limited.
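To make the limit concrete, here is a small, self-contained simulation of the leaky-bucket behaviour described above: the bucket leaks 2 calls per second, and bursts accumulate until the bucket is full, after which calls are rejected. The class name and the capacity of 40 are illustrative assumptions, not values taken from the external service:

```php
<?php
// Hypothetical leaky bucket (meter variant): each call adds one unit,
// the bucket drains at a fixed rate, and a call is rejected when the
// bucket would overflow. Capacity 40 is an assumed burst allowance.
class LeakyBucket
{
    private float $level = 0.0;     // current bucket fill
    private float $lastTime = 0.0;  // time of the last attempt

    public function __construct(
        private float $capacity,    // max burst the service tolerates
        private float $leakPerSec   // sustained rate, e.g. 2 calls/sec
    ) {}

    // Attempt a call at time $now (in seconds). Returns true if allowed.
    public function attempt(float $now): bool
    {
        // Drain the bucket for the elapsed time.
        $this->level = max(0.0, $this->level - ($now - $this->lastTime) * $this->leakPerSec);
        $this->lastTime = $now;

        if ($this->level + 1 > $this->capacity) {
            return false;           // bucket full: rate limited
        }
        $this->level += 1;
        return true;
    }
}

// With capacity 40 and a leak rate of 2/sec, a burst of 40 instant calls
// succeeds, the 41st is rejected, and after 1 second two slots free up.
$bucket = new LeakyBucket(40, 2.0);
$allowed = 0;
for ($i = 0; $i < 41; $i++) {
    if ($bucket->attempt(0.0)) {
        $allowed++;
    }
}
echo $allowed, "\n";              // 40
var_dump($bucket->attempt(1.0));  // bool(true): 2 calls leaked out
```

This is why, with thousands of jobs against one account, the bucket is permanently full: jobs arrive far faster than the 2/sec drain.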

The problem is that my jobs do not observe the rate limit before hitting it. I could delay a job and place it back in the queue once it gets rate limited, but because I have multiple workers and thousands of jobs, I might end up in an endless cycle of hitting the limit and re-queueing.

I've thought about using something like Memcached or Redis to store state that can be shared among the workers, so that before attempting an API call in a job I could check it and, if the limit would be exceeded, delay the job and place it back in the queue instead of triggering the rate limit. The problem with this approach is that every time I do that, the job's attempt count increases, and I have a maximum number of attempts per job. I also realized that the attempt count cannot be changed in any way through the Laravel worker, and I am fine with that; I believe it makes the worker more stable and predictable. The only way to reset the attempt count is to delete the job and redispatch it as a new one (which feels very messy).
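The shared-state check could be sketched as a per-account limiter that each worker consults before calling the API. Below, a plain array stands in for Redis so the logic is runnable; in the real application the same check would need to be an atomic Redis operation, and the fixed 1-second window is a simplification of the service's leaky bucket. All names here are illustrative:

```php
<?php
// Hypothetical per-account limiter: before each API call, the job asks
// whether the account still has capacity in the current window; if not,
// the caller should release the job back onto the queue with a delay
// instead of making the call and getting rate limited.
class AccountLimiter
{
    private array $windows = [];    // accountId => [windowStart, count]

    public function __construct(
        private int $maxPerWindow,  // e.g. 2 calls ...
        private int $windowSeconds  // ... per 1-second window
    ) {}

    // Attempt a call for $accountId at time $now (seconds).
    public function tryAcquire(string $accountId, int $now): bool
    {
        [$start, $count] = $this->windows[$accountId] ?? [$now, 0];
        if ($now - $start >= $this->windowSeconds) {
            [$start, $count] = [$now, 0];    // window expired: reset
        }
        if ($count >= $this->maxPerWindow) {
            return false;                    // over limit: delay the job
        }
        $this->windows[$accountId] = [$start, $count + 1];
        return true;
    }
}

// Limits are independent per account: exhausting acct-1's window
// does not affect acct-2, matching the service's behaviour.
$limiter = new AccountLimiter(2, 1);
var_dump($limiter->tryAcquire('acct-1', 0)); // bool(true)
var_dump($limiter->tryAcquire('acct-1', 0)); // bool(true)
var_dump($limiter->tryAcquire('acct-1', 0)); // bool(false)
var_dump($limiter->tryAcquire('acct-2', 0)); // bool(true)
```

Because the key is per account, one busy account never blocks workers from servicing jobs for other accounts, which addresses the objection to sleeping a whole worker.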

If anyone has ever come across an issue like this, would you be willing to share how the queue implementation would look in this case? Should workers share state between them (this feels wrong)?

NOTE: I don't believe that sleeping a worker is a good way of handling the rate limit, because the worker might be making API calls to different accounts of the service (depending on which jobs are next in the queue). If the worker sleeps because of one account while it could be servicing another, that would be wasted time.

Thanks


Solution

  • Sounds like you are using the Shopify API?

    I managed to use this in Laravel 5.5 to get this working. From the documentation:

    Redis::throttle('key')->allow(2)->every(1)->then(function () {
        // Job logic...
    }, function () {
        // Could not obtain lock...
    
        return $this->release(10);
    });
    

    https://laravel.com/docs/5.5/queues#rate-limiting
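
    Building on that snippet, one way this might look for the per-account case is to make the throttle key include the account identifier (so one account's limit does not block jobs for other accounts) and to use a `retryUntil()` deadline instead of a fixed attempt count, so throttle releases do not exhaust the job's attempts. Laravel 5.5 documents both `Redis::throttle` and time-based attempts via `retryUntil()`; the job class and property names below are illustrative assumptions, not code from the question:

    use Illuminate\Support\Facades\Redis;

    // Hypothetical job; all names are illustrative.
    class FetchAccountData implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable;

        public function __construct(private string $accountId)
        {
        }

        public function handle()
        {
            // Throttle per account: 2 calls per second, matching the
            // external service's per-account leaky bucket.
            Redis::throttle('api-limit:' . $this->accountId)
                ->allow(2)->every(1)
                ->then(function () {
                    // Make the API call for this account...
                }, function () {
                    // Lock not obtained: put the job back with a delay.
                    return $this->release(10);
                });
        }

        // With a retryUntil() deadline, the worker retries the job
        // (including throttle releases) until this time instead of
        // counting attempts, which avoids the max-attempts problem.
        public function retryUntil()
        {
            return now()->addMinutes(30);
        }
    }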