Tags: php, beanstalkd, pheanstalk

Beanstalk on a centralized server: how to avoid duplicate work across workers?


I have a server that runs Beanstalkd, and several independent servers that run the workers, written in PHP with Pheanstalk.

Between the moment worker A gets a job:

$job = $pheanstalk->watch('tube')
    ->ignore('default')
    ->reserve();

$data = json_decode($job->getData(), true);

and the moment it deletes the job ($pheanstalk->delete($job);), a few tens of seconds can pass.

Does Beanstalkd know that the job is being processed, so that no other worker will get it, or will I have a concurrency problem (two workers taking the same job)?

Thank you for your help.


Solution

  • Yes: when a worker reserves a job, Beanstalkd marks it as reserved and will not hand it to any other worker while that reservation holds. The one caveat is the job's TTR (time-to-run, 60 seconds by default in Pheanstalk): if processing takes longer than the TTR, Beanstalkd releases the job back to the ready queue and another worker may pick it up. Long-running workers should call touch() on the job to keep the reservation alive.
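A minimal worker-loop sketch illustrating this, based on the chained watch/ignore/reserve style from the question (it assumes the Pheanstalk 3.x API, a beanstalkd instance on 127.0.0.1, and the tube name 'tube' -- adjust these to your setup):

    <?php
    // Hypothetical worker sketch -- host, tube name and error handling are
    // placeholders, not taken from the original question.
    require 'vendor/autoload.php';

    use Pheanstalk\Pheanstalk;

    $pheanstalk = new Pheanstalk('127.0.0.1');

    while (true) {
        // reserve() takes the job off the ready queue and marks it reserved:
        // beanstalkd will not give it to any other worker while the
        // reservation (its TTR) holds.
        $job = $pheanstalk->watch('tube')
            ->ignore('default')
            ->reserve();

        $data = json_decode($job->getData(), true);

        try {
            // ... do the work (may take tens of seconds) ...

            // For long-running jobs, touch() resets the TTR so the job is
            // not released back to the ready queue while it is still being
            // processed.
            $pheanstalk->touch($job);

            // Done: remove the job for good.
            $pheanstalk->delete($job);
        } catch (\Exception $e) {
            // On failure, put the job back so another worker can retry it.
            $pheanstalk->release($job);
        }
    }

The important design point is that the processing time must stay within the job's TTR (or the worker must touch() it periodically); otherwise the reservation expires and the concurrency problem described in the question can occur.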