I have two systemd services: one handles my celery workers (10 queues for different tasks) and one handles celery beat.
After deploying new code I restart the celery worker service so it picks up the new tasks and the updated celery jobs.
Should I restart celery beat together with the celery worker service,
or does it pick up the new tasks automatically?
It depends on which scheduler you're using.

If it's the default PersistentScheduler, then yes, you need to restart the beat daemon so it can pick up the new configuration from the beat_schedule setting, which is only read when beat starts.
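For reference, this is roughly what a static schedule looks like with the default scheduler; the project, module, and task names below are made up, so adjust them to your app:

```python
# celeryconfig / app module -- names here are illustrative
from celery import Celery
from celery.schedules import crontab

app = Celery("myproject", broker="redis://localhost:6379/0")

# beat reads this mapping once at startup; changing it requires restarting beat
app.conf.beat_schedule = {
    "send-report-every-morning": {
        "task": "myproject.tasks.send_report",   # hypothetical task
        "schedule": crontab(hour=7, minute=30),  # every day at 07:30
    },
    "cleanup-every-15-minutes": {
        "task": "myproject.tasks.cleanup",       # hypothetical task
        "schedule": 15 * 60,                     # a plain number of seconds also works
    },
}
```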
But if you're using something like django-celery-beat, which lets you manage periodic tasks at runtime because the schedule is stored in the database, then you don't have to restart celery beat.
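As a rough sketch (the entry name and task path are hypothetical), with django-celery-beat you can add or change a periodic task from application code or a deploy script, and beat picks it up without a restart, provided beat was started with the DatabaseScheduler:

```python
# Start beat with:
#   celery -A myproject beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# Create (or reuse) an "every 10 minutes" interval.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.MINUTES,
)

# Register the periodic task at runtime; the DatabaseScheduler
# notices the change, so no beat restart is needed.
PeriodicTask.objects.update_or_create(
    name="sync-orders",                         # hypothetical entry name
    defaults={
        "task": "myproject.tasks.sync_orders",  # hypothetical task path
        "interval": schedule,
        "enabled": True,
    },
)
```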