I was looking today into Amazon SQS as an alternative to installing my own RabbitMQ on an EC2 instance.
I have followed the documentation as described here.
In one paragraph it says:
SQS does not yet support events, and so cannot be used with celery events, celerymon or the Django Admin monitor.
I am a bit confused about what "events" means here. For example, in the scenario below I have a periodic task that runs every minute and calls sendEmail.delay(event) asynchronously:
    @celery.task(name='tasks.check_for_events')
    @periodic_task(run_every=datetime.timedelta(minutes=1))
    def check_for_events():
        now = datetime.datetime.utcnow().replace(tzinfo=utc, second=0, microsecond=0)
        events = Event.objects.filter(reminder_date_time__range=(now - datetime.timedelta(minutes=5), now))
        for event in events:
            sendEmail.delay(event)

    @celery.task(name='tasks.sendEmail')
    def sendEmail(event):
        event.sendMail()
When running it with Amazon SQS I get this error message:
tasks.check_for_events[7623fb2e-725d-4bb1-b09e-4eee24280dc6] raised exception: TypeError(' is not JSON serializable',)
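The TypeError suggests the problem is serialization rather than "events" in Celery's monitoring sense: the default JSON serializer cannot encode arbitrary Python objects such as Django model instances. A minimal sketch (using a stand-in class, not the actual Event model) reproduces the same kind of failure:

```python
import json


class Event:
    """Stand-in for a Django model instance (illustrative only)."""
    pass


try:
    json.dumps(Event())  # task arguments must survive this round-trip
except TypeError as exc:
    print(exc)  # the same kind of error Celery raises when serializing task args
```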
So is that the limitation of SQS pointed out in the documentation, or am I doing something fundamentally wrong? Many thanks for any advice.
I might have found the solution. Simply refactor the sendMail() call on the event into the main task, so there is no need to serialize the object to JSON:
    @celery.task(name='tasks.check_for_events')
    @periodic_task(run_every=datetime.timedelta(minutes=1))
    def check_for_events():
        now = datetime.datetime.utcnow().replace(tzinfo=utc, second=0, microsecond=0)
        events = list(Event.objects.filter(reminder_date_time__range=(now - datetime.timedelta(minutes=5), now)))
        for event in events:
            subject = 'Event Reminder'
            link = None
            message = ...
            sendEmail.delay(subject, message, event.user.email)

    @celery.task(name='tasks.sendEmail')
    def sendEmail(subject, message, email):
        send_mail(subject, message, settings.DEFAULT_FROM_EMAIL, [email])
This works with both RabbitMQ and Amazon SQS.
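Another common pattern, when the task really does need the model instance, is to pass only the primary key and re-fetch the object inside the task. A minimal sketch of the idea, using a plain dict as a stand-in for the database (not real Django code; send_email_task and EVENTS are illustrative names):

```python
import json

# In-memory store standing in for the database (illustrative only).
EVENTS = {1: {"user_email": "user@example.com", "subject": "Event Reminder"}}


def send_email_task(event_id):
    # Re-fetch by primary key inside the task; only the int crosses the broker.
    event = EVENTS[event_id]
    return (event["subject"], event["user_email"])


# Only JSON-serializable arguments are put on the queue:
payload = json.dumps({"args": [1]})
args = json.loads(payload)["args"]
print(send_email_task(*args))  # → ('Event Reminder', 'user@example.com')
```

In Django terms this would mean calling sendEmail.delay(event.pk) and doing Event.objects.get(pk=event_id) inside the task, at the cost of one extra query per task.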