I am working on an AI image processing job using Django REST framework, Python 3, TensorFlow, and Keras, with Celery for asynchronous task processing and Redis as the broker. When I execute a Celery task, the worker receives it but gets stuck partway through, every single time. It runs fine on my local machine; the problem only appears when I deploy it on an Amazon EC2 g3s.xlarge instance with the Deep Learning AMI (Linux).
@task(name="predict")
def work_out(cow_front_image, cow_back_image):
    # detect_cow_weight runs the TensorFlow/Keras model on the two images and returns the estimated weight
    return detect_cow_weight(cow_front_image, cow_back_image)
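For context, the Celery wiring follows the standard Django pattern, roughly like the sketch below; the module path, app name, and Redis URL are assumptions for illustration and are not copied from the actual project.

# prodapi/celery.py -- minimal sketch of the assumed Celery/Redis wiring
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "prodapi.settings")

app = Celery(
    "prodapi",
    broker="redis://localhost:6379/0",   # Redis as the message broker (URL is an assumption)
    backend="redis://localhost:6379/0",  # Redis as the result backend
)
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()  # register tasks defined in each installed app's tasks.py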
This is a large project, so I am not sure how to show all of the code here.
To repeat: it runs fine and quite comfortably on my local machine, and I reused all of the configuration from one of our existing production-grade servers.
I expect the Celery task to execute like this: I pass two images as arguments, the worker processes them in the background, and it returns the result of what it detected.
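For illustration, this is roughly how the task is dispatched and how the result is collected; the import path and image file names are hypothetical.

# Dispatching the task from Django/DRF code (module path and file names are hypothetical)
from prodapi.tasks import work_out

result = work_out.delay("/tmp/cow_front.jpg", "/tmp/cow_back.jpg")  # queued via Redis
weight = result.get(timeout=300)  # blocks until the Celery worker returns the prediction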
I found the fix: --pool=solo. My understanding is that the default prefork pool forks worker processes, which the TensorFlow/CUDA runtime on the GPU instance does not handle well, while the solo pool runs each task in the main worker process, so the worker no longer gets stuck.
celery -A prodapi worker -l info --without-gossip --without-mingle --without-heartbeat -Ofair --pool=solo
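If you do not want to pass the flag on every invocation, the same behaviour can be set in configuration; this is just a sketch, assuming Celery is configured via Django settings with the CELERY_ namespace (an assumption, not shown above).

# settings.py -- equivalent of passing --pool=solo on the command line
CELERY_WORKER_POOL = "solo"  # run tasks in the main worker process instead of forked children

# Or set it directly on the Celery app object:
# app.conf.worker_pool = "solo"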