I added this to my celery.py:

from celery import chain, group

@app.task(bind=True)
def execute_analysis(self, id_=1):
    # Build two groups of 10 task signatures each and chain them.
    task1 = group(news_event_task.si(i) for i in range(10))
    task2 = group(parallel_task.si(i) for i in range(10))
    return chain(task1, task2)()
Problem: You are calling too many functions (tasks) sequentially in the same process, so if any one task (e.g. scraping news data) gets blocked, all the others will keep waiting and may end up blocked as well.
Solution: A better design would be to dispatch each news_event_task with delay(), and if a news_event_task then needs to call parallel_task, both can run in the same process. That way all the tasks run in parallel (use the celery eventlet pool to achieve this); see the sketch below.
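A minimal sketch of that design, reusing the news_event_task / parallel_task names from the question. fetch_news and process_news are hypothetical placeholders for the actual scraping and processing logic, and 'proj' stands in for your existing app module:

from celery import Celery

app = Celery('proj')  # placeholder; in practice use the app already in your celery.py

@app.task
def parallel_task(data):
    process_news(data)  # hypothetical follow-up processing

@app.task
def news_event_task(i):
    # A blocking scrape here only blocks this one task, not the others.
    data = fetch_news(i)  # hypothetical scraper call
    # The follow-up work belongs to the same news event, so it can run
    # in the same process (or be dispatched as its own task, as here).
    parallel_task.delay(data)

@app.task
def execute_analysis():
    # Fan out: each news_event_task is an independent message,
    # so the workers can run all of them in parallel.
    for i in range(10):
        news_event_task.delay(i)

To actually get high concurrency for I/O-bound scraping, start the worker with the eventlet pool, e.g. celery -A proj worker -P eventlet -c 100.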
Another approach could be to send these tasks to a queue (rather than holding their whole sequence in memory) and then process each news_event_task one by one.
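For example, a rough sketch of that approach using a dedicated queue; the queue name news_events is illustrative:

# Publish each task to a dedicated queue instead of building the
# whole chain/group structure up front in memory.
for i in range(10):
    news_event_task.apply_async(args=(i,), queue='news_events')

A worker bound to that queue with concurrency 1 will then drain it one task at a time: celery -A proj worker -Q news_events -c 1.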