I am trying to run my tasks using Airflow with the Celery Executor. I have created one DAG that contains approximately 60,000 tasks. When I trigger the DAG it enters the running state but never schedules my tasks; it has stayed in the running state for two days without scheduling anything.
The main issue is this: if the DAG has 5,000 tasks, they are scheduled in under a minute, but if I try to schedule more than that, the tasks are never scheduled at all.
I have tried different solutions to fix this. The first was setting non_pooled_task_slot_count, but that does not work because newer versions of Airflow no longer support the non_pooled_task_slot_count option.
I have also tried CELERYD_COUNT, which did not work in my case either. I have tried almost every change that might help, but none of them works for me.
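For context, here is a sketch of the concurrency-related settings I have been experimenting with in airflow.cfg. The option names assume Airflow 1.10.x (they may differ in other versions), and the values are just examples, not a known-good configuration:

```ini
# airflow.cfg -- concurrency settings (Airflow 1.10.x names; example values only)
[core]
# Maximum number of task instances that can run concurrently
# across the whole Airflow installation.
parallelism = 128

# Maximum number of task instances allowed to run concurrently
# within a single DAG run.
dag_concurrency = 64

[celery]
# Number of worker processes each Celery worker starts.
worker_concurrency = 32
```

Even with these raised well above their defaults, the scheduler still does not schedule the tasks once the DAG exceeds roughly 5,000 of them.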
Can someone let me know how I can schedule this many tasks?