Component configuration
All of the components within Airflow are shared by every user (and DAG) of a deployment. Operational paradigms and conventions can be used to isolate information and provide a measure of security, but they generally cannot be strictly enforced as policy.
The Celery Executor
The Celery Executor launches multiple worker processes that pick up work from the broker for execution. By design, these processes share the CPU, memory, and local disk of the worker machine, so neighboring workloads can collide with and clobber one another. If you wish to direct workloads to specific workers (or isolate them), you can use the queueing mechanism. To do so, ensure that each worker starts listening on a named queue (airflow celery worker -q queue_name). You can then route tasks to specific queues by passing the queue name to the operator's queue keyword argument.
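As a rough sketch of how that routing might look in a DAG file (the dag_id, the high_memory queue name, and the bash commands are illustrative placeholders, and the syntax assumes Airflow 2.4 or later):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="queue_routing_example",   # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # Picked up only by workers started with:
    #   airflow celery worker -q high_memory
    heavy_task = BashOperator(
        task_id="heavy_task",
        bash_command="run_heavy_job.sh",  # placeholder command
        queue="high_memory",
    )

    # No queue set, so this runs on workers listening on the default queue.
    light_task = BashOperator(
        task_id="light_task",
        bash_command="echo 'light work'",
    )

    heavy_task >> light_task

Under these assumptions, a worker started with airflow celery worker -q high_memory would pick up only heavy_task, while workers listening on the default queue handle everything else.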
Each worker...