I have Apache Airflow running in a Docker environment.
I have DAGs that were developed for different virtual Python environments (different package compositions, different Python versions).
What is a good way to execute DAGs in different virtual environments, and how should these environments be created?
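For context, the built-in mechanism I am aware of is Airflow's PythonVirtualenvOperator, which builds a fresh venv per task run. A minimal DAG-definition sketch (DAG id, task id, and the numpy pin are just examples, not my actual setup):

```python
from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator
import pendulum


def my_task():
    # Imports must live inside the callable, because it executes
    # in the freshly created venv, not in the scheduler's interpreter.
    import numpy as np
    print(np.__version__)


with DAG(
    dag_id="venv_example",  # example name
    start_date=pendulum.datetime(2023, 1, 1),
    schedule=None,
) as dag:
    PythonVirtualenvOperator(
        task_id="run_in_venv",
        python_callable=my_task,
        requirements=["numpy==1.26.4"],  # example pin
        system_site_packages=False,
    )
```

As I understand it, this does not solve the different-Python-versions part, only the different-package-compositions part, which is why I am asking about pre-built environments as well.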
What I have tried:
I have tried different ways to create/copy a virtual environment so that it was accessible to the Airflow container, but the process was messy and it didn't work. For example, I entered the Airflow container:
docker exec -it docker-airflow-scheduler-1 bash
Once inside the container I created an environment:
python -m venv env_name
source env_name/bin/activate
When I then try to install packages with "pip install some_package", I get an error saying I cannot perform a user install. As a hack I tried changing a line in 'env_name/pyvenv.cfg'
from: include-system-site-packages = false
to: include-system-site-packages = true
I deactivated and reactivated the environment and tried installing again:
pip install -r requirements
pip list
But it seems I am still seeing the globally installed packages.
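To sanity-check what the venv itself is configured to see, I inspected 'pyvenv.cfg' programmatically. A minimal stdlib-only sketch (the directory name "demo_env" is just an example):

```python
import tempfile
import venv
from pathlib import Path

# Create a throwaway venv (without pip, to keep it fast) and inspect its config.
tmp = Path(tempfile.mkdtemp())
env_dir = tmp / "demo_env"
venv.create(env_dir, with_pip=False, system_site_packages=False)

cfg = (env_dir / "pyvenv.cfg").read_text()
print(cfg)
# The line "include-system-site-packages = false" is what hides globally
# installed packages from the venv; flipping it to true (as I did above)
# makes the global packages visible in `pip list` again.
```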
I found that adding "--isolated" solves the problem:
pip install --isolated some_package
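My understanding (as an assumption, not something I have verified in the docs) is that "--isolated" makes pip ignore environment variables and user-level configuration such as a pip.conf that forces --user installs, which would explain why it sidesteps the user-install error:

```shell
# The command that worked for me was:
#
#   pip install --isolated some_package
#
# Confirm the flag is supported by the pip in the active environment:
python3 -m pip help | grep -- --isolated
```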