I cannot install the pip requirements in the Dockerfile.
Here is my Dockerfile:
FROM apache/airflow:2.7.1
ADD requirements.txt .
# increase pip timeout time
ENV PIP_DEFAULT_TIMEOUT 60
RUN pip install --upgrade pip
RUN pip install apache-airflow==${AIRFLOW_VERSION} -r requirements.txt
and the requirements.txt:
apache-airflow==2.7.1
pandas==2.0.3
fastparquet
pytest
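As an aside, a slightly more conventional variant of the Dockerfile above uses COPY instead of ADD (ADD has extra tar/URL semantics that aren't needed for a plain file) and the key=value form of ENV. A sketch only, keeping the same base image and versions (AIRFLOW_VERSION is set by the official apache/airflow base image):

```dockerfile
FROM apache/airflow:2.7.1
COPY requirements.txt .
# increase pip timeout time
ENV PIP_DEFAULT_TIMEOUT=60
RUN pip install --upgrade pip
RUN pip install "apache-airflow==${AIRFLOW_VERSION}" -r requirements.txt
```

This does not address the timeout itself, but it keeps the build closer to Dockerfile best practice.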
When I run the docker compose up --build
command, it gets stuck here:
> [airflow-local-env-airflow-triggerer 4/4] RUN pip install apache-airflow==2.7.1 -r requirements.txt:
=> => # Requirement already satisfied: apache-airflow==2.7.1 in /home/airflow/.local/lib/python3.8/site-packages (2.7.1)
=> => # Requirement already satisfied: pandas==2.0.3 in /home/airflow/.local/lib/python3.8/site-packages (from -r requirements.txt (line 2)) (2.0.3)
=> => # WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ConnectTimeoutError(<pip._vendor.urllib3.connection.HTTPSConnectio
=> => # n object at 0x7f4941a58340>, 'Connection to 10.12.9.140 timed out. (connect timeout=60.0)')': /simple/fastparquet/
=> => # WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ConnectTimeoutError(<pip._vendor.urllib3.connection.HTTPSConnectio
=> => # n object at 0x7f4941a58820>, 'Connection to 10.12.9.140 timed out. (connect timeout=60.0)')': /simple/fastparquet/
airflow and pandas are already satisfied by the base image, so they are skipped, and the other requirements cannot be installed.
I tried entering the running container with docker exec -it airflow-local-env-airflow-worker-1 sh
and running pip install pytest
and everything works fine, but for some reason the install fails when run from the Dockerfile.
I am behind my company's proxy, but I don't think that's the problem, since I have
configured the proxy in my .docker/config.json:
{
"auths": {
"artifactory.ocp-sdp.p4avd.mycompany.com": {
"auth": "mytoken"
}
},
"credsStore": "desktop",
"proxies": {
"default": {
"httpProxy": "http://myuser:mypwd@the_ip:8080",
"httpsProxy": "http://myuser:mypwd@the_ip:8080",
"noProxy": "localhost,127.0.0.1,172.17.0.,.mycompanyname.com,cc-artifactory.mycompanyname.net,artifactory.ocp-sdp.p4avd.mycompanyname.com"
}
},
"currentContext": "desktop-linux"
}
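Note that the proxies section of ~/.docker/config.json is applied to containers as environment variables at run time (which is why pip works from docker exec); whether the build stage sees them can depend on the Docker version. They can also be forced explicitly with build args in docker-compose.yaml. A sketch, where the service name and the placeholder proxy URL are illustrative:

```yaml
services:
  airflow-worker:
    build:
      context: .
      args:
        HTTP_PROXY: http://myuser:mypwd@the_ip:8080
        HTTPS_PROXY: http://myuser:mypwd@the_ip:8080
        NO_PROXY: localhost,127.0.0.1
```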
I tried the following things suggested in this other SO question:
- adding --proxy=http://user:pass@addr:port to the pip command
- setting the proxy in docker-compose.yaml (operation not permitted)
EDIT 1: I modified the systemd files as suggested in @Paolo 's comment:
cat .config/systemd/user/docker.service.d/http-proxy.conf
[Service]
Environment="HTTP_PROXY=http://user:pwd@ip:8080"
Environment="HTTPS_PROXY=http://user:pwd@ip:8080"
Environment="NO_PROXY=localhost,127.0.0.1,172.17.0.,.company.com,cc-artifactory.company.net,artifactory.ocp-sdp.p4avd.company.com"
but I still get the same errors (Could not connect to 10.12.9.140:8080 (10.12.9.140), connection timed out).
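One thing worth checking after editing a systemd drop-in like this: the change only takes effect once the daemon configuration is reloaded and Docker is restarted (assuming the rootless/user Docker service that the .config/systemd/user path implies):

```shell
systemctl --user daemon-reload
systemctl --user restart docker
```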
The problem is that the build network is not set to host. To make it work, use the network option like this: docker build --network=host
or, if you're using docker compose
like I was, use the following syntax:
...
build:
context: .
network: host
...
and then build with docker compose up --build.
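Put together, a minimal service definition with the host build network might look like this (service name illustrative):

```yaml
services:
  airflow-worker:
    build:
      context: .
      network: host
```

With network: host, the build containers use the host's network stack, so the proxy and DNS configuration that already works on the host also applies while pip install runs during the build.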