I am running a dockerized Django app with Nginx on an Ubuntu server (an EC2 instance). My plan is to deploy with docker-machine.
When I SSH into the EC2 instance, copy my app there, and spin up my docker-compose file, everything works well and the app runs perfectly. But I would like to spin up my containers with docker-machine instead of SSH'ing into the server.
When I point my Docker client at the EC2 instance with docker-machine and run the same command (docker-compose -f production.yml up --build), I get the following error:
Cannot start service production-nginx-container: OCI runtime create failed: container_linux.go:346: starting container process caused "process_linux.go:449: container init caused \"rootfs_linux.go:58: mounting \\"/Users/myuser/myfolder/myproject/compose/production/nginx/myconf.conf\\" to rootfs \\"/var/lib/docker/overlay2/44675a2cf4ac6e3052c9df3bd6fbb35b1ece33736d632199572c6a1c90965c12/merged\\" at \\"/var/lib/docker/overlay2/44675a2cf4ac6e3052c9df3bd6fbb35b1ece33736d632199572c6a1c90965c12/merged/etc/nginx/conf.d/default.conf\\" caused \\"not a directory\\"\"": unknown: Are you trying to mount a directory onto a file (or vice-versa)? Check if the specified host path exists and is the expected type
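For reference, "pointing at the instance" means roughly the following; the machine name my-aws-machine is just a placeholder:

# Load the remote daemon's connection settings into the current shell
# (machine name is a placeholder), then run compose against it.
eval "$(docker-machine env my-aws-machine)"
docker-compose -f production.yml up --build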
The line I think Docker is complaining about is this one:
- ./compose/production/nginx/myconf.conf:/etc/nginx/conf.d/default.conf
I am trying to map my Nginx config file onto the default config of the nginx Docker container. myconf.conf is there, and it is a file.
My guess is that there is a problem mounting the file from my local computer into the container on the server, but I have no idea how to fix this. The other questions about this error don't get me anywhere either.
Shouldn't the command I run with docker-machine pointing at the instance do exactly the same thing as when I SSH into the instance and run it from there? Unfortunately, my knowledge of Docker is limited. If someone could lend me a helping hand, I would be very grateful.
Thanks so much in advance. Please see my configuration below:
production-nginx-container:
  container_name: 'production-nginx-container'
  image: nginx:latest
  ports:
    - "80:80"
    - "443:443"
  volumes:
    - ./compose/production/nginx/myconf.conf:/etc/nginx/conf.d/default.conf
    - /etc/letsencrypt/live/mydomain.de/fullchain.pem:/etc/letsencrypt/live/mydomain.de/fullchain.pem
    - /etc/letsencrypt/live/mydomain.de/privkey.pem:/etc/letsencrypt/live/mydomain.de/privkey.pem
  depends_on:
    - django
Even if you're using a remote $DOCKER_HOST, Docker has no way to mount local content onto a remote container. docker run -v options and Docker Compose volumes: bind mounts are always interpreted as paths on the host where the Docker daemon is running, not where you're running the docker command. You'll need to copy things like config files and TLS certificates to the remote host, and by the time you're doing that, you might as well just use ssh to launch the container too.
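A minimal sketch of that copy-then-launch approach, assuming an Ubuntu EC2 host, a key at ~/.ssh/my-key.pem, and a remote directory /home/ubuntu/myproject (all placeholders), could look like:

# Copy the compose project (including the nginx config) to the server.
scp -i ~/.ssh/my-key.pem -r ./compose ./production.yml ubuntu@my-ec2-host:/home/ubuntu/myproject/

# Start the stack on the server itself, so the bind-mount paths resolve there.
ssh -i ~/.ssh/my-key.pem ubuntu@my-ec2-host \
  'cd /home/ubuntu/myproject && docker-compose -f production.yml up --build -d'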
You might consider whether an automation tool like Ansible, Chef, or SaltStack meets your needs here. These generally have built-in tools both to "make sure this file is on the remote system" and to "start this container on some remote system". I wouldn't recommend using Docker Machine except in the specific case of needing a local VM to run Docker (usually via the Docker Toolbox application, on Windows 7 or other environments where there isn't a "native" Docker application you can use).
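As an illustration only, a minimal Ansible playbook for this workflow might look something like the following; the host alias, remote paths, and the use of the plain command module instead of a dedicated Docker module are all assumptions:

# Hypothetical playbook: copy the compose project to the EC2 host and start it there.
- hosts: ec2-web          # inventory alias for the EC2 instance (placeholder)
  become: true
  tasks:
    - name: Copy the compose project (including the nginx config) to the server
      ansible.builtin.copy:
        src: ./compose/
        dest: /opt/myproject/compose/

    - name: Copy the production compose file
      ansible.builtin.copy:
        src: ./production.yml
        dest: /opt/myproject/production.yml

    - name: Build and start the stack on the remote Docker daemon
      ansible.builtin.command:
        cmd: docker-compose -f production.yml up --build -d
        chdir: /opt/myproject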