Tags: docker, docker-compose, docker-machine, docker-swarm

How to put docker container for database on a different host in production?


Let's say we have a simple web app stack, something like the one described in the docker-compose docs. Its docker-compose.yml looks like this:

version: '2'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db

This is great for development on a laptop. In production, though, it would be useful to require the db container to run on its own host. The tutorials I'm able to find use docker-swarm to scale out the web container, but pay no attention to the fact that the db container and one instance of the web container end up on the same machine.

Is it possible with Docker to require a specific container to run on its own machine (or, even better, on a specific machine)? If so, how? If not, what is the Docker way to deal with databases in multi-container apps?


Solution

  • In my opinion, databases sit on the edge of the container world: they're useful for development and testing, but production databases are often not very ephemeral or portable things by nature. Flocker certainly helps, as do horizontally scalable databases like Cassandra, but databases can have very specific requirements that might be better treated as a service that sits behind your containerised app (RDS, Cloud SQL, etc.).

    In any case, you will need a container orchestration tool.

    With Compose + Swarm, you can apply manual scheduling constraints to dictate which Docker host a container can run on. For your database, you might have:

    environment:
      - "constraint:storage==ssd"
    
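    With classic Swarm, the same mechanism can pin the db service to a specific named node. A minimal sketch, assuming a Swarm node named db-1 (the node name here is a placeholder, not from the original):

    version: '2'
    services:
      db:
        image: postgres
        environment:
          # "constraint:node==<name>" tells the classic Swarm scheduler
          # to place this container only on the node with that name.
          # "db-1" is a hypothetical node name - substitute your own.
          - "constraint:node==db-1"

    Note that these environment-based constraints are specific to classic (standalone) Swarm; they are not honoured by a plain single-host Docker engine.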

    Otherwise, you can set up a more static Docker environment with Ansible, Chef, or Puppet.

    Use another orchestration tool that supports Docker: Kubernetes, Mesos, Nomad.
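    In Kubernetes, for example, the equivalent of a scheduling constraint is a node selector. A minimal sketch, assuming a node labelled disktype=ssd (the label key and value are illustrative, not from the original):

    apiVersion: v1
    kind: Pod
    metadata:
      name: db
    spec:
      # Schedule this pod only onto nodes carrying the matching label.
      nodeSelector:
        disktype: ssd
      containers:
        - name: postgres
          image: postgres

    You would label the target node beforehand, e.g. with kubectl label nodes <node-name> disktype=ssd.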

    Use a container service: Amazon ECS, Docker Cloud/Tutum.