Tags: docker, jenkins, dockerfile, host

Is it possible to limit the maximum number of containers per docker image?


Problem: I have a couple of Docker images on a hosting server and start multiple containers from a number of Jenkins jobs. Due to the limited capacity of the host, I'd like to cap the maximum number of containers per image. Limiting the number of Jenkins executors doesn't really solve the problem, since a single job can spin up 16 containers. I could split them into several threads of parallel execution, but that is still not ideal; I'd like one solution that covers all jobs.

Question #1 (main): Is it possible to cap the number of containers Docker runs on a single machine at 10, and queue the rest?

Question #2: If there is no such functionality, or if there are better options in this case, what is the workaround?
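For Question #1: Docker itself has no built-in cap on the number of running containers, so any limit has to live in a wrapper around `docker run`. A minimal sketch of such a wrapper (the `run_with_limit` name, the 5-second poll interval, and the image name in the usage line are all illustrative assumptions) polls `docker ps` and only starts a new container once a slot is free:

```shell
# Hypothetical wrapper: block until fewer than `max` containers started from
# `image` are running, then launch one more. Polling is a crude queue, but it
# gives the "cap at 10, queue the rest" behaviour at the Docker level.
run_with_limit() {
  image=$1; max=$2; shift 2
  # `docker ps -q --filter ancestor=<image>` lists running containers of that image
  while [ "$(docker ps -q --filter "ancestor=$image" | wc -l)" -ge "$max" ]; do
    sleep 5   # wait for a running container to exit and free a slot
  done
  docker run --rm -d "$image" "$@"
}

# usage sketch (image name is an assumption):
# run_with_limit my-ci-image 10 ./run-tests.sh
```

Note the cap only holds if every Jenkins job starts its containers through the wrapper, which is why a Jenkins-side solution may be preferable.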


Solution

  • One way is to use Kubernetes, as mentioned above, but that is a very time-consuming route.

  • A simpler way is to set up a master job that spins up your containers. Your pipeline calls this job once per container, e.g. 16 times to spin up 16 containers. Then cap the number of executors on your Jenkins host, for example at 6. When you kick off your pipeline there will be 1 executor running plus 16 builds in the queue, 17 in total. Jenkins will start the first 6 and hold the rest; whenever a running container finishes, the next build in the queue is allowed to run its container.
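The queueing behaviour described above can be sketched with plain background jobs standing in for Jenkins executors. This is only an illustration of the pattern, not Jenkins code: the `task` function is a placeholder for an actual `docker run`, and the numbers mirror the example (16 containers, at most 6 at once). It assumes bash, where `jobs -rp` inside a command substitution reports the script's running background jobs:

```shell
#!/bin/bash
# 16 tasks (containers), at most 6 running at once -- the rest wait in a
# queue, just as Jenkins holds builds when all executors are busy.
MAX=6
TOTAL=16
done_log=$(mktemp)   # each finished task appends one line here

task() {
  sleep 0.1                     # stand-in for the container's actual work
  echo "task $1 finished" >> "$done_log"
}

for i in $(seq 1 "$TOTAL"); do
  # while MAX tasks are running, poll until one of them exits
  while [ "$(jobs -rp | wc -l)" -ge "$MAX" ]; do
    sleep 0.05
  done
  task "$i" &
done
wait   # drain the queue
echo "finished $(wc -l < "$done_log") tasks"
```

The same throttling effect is what the executor cap gives you for free on the Jenkins side, without any extra scripting in the jobs themselves.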