I am trying to set up a Bitbucket pipeline that uses a database service provided by a Docker container. However, in order to get the database service started correctly, I need to pass an argument that is received by the database container's ENTRYPOINT. I see from the pipeline service documentation that it's possible to send variables to the service's Docker container, but the option I need isn't settable via an environment variable, only via a command-line argument.
When I run the database's Docker image locally with `docker run`, I am able to set the option just by appending it to the end of the `docker run` command, and it gets correctly applied to the container's ENTRYPOINT. So it seems like this should be straightforward; I just can't figure out where to put the argument in bitbucket-pipelines.yml.
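For reference, this is the local behavior being described: anything placed after the image name in `docker run` is handed to the image's ENTRYPOINT as arguments. The flag shown here is only an illustrative placeholder, not the specific option from the question:

```shell
# Everything after the image name becomes an argument to the
# container's ENTRYPOINT. "-retentionPeriod=1" is an illustrative
# example flag, standing in for whatever option the database needs.
docker run victoriametrics/victoria-metrics:v1.75.1 -retentionPeriod=1
```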
Below is my bitbucket-pipelines.yml. Everything about it works, except that I need a way to pass a command-line argument to the victoria-metrics container defined at the end of the file.
```yaml
image: node:14.16.1

pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install
          - npm test
        services:
          - mongo
          - victoriaMetrics

definitions:
  services:
    mongo:
      image: mongo:3.6
    victoriaMetrics:
      image: victoriametrics/victoria-metrics:v1.75.1
```
According to Mark C from Atlassian, there is currently no way to pass command-line arguments to service containers. However, he has created a feature request for this capability, which you are welcome to vote for if interested.
In the meantime, the suggested workarounds are:
- You can start the service container by running a Docker command within the Pipelines script, as long as the command is not restricted. You can check this link for more information about Docker restricted commands on Pipelines.
- You can create your own Docker image (using a Dockerfile), push it to Docker Hub, and then use that image as a service container on Pipelines.
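For the first workaround, a sketch of what the pipeline could look like, assuming the Docker service is enabled for the step and that `-retentionPeriod=1` stands in for the real argument. VictoriaMetrics is started from the script instead of being declared as a service:

```yaml
image: node:14.16.1

pipelines:
  default:
    - step:
        caches:
          - node
        services:
          - mongo
          - docker   # enables running docker commands in the script
        script:
          # Start the container manually so the argument can go after
          # the image name, exactly as with a local docker run.
          # The flag is an illustrative placeholder.
          - docker run -d victoriametrics/victoria-metrics:v1.75.1 -retentionPeriod=1
          - npm install
          - npm test

definitions:
  services:
    mongo:
      image: mongo:3.6
```

Note that a container started this way is not wired up like a Pipelines service container, so how your tests reach it (hostname, port mapping) may differ and is worth verifying.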
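For the second workaround, a minimal Dockerfile sketch. A derived image inherits the base image's ENTRYPOINT, and `CMD` supplies arguments to it, so the option is baked into the image and no runtime argument is needed when Pipelines starts the service (again, `-retentionPeriod=1` is a placeholder for the real option):

```dockerfile
# The base image's ENTRYPOINT is inherited; CMD provides its
# arguments, baking the command-line option into the image itself.
FROM victoriametrics/victoria-metrics:v1.75.1
CMD ["-retentionPeriod=1"]
```

After building and pushing this image to Docker Hub under your own account, reference it as the `image` of the `victoriaMetrics` service in `definitions.services` in place of the stock image.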