I am using an external Docker image from Docker Hub.
In each step, the Docker image is pulled from Docker Hub again and again. Yes, that is the desired workflow.
My question is: can we cache this image so that it won't be pulled from Docker Hub in each step? This Docker image is not going to change frequently, as it only has Node and Meteor preinstalled.
So, is it possible to cache the Docker image?
Original bitbucket-pipelines.yml:
image: tasktrain/node-meteor-mup

pipelines:
  branches:
    '{develop}':
      - step:
          name: "Client: Install Dependencies"
          caches:
            - node
          script:
            - npm install
            - npm run setup-meteor-client-bundle
          artifacts:
            - node_modules/**
      - step:
          name: "Client: Build for Staging"
          script:
            - npm run build-browser:stag
          artifacts:
            - dist/**
      - step:
          name: "Client: Deploy to Staging"
          deployment: staging
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.2
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_STAGING_BUCKET_NAME
                LOCAL_PATH: 'dist'
                ACL: "public-read"
                DELETE_FLAG: "true"
                EXTRA_ARGS: "--follow-symlinks --quiet"
      - step:
          name: "Server: Build and Deploy to Staging"
          script:
            - cd server
            - mup setup --config=.deploy/mup-settings.stag.js
            - mup deploy --config=.deploy/mup-settings.stag.js --settings=meteor-settings.stag.json
It is indeed possible to cache dependencies, and docker is one of the pre-defined caches of Bitbucket Pipelines. Note that this cache stores the layers of images that are pulled or built with docker commands inside a step's script (it does not apply to the build image declared at the top-level image: key), so the explicit docker pull below is what benefits from it:
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker pull my-own-repository:5000/my-image
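
Applied to the image from your question, a minimal sketch could look like the following (this assumes you pull tasktrain/node-meteor-mup explicitly in the script; the image name comes from your question, everything else just follows the pattern above):

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          # the first run populates the docker layer cache;
          # subsequent runs restore the layers from the cache
          # instead of downloading them from Docker Hub again
          - docker pull tasktrain/node-meteor-mup

After the first successful run, Pipelines saves the pulled layers in the docker cache and restores them at the start of later runs, so the pull only downloads layers that have changed.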