I can use `before_script` to `pip install` everything the project needs, but that means dependencies are installed independently before each job. This adds extra time, especially for sequential jobs.
Is there a common approach?
GitLab CI has support for caching dependencies across jobs:
```yaml
image: python:latest

# Change pip's cache directory to be inside the project directory since we can
# only cache local items.
variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"

# Pip's cache doesn't store the python packages
# https://pip.pypa.io/en/stable/topics/caching/
#
# If you want to also cache the installed packages, you have to install
# them in a virtualenv and cache it as well.
cache:
  paths:
    - .cache/pip
    - venv/

before_script:
  - python --version  # For debugging
  - pip install virtualenv
  - virtualenv venv
  - source venv/bin/activate

test:
  script:
    - python setup.py test
    - pip install tox flake8  # you can also use tox
    - tox -e py,flake8
```
Note: caching dependencies in Python can be a little finicky, so definitely double-check the docs:
https://docs.gitlab.com/ee/ci/caching/#cache-python-dependencies
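One refinement worth considering: by default the cache above is shared across all branches, which can lead to stale or conflicting packages. A minimal sketch (assuming the built-in `CI_COMMIT_REF_SLUG` variable, which GitLab CI provides) that scopes the cache per branch instead:

```yaml
cache:
  # One cache per branch; falls back to a fresh cache on new branches.
  key: "$CI_COMMIT_REF_SLUG"
  paths:
    - .cache/pip
    - venv/
```

With this, each branch builds and reuses its own `venv/` and pip cache, so a dependency bump on a feature branch won't pollute the cache used by `main`.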