Tags: docker, gitlab-ci, cypress, e2e-testing, angular-e2e

GitLab CI: How to configure Cypress e2e tests with multiple server instances?


My goal is to run a bunch of e2e tests every night to check whether the previous day's code changes broke core features of our app.

Our platform is an Angular app that calls three separate Node.js backends (auth-backend, old-backend, and new-backend). We also use MongoDB as our database.

Let's assume each of the four projects has a branch called develop, and that develop is the only branch that should be tested.
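
For context, the whole stack could be described in a single docker-compose file, roughly like the sketch below. Every image name, port, and environment variable here is a placeholder for our actual setup:

```yaml
# docker-compose.yml -- all names, images, and ports are placeholders
version: "3.8"
services:
  mongo:
    image: mongo:6
  auth-backend:
    image: registry.example.com/auth-backend:develop   # latest develop build
    environment:
      MONGO_URL: mongodb://mongo:27017/auth            # assumed variable name
    ports:
      - "3001:3000"
    depends_on: [mongo]
  old-backend:
    image: registry.example.com/old-backend:develop
    ports:
      - "3002:3000"
    depends_on: [mongo]
  new-backend:
    image: registry.example.com/new-backend:develop
    ports:
      - "3003:3000"
    depends_on: [mongo]
  app:
    image: registry.example.com/app:develop            # the Angular app, e.g. served by nginx
    ports:
      - "4200:80"
    depends_on: [auth-backend, old-backend, new-backend]
```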


My approach would be the following:

  1. I run every backend plus the database in a separate Docker container.

  2. For that I either need to fetch the latest build of each project from GitLab via SSH,

  3. or clone the repo into the Docker container and run a build inside it.

  4. Once all projects are running on the right ports (which I'd specify somewhere), I start the npm script that runs the Cypress e2e tests.

All of that should be defined in some file. Is that even possible?
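
Roughly, I imagine a .gitlab-ci.yml like the following sketch. The images, ports, feature flag, and npm script names are assumptions, not a working config:

```yaml
# .gitlab-ci.yml of the Angular repo -- a sketch only
e2e:
  image: cypress/base:20.9.0              # Node + the OS packages Cypress needs (tag illustrative)
  services:                               # backends + database as job services
    - name: mongo:6
      alias: mongo
    - name: registry.example.com/auth-backend:develop   # hypothetical registry paths
      alias: auth-backend
    - name: registry.example.com/old-backend:develop
      alias: old-backend
    - name: registry.example.com/new-backend:develop
      alias: new-backend
  variables:
    FF_NETWORK_PER_BUILD: "true"          # runner flag so the service containers can reach each other
    MONGO_URL: mongodb://mongo:27017/app  # assumed variable name, forwarded to the services
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'   # run only from a nightly pipeline schedule
  script:
    - npm ci
    - npm start -- --host 0.0.0.0 &             # serve the Angular app in the background
    - npx wait-on http://localhost:4200         # block until the dev server answers
    - npx cypress run --config baseUrl=http://localhost:4200
```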


Solution

  • I do not have experience with GitLab CI, but I know that other CI systems provide the ability to run e.g. bash scripts.

    So I guess you can do the following (sketched as a single CI job after this list):

    • Write a local bash script that pulls all the repos (since GitLab can provide secret keys, you can use these to authenticate against your GitLab repos)
    • After all of these repos have been pulled, you can run the build commands for the different repos
    • Since some of your repos depend on each other, you may have to add a build command for exactly this use case, so that you always have a production-like state, or whatever you need
    • After you have pulled and built your repos, you should start the servers for your backends
    • I guess your Angular app uses some kind of environment variables to define the servers to send requests to, so you also have to set them in the build command/script for your app
    • Then you should be able to run your tests
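
    Translated into a single GitLab CI job, those steps could look roughly like this. The repo URLs, ports, env variable names, and npm scripts are all made up, and CI_JOB_TOKEN only works for the clones if the projects allow it (an SSH key works as well):

```yaml
# one job that pulls, builds, starts, and tests everything -- a sketch only
nightly-e2e:
  image: cypress/base:20.9.0     # Node + Cypress OS dependencies (tag illustrative)
  services:
    - name: mongo:6              # the database the backends expect
      alias: mongo
  variables:
    MONGO_URL: mongodb://mongo:27017/app        # assumed variable name
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    # pull the develop branch of each repo
    - git clone -b develop "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/group/auth-backend.git"
    - git clone -b develop "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/group/old-backend.git"
    - git clone -b develop "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/group/new-backend.git"
    # build and start each backend on its own port
    - (cd auth-backend && npm ci && PORT=3001 npm start) &
    - (cd old-backend && npm ci && PORT=3002 npm start) &
    - (cd new-backend && npm ci && PORT=3003 npm start) &
    - npx wait-on tcp:3001 tcp:3002 tcp:3003    # wait for all three backends
    # point the Angular app at the backends, serve it, and run the tests
    - npm ci
    - AUTH_API=http://localhost:3001 OLD_API=http://localhost:3002 NEW_API=http://localhost:3003 npm start &
    - npx wait-on http://localhost:4200
    - npx cypress run --config baseUrl=http://localhost:4200
```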

    Personally, I think starting everything inside the pipeline with Docker is overkill for this use case. Instead, you could define a pipeline that always builds a fresh develop state of each backend and pushes the Docker image to your own server. Then you can create your test pipeline so that it first starts the Docker containers on your own server (so you do not have an "in-pipeline" server). With all your backends brought up that way, the test pipeline only has to run the e2e tests against them.
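
    Concretely, that split could look like the sketch below: each backend repo deploys its develop image to your server, and the test project only runs Cypress. The runner tag, image names, and URL are assumptions:

```yaml
# In each backend repo: rebuild the develop image and (re)start it on your server.
# Assumes a shell runner registered on that machine under the tag "staging".
deploy-develop:
  stage: deploy
  tags: [staging]
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
  script:
    - docker build -t auth-backend:develop .
    - docker rm -f auth-backend || true       # replace the previous container
    - docker run -d --name auth-backend -p 3001:3000 auth-backend:develop

# In the test project: no servers to set up -- just run Cypress against that machine.
e2e:
  image: cypress/base:20.9.0
  script:
    - npm ci
    - npx cypress run --config baseUrl=https://staging.example.com   # hypothetical URL
```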

    I would also advise not running this pipeline every night, but whenever the develop state of one of those linked repos changes.
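
    GitLab supports exactly that through multi-project pipeline triggers: each backend repo can end its develop pipeline with a job that kicks off the test project's pipeline. The project path below is a placeholder:

```yaml
# at the end of each backend repo's .gitlab-ci.yml
trigger-e2e:
  stage: .post                          # built-in final stage
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
  trigger:
    project: my-group/e2e-tests         # placeholder: the repo holding the Cypress suite
    branch: develop
```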

    If you need help setting this up, feel free to contact me.