Tags: scala, amazon-s3, kubernetes, amazon-eks, gatling

Uploading a Gatling report to S3


I'm running some stress tests in a cluster. I compile the Gatling code into a jar and run it in a dockerized environment. I was wondering if there is a way to upload the final Gatling report to S3. There is an after hook in the Gatling simulation class, but I think it gets executed before the report is generated.


Solution

  • An easy way to do this, without changing the Gatling code or the jar at all, is to:

    • Run the docker container that executes the tests as an init container
    • After the init container has finished, have a main container start and perform the S3 upload, either with shell commands or with a small library such as tinys3 (https://github.com/smore-inc/tinys3); a sketch of such an upload script follows the manifest below

    Just a general example:

    apiVersion: v1
    kind: Pod
    metadata:
      labels:
        app.kubernetes.io/instance: gatling-stress-test
        app.kubernetes.io/name: gatling-stress-test
      name: gatling-stress-test
    spec:
      initContainers:
      # The init container runs the Gatling tests to completion and writes
      # the reports into the shared volume
      - name: gatling-stress-test-runner
        image: gatling-docker-image:latest
        imagePullPolicy: Always
        volumeMounts:
        - mountPath: /full/path/to/gatling/reports
          name: reports
      containers:
      # The main container only starts after the init container has finished,
      # so the reports are complete by the time it reads the shared volume
      - name: stress-test-s3-copier
        image: ubuntu-image-with-tinys3:latest
        imagePullPolicy: Always
        volumeMounts:
        - mountPath: /full/path/to/gatling/reports
          name: reports
      volumes:
      # emptyDir shared between the init container and the main container
      - emptyDir: {}
        name: reports
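
    For the copier container, the upload itself can be a short Python script built on tinys3. This is just a minimal sketch: the environment variable names, bucket name and reports path are placeholders you would replace with your own (the path should match the volumeMount above):

    import os

    import tinys3

    # Placeholder values; swap in your own credentials, bucket and report path
    S3_ACCESS_KEY = os.environ["S3_ACCESS_KEY"]
    S3_SECRET_KEY = os.environ["S3_SECRET_KEY"]
    BUCKET = "my-gatling-reports"                  # hypothetical bucket name
    REPORTS_DIR = "/full/path/to/gatling/reports"  # same path as the volumeMount

    conn = tinys3.Connection(S3_ACCESS_KEY, S3_SECRET_KEY, tls=True)

    # Walk the reports directory and upload every file,
    # using the path relative to REPORTS_DIR as the S3 key
    for root, _, files in os.walk(REPORTS_DIR):
        for name in files:
            local_path = os.path.join(root, name)
            key = os.path.relpath(local_path, REPORTS_DIR)
            with open(local_path, "rb") as f:
                conn.upload(key, f, BUCKET)

    The copier image's command could then simply invoke this script (for example, python /opt/upload_reports.py; the path is hypothetical and would be baked into the ubuntu-with-tinys3 image).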