Maybe a bit of a broad question, but I think this is relevant for any maintainer of Python packages who uses GitHub, and it could reduce their workload significantly, so hopefully the powers that be will let it stand.
Essentially, it seems to me that:
So my question is:
I can think of various solutions involving e.g. an intermediary S3 bucket, but I'm likely hugely mistaken about how PyPI and/or GitHub Actions work in this regard, so there might be a very simple issue I'm glossing over.
As mentioned in my comment, here is one possible way to run parallel builds but a single upload:
name: 'Aggregation'
on: [push]
env:
  ARTIFACT: artifact.bin
jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os:
          - windows-latest
          - ubuntu-latest
          - macos-latest
    steps:
      - uses: actions/checkout@v2
      - shell: bash
        run: |
          echo "Run your build command here"
          echo "This is a fake ${{ matrix.os }} build artifact" >$ARTIFACT
      - uses: actions/upload-artifact@v2
        with:
          name: build-${{ matrix.os }}-${{ github.sha }}
          path: ${{ env.ARTIFACT }}
  publish:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - uses: actions/download-artifact@v2
        with:
          path: artifacts
      - shell: bash
        working-directory: artifacts
        run: |
          for i in $( ls ); do
            cat $i/$ARTIFACT
          done
Each matrix job builds and uploads its own artifact to GitHub. The publish job waits for all the previous jobs to complete before downloading all the artifacts and, in this case, iterating over them. One added benefit is that if any matrix job fails, the publish job will fail too. Of course, this is simple only if the build steps and commands are the same on all the OSes.
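For the PyPI use case in the question, the same pattern applies: each matrix build uploads the wheel it produced as its artifact, and the publish job pushes everything to PyPI in a single step. Below is a minimal sketch of such a publish job, assuming the downloaded artifacts contain built wheels and that a repository secret named PYPI_API_TOKEN holds a PyPI API token (both the secret name and the dist layout are illustrative, not part of the original answer):

  publish:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - uses: actions/download-artifact@v2
        with:
          path: artifacts
      - uses: actions/setup-python@v2
      - shell: bash
        env:
          TWINE_USERNAME: __token__                      # PyPI API tokens always use this username
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}  # assumed secret name, set in the repo settings
        run: |
          # Gather the per-OS wheels into one directory and upload them with a single twine call
          mkdir dist
          mv artifacts/*/*.whl dist/
          pip install --upgrade twine
          twine upload dist/*

Because of needs: build, a failed build on any OS still blocks the release, just like in the answer above, and twine sees all the wheels at once instead of each matrix job racing to upload its own.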