We regularly need to upload large asset packages (1 GB+) to Azure Blob Storage, from where they are downloaded later on. These packages need to be zipped up before being stored, so a while ago we invested in a single powerful office PC that we RDP onto so that we can zip these packages up in good time.
We're now looking at re-tooling, and this process now needs to be (a) scalable, so that multiple clients can use it, (b) disaster-proof, so that the process isn't ruined if our office gets robbed, and (c) efficient: a solution that satisfies (a) and (b) but is slow won't get used.
I've been looking at writing either an Azure Function or an AWS Lambda that we could use to run the zip process. This would solve points (a) and (b), but it would probably require us to upload the package to storage first, where the upload would trigger the function to zip it up and pass it on. The initial upload step would therefore need to be optimised so that we don't lose too much speed.
tl;dr
What is the most efficient way to upload large packages from a local dev environment to Azure Blob Storage?
Probably the easiest and most performant solution is AzCopy (see https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10). AzCopy is fast, it's CLI-based so you can run it manually or script it, and it's cross-platform (Windows/macOS/Linux). It has a ton of options (read the docs) for all sorts of situations and supports various auth methods. It also has built-in resiliency -- it will automatically retry each failed transfer up to 20 times, using its own exponential back-off logic.
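As a rough sketch, an upload from a local folder looks something like the following. The storage account, container, and SAS token here are placeholders, and the tuning values are illustrative rather than recommendations -- check the docs for what suits your network:

```
# Upload a local package folder to a blob container, authenticating with a SAS token.
# Replace <account>, <container>, and <SAS-token> with your own values.
azcopy copy "./asset-package" \
    "https://<account>.blob.core.windows.net/<container>?<SAS-token>" \
    --recursive

# Optional tuning: AZCOPY_CONCURRENCY_VALUE controls how many parallel
# connections AzCopy opens, and --block-size-mb sets the block size for
# large files (both values below are just examples).
export AZCOPY_CONCURRENCY_VALUE=32
azcopy copy "./asset-package.zip" \
    "https://<account>.blob.core.windows.net/<container>/asset-package.zip?<SAS-token>" \
    --block-size-mb=100
```

Because it's just a CLI call, you can drop this into whatever build script or CI step produces the package, which also fits your "upload first, then let a function zip and pass it on" idea.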