I want to provide a file-upload API that supports uploading a big file via multiple HTTP requests. The requests can go to different servers, but I don't want the whole file copied to all of those servers - each server should accept its file fragment and forward it to another service. However, I somehow need to compute the SHA-256 checksum over the individual parts before that happens. Any ideas?
In Python, I know `hashlib` lets you call `update()` repeatedly to compute a SHA-256 digest step by step, but that obviously requires all the steps to run in the same Python process.

Could anyone help me figure out how to continue a SHA-256 computation across HTTP requests for a big file, when the parts are not all available to the same Python instance at the same time?
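For reference, here is the single-process incremental hashing I mean (the fragment data is made up for illustration):

```python
import hashlib

# One-shot digest of the whole payload.
full = hashlib.sha256(b"part one" + b"part two").hexdigest()

# The same digest computed incrementally, one fragment at a time.
# This only works while every update() happens in the same process,
# because the intermediate state lives inside the hash object.
h = hashlib.sha256()
h.update(b"part one")
h.update(b"part two")
incremental = h.hexdigest()

print(incremental == full)  # True
```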
Finally, I found the answer via this link: https://bugs.python.org/issue11771

It points to a third-party library called `rehash`, which provides `hashlib`-compatible hash objects whose intermediate state can be serialized, so the computation can be resumed in a different Python instance.