python, amazon-web-services, amazon-s3, sigint, atexit

Upload a file to S3 after the script ends or crashes: "cannot schedule new futures after interpreter shutdown"


I need to upload a file to S3 no matter how the script ends or is interrupted.

I have done:

import atexit
import signal

import boto3

def exit_handler(*args):
    # atexit calls this with no arguments; signal passes (signum, frame)
    s3_client = boto3.client('s3')
    s3_client.upload_file(file, bucket, file)

atexit.register(exit_handler)
signal.signal(signal.SIGINT, exit_handler)
signal.signal(signal.SIGTERM, exit_handler)

But when I press CTRL+C I get:

File "/python3.10/site-packages/s3transfer/futures.py", line 474, in submit
    future = ExecutorFuture(self._executor.submit(task))
  File "/python3.10/concurrent/futures/thread.py", line 169, in submit
    raise RuntimeError('cannot schedule new futures after '
RuntimeError: cannot schedule new futures after interpreter shutdown

Solution

  • The main process could create a multiprocessing.Process (a worker process) that does the actual work.

    The main process watches whether the worker process is still alive using join(), is_alive() or the sentinel attribute.

    As soon as the worker terminates, the main process can upload the file; see the sketch after this list.
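
A minimal sketch of that approach. The do_work() body, file name and bucket name are placeholders, not from the original post:

import multiprocessing

import boto3

RESULT_FILE = 'result.txt'   # placeholder path
BUCKET = 'my-bucket'         # placeholder bucket name

def do_work(path):
    # The actual job runs in the worker process; if it crashes or is killed,
    # only the worker dies and the parent survives to do the upload.
    with open(path, 'w') as f:
        f.write('some output')

if __name__ == '__main__':
    worker = multiprocessing.Process(target=do_work, args=(RESULT_FILE,))
    worker.start()

    # Block until the worker terminates, normally or otherwise. Polling
    # worker.is_alive() or waiting on worker.sentinel would also work here.
    worker.join()

    # The upload runs in the parent, whose interpreter is still fully alive,
    # so s3transfer can schedule its futures without the RuntimeError.
    s3_client = boto3.client('s3')
    s3_client.upload_file(RESULT_FILE, BUCKET, RESULT_FILE)

If the parent itself may receive CTRL+C while waiting, the join() call can be wrapped in a try/finally so the upload still runs when the wait is interrupted.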