python · multiprocessing · timeout · concurrent.futures

Any concurrent.futures timeout that actually works?


Tried to write a process-based timeout (sync) on the cheap, like this:

from concurrent.futures import ProcessPoolExecutor

def call_with_timeout(func, *args, timeout=3):
    with ProcessPoolExecutor(max_workers=1) as pool:
        future = pool.submit(func, *args)
        return future.result(timeout=timeout)

But it seems the timeout argument passed to future.result doesn't really work as advertised.

>>> t0 = time.time()
... call_with_timeout(time.sleep, 2, timeout=3)
... delta = time.time() - t0
... print('wall time:', delta)
wall time: 2.016767978668213

OK.

>>> t0 = time.time()
... call_with_timeout(time.sleep, 5, timeout=3)
... delta = time.time() - t0
... print('wall time:', delta)
# TimeoutError

Not OK: the call unblocked after 5 seconds, not 3.
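
Digging in a bit: the TimeoutError itself is raised on time, after 3 seconds; it's the implicit shutdown(wait=True) that the with block runs on exit which keeps the call blocked until the worker finishes. A standalone sketch to see this (run under an `if __name__ == '__main__':` guard on platforms that spawn worker processes):

import time
from concurrent.futures import ProcessPoolExecutor, TimeoutError

pool = ProcessPoolExecutor(max_workers=1)
future = pool.submit(time.sleep, 5)
t0 = time.time()
try:
    future.result(timeout=3)
except TimeoutError:
    print('TimeoutError at', time.time() - t0)   # ~3 seconds, as documented
pool.shutdown(wait=True)  # what the with block does on exit: waits for the worker
print('shutdown returned at', time.time() - t0)  # ~5 seconds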

Related questions show how to do this with thread pools, or with signal. How can I time out a process submitted to a pool after n seconds, without using any private API of multiprocessing? A hard kill is fine; there's no need to request a clean shutdown.


Solution

  • You might want to take a look at pebble.

    Its ProcessPool was designed to solve this exact issue: it enables timeout and cancellation of running tasks without the need to shut down the entire pool.

    When a future times out or is cancelled, the worker process is actually terminated, which effectively stops the execution of the scheduled function.

    Timeout:

    import pebble
    from concurrent.futures import TimeoutError  # alias of the builtin TimeoutError on Python 3.11+

    pool = pebble.ProcessPool(max_workers=1)
    future = pool.schedule(func, args=args, timeout=1)
    try:
        future.result()
    except TimeoutError:
        print("Timeout")
    
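    When the timeout expires, pebble terminates the worker process and sets a TimeoutError on the future, so future.result() unblocks after roughly 1 second here instead of waiting for the scheduled function to finish.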

    Example:

    import pebble

    def call_with_timeout(func, *args, timeout=3):
        with pebble.ProcessPool(max_workers=1) as pool:
            future = pool.schedule(func, args=args, timeout=timeout)
            return future.result()
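
    With this version, the timing test from the question should unblock after about 3 seconds instead of 5, because the worker running time.sleep(5) is hard-killed at the timeout. A rough check (a sketch, assuming the call_with_timeout above):

    from concurrent.futures import TimeoutError
    import time

    t0 = time.time()
    try:
        call_with_timeout(time.sleep, 5, timeout=3)
    except TimeoutError:
        pass
    print('wall time:', time.time() - t0)  # ~3 seconds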