
Use a timeout to prevent deadlock when opening a file in Python?


I need to open a file that is NFS-mounted to my server. Sometimes the NFS mount fails in a way that causes all file operations to deadlock. To prevent this, I need a way to make Python's open function time out after a set period, e.g. something like open('/nfsdrive/foo', timeout=5). Of course, the built-in open has no timeout or similar keyword.

Does anyone here know of a way to effectively stop trying to open a (local) file if the opening takes too long?

Note: I've already tried the urllib2 module, but its timeout option only works for web requests, not local files.


Solution

  • You can try using the stopit package:

    from stopit import SignalTimeout as Timeout

    # SIGALRM-based timeout: interrupts the block after 5 seconds
    with Timeout(5.0) as timeout_ctx:
        with open('/nfsdrive/foo', 'r') as f:
            # do something with f
            pass
    

    There may be some issues with SignalTimeout in multithreaded environments (like Django). ThreadingTimeout, on the other hand, may cause resource problems on some virtual hosts when you run too many "time-limited" functions.
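
    If you need the thread-based variant instead, it is just a matter of swapping the import; a minimal sketch, assuming the same file as above:

    from stopit import ThreadingTimeout as Timeout  # thread-based, no POSIX signals

    with Timeout(5.0) as timeout_ctx:
        with open('/nfsdrive/foo', 'r') as f:
            # do something with f
            pass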

    P.S. My example also limits the processing time of the opened file. To limit only the opening itself, you should use a different approach, with manual file opening/closing and manual exception handling, as sketched below.
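
    A minimal sketch of that manual approach (assuming stopit's documented state flags, timeout_ctx.TIMED_OUT in particular): only the open() call sits inside the timeout block, and the file is closed by hand:

    from stopit import SignalTimeout as Timeout

    f = None
    with Timeout(5.0) as timeout_ctx:
        f = open('/nfsdrive/foo', 'r')  # only the open() call is time-limited

    if timeout_ctx.state == timeout_ctx.TIMED_OUT:
        # open() hung (e.g. on a stale NFS mount) and never returned
        raise IOError('timed out opening /nfsdrive/foo')

    try:
        data = f.read()  # processing here is not time-limited
    finally:
        f.close()  # manual close, since open() was not used in a `with` block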