Tags: python, file-io, memory-leaks, garbage-collection, contextmanager

Does an open()ed file get closed when nothing holds a reference to it?


I know that the safe and recommended way to open files is to use context managers:

with open("x") as fh:
    do_something_with(fh)

I frequently encounter situations where I don't want to do anything with the file handle returned by open() other than read or write the file in full and then close it. The context manager handles closing the file automatically. But what if I don't use one?

For example, is this

with open("x.pickle", "rb") as fh:
    foo = pickle.load(fh)
more_code()

more or less equivalent to this?

foo = pickle.load(open("x.pickle", "rb"))
more_code()

Assuming that pickle (or whatever else creates/consumes the file) doesn't somehow keep the file handle around and lets it go out of scope, the handle should eventually be garbage collected and the file closed, right?

I tested this trivial example by inserting sleep(100000) before more_code() and then checking lsof to see whether the file still showed up as open. In this test, it did not. So, again assuming that whatever reads/writes the file lets the file handle go out of scope, is this a safe practice?
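
Another way to observe this without lsof is to surface ResourceWarning, which CPython emits from a file object's finalizer when it has to close a file that was never closed explicitly. A minimal sketch, assuming CPython and that x.pickle exists as in the example above:

import gc
import pickle
import warnings

# ResourceWarning is silenced by default; enable it so the interpreter
# reports files it ends up closing during finalization.
warnings.simplefilter("always", ResourceWarning)

foo = pickle.load(open("x.pickle", "rb"))  # no explicit close
gc.collect()  # redundant on CPython (refcount already hit zero); may matter elsewhere

# On CPython this prints something like:
#   ResourceWarning: unclosed file <_io.BufferedReader name='x.pickle'>
# i.e. the file was closed, but only as a side effect of finalization.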


Solution

  • In CPython's implementation of IOBase, you can see that part of garbage collecting (finalizing) an IO object is calling its close() method with the private attribute _finalizing set to True. IOBase is a superclass of TextIOBase (and of the other classes of object returned by open()), so file objects do get closed when they are finalized; the sketch after the note below shows this in action.

    Note, of course, that this is a low-level implementation detail. I doubt that the Python documentation requires this to happen.
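
    As a hedged illustration of that mechanism (CPython only; NoisyFile is a made-up name, and _finalizing is a private detail that other implementations or future versions may not set):

import io

class NoisyFile(io.FileIO):
    # Illustrative subclass: report how close() was invoked.
    def close(self):
        # CPython's IOBase finalizer sets the private attribute _finalizing
        # before calling close(); an explicit close() never sets it.
        print("close() called, _finalizing =", getattr(self, "_finalizing", False))
        super().close()

f = NoisyFile("x.pickle", "rb")
del f  # drop the only reference; CPython's refcounting finalizes the object here
# Typically prints: close() called, _finalizing = True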