Tags: python, traversal, glob, os.walk, directory-walk

Quicker to os.walk or glob?


I'm messing around with file lookups in Python on a large hard disk. I've been looking at os.walk and glob. I usually use os.walk, as I find it much neater and it seems to be quicker (for usual-size directories).

Has anyone got experience with both who could say which is more efficient? As I say, glob seems to be slower, but you can use wildcards etc., whereas with walk you have to filter the results yourself. Here is an example of looking up core dumps.

import os
import re

core = re.compile(r"core\.\d*")
for root, dirs, files in os.walk("/path/to/dir/"):
    for file in files:
        if core.search(file):
            path = os.path.join(root, file)
            print("Deleting: " + path)
            os.remove(path)

Or

import os
from glob import iglob

for file in iglob("/path/to/dir/core.*"):
    print("Deleting: " + file)
    os.remove(file)
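
Note that the two snippets aren't quite equivalent: os.walk recurses into subdirectories, while a single glob pattern only matches one directory level. If you want wildcard matching together with recursion, here is a minimal sketch using the standard fnmatch module (the /path/to/dir/ path is the same placeholder as above):

import os
import fnmatch

# Recurse like os.walk, but filter names with shell-style wildcards like glob
for root, dirs, files in os.walk("/path/to/dir/"):
    for name in fnmatch.filter(files, "core.*"):
        path = os.path.join(root, name)
        print("Deleting: " + path)
        os.remove(path)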

Solution

  • I ran a benchmark on a small cache of web pages spread across 1000 directories. The task was to count the total number of files in those directories. The output is:

    os.listdir: 0.7268s, 1326786 files found
    os.walk: 3.6592s, 1326787 files found
    glob.glob: 2.0133s, 1326786 files found
    

    As you can see, os.listdir is the quickest of the three, and glob.glob is still quicker than os.walk for this task. (os.walk reports one extra file, most likely because it also counts files sitting in the top-level directory itself, such as the benchmark script.)

    The source:

    import glob
    import os
    import time
    
    # Count entries with os.listdir, one directory at a time
    n, t = 0, time.time()
    for i in range(1000):
        n += len(os.listdir("./%d" % i))
    t = time.time() - t
    print("os.listdir: %.4fs, %d files found" % (t, n))
    
    # Count files with os.walk over the whole tree
    n, t = 0, time.time()
    for root, dirs, files in os.walk("./"):
        for file in files:
            n += 1
    t = time.time() - t
    print("os.walk: %.4fs, %d files found" % (t, n))
    
    # Count matches with glob.glob, one directory at a time
    n, t = 0, time.time()
    for i in range(1000):
        n += len(glob.glob("./%d/*" % i))
    t = time.time() - t
    print("glob.glob: %.4fs, %d files found" % (t, n))