
Determine whether any files have been added, removed, or modified in a directory


I'm trying to write a Python script that will get the md5sum of all files in a directory (in Linux), which I believe I have done in the code below.

I want to be able to run this to make sure no files within the directory have changed, and no files have been added or deleted.

The problem is that if I make a change to a file in the directory and then change it back, I get a different result from running the function below (even though I changed the modified file back).

Can anyone explain this, and let me know if you can think of a workaround?

import hashlib
import os
import tarfile

def get_dir_md5(dir_path):
    """Build a tar file of the directory and return its md5 sum."""
    temp_tar_path = 'tests.tar'
    t = tarfile.TarFile(temp_tar_path, mode='w')
    t.add(dir_path)
    t.close()

    m = hashlib.md5()
    with open(temp_tar_path, 'rb') as f:
        m.update(f.read())
    ret_str = m.hexdigest()

    # delete the temporary tar file
    os.remove(temp_tar_path)
    return ret_str

Edit: As these fine folks have answered, it looks like tar includes header information such as the date modified. Would zip, or some other format, work any differently?

Any other ideas for workarounds?
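A quick sanity check (a minimal sketch using the standard `tarfile` and `tempfile` modules; the file names are arbitrary) confirms that merely touching a file's mtime, without changing its contents, changes the archive's checksum:

```python
import hashlib
import os
import tarfile
import tempfile

def tar_md5(src_dir, tar_path):
    """Tar up src_dir and return the archive's md5 hex digest."""
    with tarfile.open(tar_path, mode='w') as t:
        t.add(src_dir, arcname='data')  # fixed arcname keeps member names stable
    with open(tar_path, 'rb') as f:
        return hashlib.md5(f.read()).hexdigest()

src = tempfile.mkdtemp()
work = tempfile.mkdtemp()  # keep the tars outside the hashed directory
path = os.path.join(src, 'a.txt')
with open(path, 'w') as f:
    f.write('hello')

h1 = tar_md5(src, os.path.join(work, 'one.tar'))
# Touch the file: contents unchanged, only the mtime moves.
os.utime(path, (0, 0))
h2 = tar_md5(src, os.path.join(work, 'two.tar'))
print(h1 == h2)  # False: the mtime lives in each member's tar header
```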


Solution

  • As the other answers mentioned, two tar files can differ even when the file contents are identical, either because the tar metadata changed or because the files were archived in a different order. You should run the checksum on the file data directly, sorting the directory listings so they are always in the same order. If you want to include some metadata in the checksum, include it manually.

    Untested example using os.walk:

    import hashlib
    import os
    import os.path
    import struct  # used by the optional metadata examples below

    def get_dir_md5(dir_root):
        """Hash the contents of every file under dir_root, in a stable order."""

        hash = hashlib.md5()
        for dirpath, dirnames, filenames in os.walk(dir_root, topdown=True):

            # Sort in place so os.walk visits subdirectories and files
            # in the same order on every run.
            dirnames.sort(key=os.path.normcase)
            filenames.sort(key=os.path.normcase)

            for filename in filenames:
                filepath = os.path.join(dirpath, filename)

                # If some metadata is required, add it to the checksum:

                # 1) filename (good idea)
                # hash.update(os.path.normcase(os.path.relpath(filepath, dir_root)).encode())

                # 2) mtime (possibly a bad idea)
                # st = os.stat(filepath)
                # hash.update(struct.pack('d', st.st_mtime))

                # 3) size (good idea perhaps; st comes from the os.stat call above)
                # hash.update(struct.pack('q', st.st_size))

                with open(filepath, 'rb') as f:
                    for chunk in iter(lambda: f.read(65536), b''):
                        hash.update(chunk)

        return hash.hexdigest()
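To confirm this fixes the touch-and-revert problem from the question, here is a self-contained smoke test (it re-states a condensed copy of the content-hashing loop above, without the optional metadata; the temporary directory layout is just for illustration):

```python
import hashlib
import os
import os.path
import tempfile

def get_dir_md5(dir_root):
    """Condensed copy of the walk-and-hash function: contents only."""
    h = hashlib.md5()
    for dirpath, dirnames, filenames in os.walk(dir_root, topdown=True):
        dirnames.sort(key=os.path.normcase)
        filenames.sort(key=os.path.normcase)
        for filename in filenames:
            with open(os.path.join(dirpath, filename), 'rb') as f:
                h.update(f.read())
    return h.hexdigest()

root = tempfile.mkdtemp()
path = os.path.join(root, 'a.txt')
with open(path, 'w') as f:
    f.write('original')

before = get_dir_md5(root)

# The scenario from the question: modify the file, then change it back.
with open(path, 'w') as f:
    f.write('changed')
with open(path, 'w') as f:
    f.write('original')

after = get_dir_md5(root)
print(before == after)  # True: only file contents feed the digest, not mtimes
```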