With the help of this thread:
https://codereview.stackexchange.com/questions/147056/short-script-to-hash-files-in-a-directory
I managed to get almost exactly what I needed. The code given there is:
from os import listdir, getcwd
from os.path import isfile, join, normpath, basename
import hashlib

def get_files():
    current_path = normpath(getcwd())
    return [join(current_path, f) for f in listdir(current_path) if isfile(join(current_path, f))]

def get_hashes():
    files = get_files()
    list_of_hashes = []
    for each_file in files:
        hash_md5 = hashlib.md5()
        with open(each_file, "rb") as f:
            # read in 4096-byte chunks so large files don't have to fit in memory
            for chunk in iter(lambda: f.read(4096), b""):
                hash_md5.update(chunk)
        list_of_hashes.append('Filename: {}\tHash: {}\n'.format(basename(each_file), hash_md5.hexdigest()))
    return list_of_hashes

def write_hashes():
    hashes = get_hashes()
    with open('list_of_hashes.txt', 'w') as f:
        for md5_hash in hashes:
            f.write(md5_hash)

if __name__ == '__main__':
    write_hashes()
However, I'd also like to include all the files in the subfolders of the given path in the output. I tried using os.walk(), but didn't manage to get it working.
Can you help me adjust the function get_files() so that it generates MD5 hashes for all files in subfolders as well (i.e. considers the entire folder structure)?
Thanks for any help!
Try this as the new body of get_files() (you'll also need to add walk to the imports from os):

from os import walk

def get_files():
    current_path = normpath(getcwd())
    list_of_files = []
    # walk() yields a (dirpath, dirnames, filenames) tuple for current_path
    # and for every subfolder below it
    for dirpath, dirnames, filenames in walk(current_path):
        list_of_files += [join(dirpath, f) for f in filenames]
    return list_of_files

The rest of the script can stay as it is. One caveat: get_hashes() writes only basename(each_file), so two files with the same name in different subfolders will be indistinguishable in the output; write each_file instead if you want the full paths.
(based on this source)
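As an aside, on Python 3.4+ the same recursive listing can be done with pathlib instead of os.walk(). This is just a sketch of an equivalent alternative, not part of the original script:

from pathlib import Path

def get_files():
    # rglob('*') recurses through every subfolder; keep only regular files
    return [str(p) for p in Path.cwd().rglob('*') if p.is_file()]

Either version returns full paths, so get_hashes() works unchanged.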