dict_A = {'file1_xx': '8-04-22', 'file2_xx': '8-04-22', 'file3_xx': '8-04-22', 'file4_xx': '8-04-22'}
dict_test
Files recorded in both dicts are compared for new files, i.e. each file's last modified date (e.g. file1_xx) is compared against the last processed date in dict_A. There is a condition which updates dict_A when a file's last modified date is greater than its last processed date, so that dict_A ends up holding the latest modified date per file of the same category. This dict_A is then uploaded to a PostgreSQL db through sqlalchemy.

import multiprocessing

def compare_rec(i):
    a = dict_A[i]
    b = dict_test[i]
    if a >= b:
        print("none")
    else:
        lock.acquire()
        print("found")
        a = b
        lock.release()

def init(l):
    global lock
    lock = l

if __name__ == '__main__':
    file_cat = ['a', 'b', 'c', 'd']
    dict_A = {'a': '10', 'b': '10', 'c': '10', 'd': '10'}
    dict_test = {'a': '11', 'b': '11', 'c': '11', 'd': '11'}

    l = multiprocessing.Lock()
    pool = multiprocessing.Pool(initializer=init, initargs=(l,))
    pool.map(compare_rec, file_cat)
    pool.close()
    pool.join()
Processes don't share variables (a short check demonstrating this is shown below).

In the function I would use return to send the filename and date back to the main process:

if ...:
    return i, a
else:
    return i, b

The main process should get the results from all workers

results = pool.map(compare_rec, file_cat)

and it should update the dictionary:

dict_A.update(results)
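To make that first point concrete, here is a minimal check (a separate sketch, not part of the question's code) showing that a change made inside a worker process is not visible in the parent:

import multiprocessing

data = {'a': '10'}

def worker(key):
    # this modifies only the copy of `data` living in the child process
    data[key] = '11'

if __name__ == '__main__':
    with multiprocessing.Pool() as pool:
        pool.map(worker, ['a'])
    print(data)   # still {'a': '10'} in the parent process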
Full code:
import multiprocessing

def compare_rec(key):
    # runs in a worker process: compare the two dates and return a (key, date) tuple
    print('key:', key)
    a = dict_A[key]
    b = dict_test[key]
    if a >= b:
        print("none", key, a)
        return key, a
    else:
        print("found:", key, b)
        return key, b

if __name__ == '__main__':
    file_cat = ['a', 'b', 'c', 'd']
    dict_A = {'a': '10', 'b': '10', 'c': '10', 'd': '10'}
    dict_test = {'a': '11', 'b': '11', 'c': '11', 'd': '11'}

    pool = multiprocessing.Pool()

    # collect (key, date) tuples from all workers ...
    results = pool.map(compare_rec, file_cat)
    print(results)

    # ... and merge them into dict_A in the main process
    print('before:', dict_A)
    dict_A.update(results)
    print('after :', dict_A)

    pool.close()
    pool.join()
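The question also mentions uploading the updated dict_A to a PostgreSQL db through sqlalchemy. A minimal sketch of that step, assuming a hypothetical table file_dates with columns file_cat and last_date (the connection URL, table name and column names are placeholders, adjust them to your setup):

from sqlalchemy import create_engine, MetaData, Table, Column, String

# placeholder connection URL - replace with your real credentials and database
engine = create_engine('postgresql+psycopg2://user:password@localhost/mydb')

metadata = MetaData()
file_dates = Table(
    'file_dates', metadata,
    Column('file_cat', String, primary_key=True),
    Column('last_date', String),
)
metadata.create_all(engine)   # create the table if it doesn't exist yet

def upload(dates):
    # replace each category's row with the latest date from the dictionary
    with engine.begin() as conn:
        for key, value in dates.items():
            conn.execute(file_dates.delete().where(file_dates.c.file_cat == key))
            conn.execute(file_dates.insert().values(file_cat=key, last_date=value))

# call it after dict_A.update(results) in the main block:
# upload(dict_A)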