I am looping over a huge shelve dictionary, doing something with each key:value pair (without changing it). Since the dictionary is bigger than memory, my concern is whether this will run out of memory.
Sample code:
import shelve

dictFileName = 'dict.txt'
db = shelve.open(dictFileName)  # avoid shadowing the built-in dict
for key, value in db.items():
    ...  # do something with key and value
db.close()
Will the key:value pair read in one iteration of the loop be discarded from memory by the next iteration? I assume it must be; otherwise memory would be exhausted, since the loop essentially reads the whole dictionary, right?
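For what it's worth, a minimal sketch of the pattern in question (the database path and sample data are made up; assuming Python 3, where iterating a shelf yields keys one at a time, so only the current value is deserialized into memory):

```python
import os
import shelve
import tempfile

# Sketch: populate a small shelf, then iterate it one key at a time.
# The shelf lives in a temporary directory; the name is arbitrary.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, 'dict_db')

with shelve.open(path) as db:
    db['a'] = [1, 2, 3]
    db['b'] = [4, 5, 6]

with shelve.open(path) as db:
    total = 0
    for key in db:       # keys arrive one at a time
        value = db[key]  # only this value is loaded now
        total += sum(value)

print(total)  # 21
```

Each pass through the loop rebinds `value`, so the previous value object becomes unreachable and its memory can be reclaimed before the next lookup.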
In general, when looping over a huge file with
f = open('someFileName', 'r')
for each_line in f:
    ...  # do something with each_line
f.close()
will the line held in memory be discarded by the next iteration of the loop?
Yes: memory used by objects that are no longer reachable is reclaimed by the garbage collector (in CPython, mostly via reference counting, as soon as the last reference disappears). Iterating a file object yields one line at a time; once the loop variable is rebound to the next line, the previous line object becomes unreachable and can be freed. So you won't run out of memory reading a file line by line.