The following code always gives me the error
OSError: [Errno 24] Too many open files
once 508 processes have been executed, which suggests every process leaves two file descriptors open until I reach the system's limit:
from multiprocessing import Process

def do_job(task):
    print("Task no " + str(task))

def main():
    number_of_processes = 1000
    processes = []

    # creating processes
    for i in range(number_of_processes):
        p = Process(target=do_job, args=(i,))
        processes.append(p)

    for p in processes:
        p.start()
        p.join()

    return True

if __name__ == "__main__":
    main()
If I run the same code in the VS Code terminal, it completes without issue. I have looked at many similar threads online but have yet to find a working solution.
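For context, the number 508 lines up with the per-process open-file limit, which is commonly 1024 by default on Linux (508 processes × 2 descriptors ≈ 1016, plus the descriptors the interpreter itself holds). The current soft limit can be checked with ulimit:

```shell
# print the soft limit on open file descriptors for the current shell
ulimit -n
```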
The issue was that the list kept a live reference to every process even after it finished, so the pipe file descriptors each Process object holds were never released. Popping the processes off the list drops that reference, allowing the objects to be garbage-collected and their descriptors closed, which solved the issue:
from multiprocessing import Process

def do_job(task):
    print("Task no " + str(task))

def main():
    number_of_processes = 1000
    processes = []

    # creating processes
    for i in range(number_of_processes):
        p = Process(target=do_job, args=(i,))
        processes.append(p)

    for i in range(number_of_processes):
        p = processes.pop(0)
        p.start()
        p.join()

    return True

if __name__ == "__main__":
    main()
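An equivalent fix, assuming Python 3.7 or newer, is to call Process.close() after joining. This releases the process object's resources (including its pipe/sentinel file descriptors) explicitly, rather than relying on the list reference being dropped and garbage collection running. A minimal sketch (the number_of_processes parameter is added here for convenience and was not in the original):

```python
from multiprocessing import Process

def do_job(task):
    print("Task no " + str(task))

def main(number_of_processes=1000):
    for i in range(number_of_processes):
        p = Process(target=do_job, args=(i,))
        p.start()
        p.join()
        # explicitly release the Process object's file descriptors;
        # available since Python 3.7, valid once the process has finished
        p.close()
    return True

if __name__ == "__main__":
    main()
```

With close() called after each join, no descriptors accumulate regardless of whether the Process objects are kept in a list.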