Tags: python, python-3.x, operating-system, subprocess, python-asyncio

Asyncio: How to handle multiple open files OS error


I am trying to run ~500 async subprocesses. I am passing the files as the list p_coros in the main function below.

import asyncio
from asyncio.subprocess import PIPE, STDOUT

async def run_check(shell_command):
    p = await asyncio.create_subprocess_shell(
        shell_command, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
    fut = p.communicate()
    try:
        pcap_run = await asyncio.wait_for(fut, timeout=5)
    except asyncio.TimeoutError:
        p.kill()
        await p.communicate()

def get_coros():
    coros = []
    for pcap_loc in print_dir_cointent():
        for pcap_check in get_pcap_executables():
            tmp_coro = run_check('{args}'
                .format(e=sys.executable, args=args))
            if tmp_coro != False:
                coros.append(tmp_coro)
    return coros

async def main():
    ## Here p_coros has over 500 files
    p_coros = get_coros()
    for f in asyncio.as_completed(p_coros):
        res = await f


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

I think the issue is asyncio.as_completed, since it schedules all the coroutines at once and so tries to open all the files in parallel: if I remove asyncio.as_completed and await the coroutines sequentially, the script works correctly but takes a lot of time. I want to handle the OSError(24, 'Too many open files') without losing much speed.

Logs:

Exception ignored when trying to write to the signal wakeup fd:
BlockingIOError: [Errno 11] Resource temporarily unavailable

ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending coro=<ClassificationCheck.run_check() running at ./regression.py:74> wait_for=<Future finished exception=RuntimeError('Event loop is closed',)> cb=[as_completed.<locals>._on_completion() at /usr/lib/python3.5/asyncio/tasks.py:478]>

Traceback:

Traceback (most recent call last):
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
    result = coro.send(None)
  File "./regression.py", line 74, in run_check
    stdin=PIPE, stdout=PIPE, stderr=STDOUT)
  File "/usr/lib/python3.5/asyncio/subprocess.py", line 197, in create_subprocess_shell
    stderr=stderr, **kwds)
  File "/usr/lib/python3.5/asyncio/base_events.py", line 1049, in subprocess_shell
    protocol, cmd, True, stdin, stdout, stderr, bufsize, **kwargs)
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 184, in _make_subprocess_transport
    **kwargs)
  File "/usr/lib/python3.5/asyncio/base_subprocess.py", line 40, in __init__
    stderr=stderr, bufsize=bufsize, **kwargs)
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 640, in _start
    stdin, stdin_w = self._loop._socketpair()
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 53, in _socketpair
    return socket.socketpair()
  File "/usr/lib/python3.5/socket.py", line 478, in socketpair
    a, b = _socket.socketpair(family, type, proto)
OSError: [Errno 24] Too many open files
ERROR:asyncio:Task exception was never retrieved
future: <Task finished coro=<ClassificationCheck.run_check() done, defined at ./regression.py:72> exception=OSError(24, 'Too many open files')>

Solution

  • Because I was passing a lot of files to work on asynchronously, the OS ran out of file descriptors and raised the OS error. I handled it by creating a list of lists, where each sub-list contains a fixed number of PCAPs small enough not to trigger the error, and then passing one sub-list at a time.

    So I learned that it is important to close already opened files before going ahead to work on more files.

    def get_coros(pcap_list):
        coros = []
        for pcap_loc in pcap_list:
            for pcap_check in get_pcap_executables():
                tmp_coro = run_check('{args}'
                    .format(e=sys.executable, args=args))
                if tmp_coro != False:
                    coros.append(tmp_coro)
        return coros
    
    async def main():
        pcap_list_gen = print_dir_cointent() # Passing a list of lists
        for pcap_list in pcap_list_gen:
            p_coros = get_coros(pcap_list)
            for f in asyncio.as_completed(p_coros):
                res = await f