python, python-asyncio, aiohttp

Why am I getting a "Task was destroyed but it is pending" error in Python asyncio?


I use asyncio and the beautiful aiohttp library. The main idea is that I make a request to a server (it returns links) and then I want to download the files from all of those links in parallel (something like in the example).

Code:

import aiohttp
import asyncio

@asyncio.coroutine
def downloader(file):
    print('Download', file['title'])
    yield from asyncio.sleep(1.0) # some actions to download
    print('OK', file['title'])


def run():
    r = yield from aiohttp.request('get', 'my_url.com', True)
    raw = yield from r.json()
    tasks = []
    for file in raw['files']:
        tasks.append(asyncio.async(downloader(file)))
        asyncio.wait(tasks)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())

But when I try to run it, I get many "Download ..." outputs and

Task was destroyed but it is pending!

And there is no "OK <filename>" output at all.

How can I fix that?


Solution

  • You forgot to yield from the call to asyncio.wait. You also probably have the indentation on it wrong; you only want to run it after you've iterated over the entire raw['files'] list. Here's the complete example with both mistakes fixed:

    import aiohttp
    import asyncio
    
    @asyncio.coroutine
    def downloader(file):
        print('Download', file['title'])
        yield from asyncio.sleep(1.0) # some actions to download
        print('OK', file['title'])
    
    @asyncio.coroutine
    def run():
        r = yield from aiohttp.request('get', 'my_url.com', True)
        raw = yield from r.json()
        tasks = []
        for file in raw['files']:
            tasks.append(asyncio.async(downloader(file)))
        yield from asyncio.wait(tasks)
    
    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        loop.run_until_complete(run())
    

    Without the yield from, run returns as soon as it has finished iterating over the list of files, which means your script exits, the still-pending downloader tasks are destroyed, and the warning you saw is printed.
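
    As a side note, asyncio.wait returns two sets of tasks, done and pending, so once run resumes you can also check each download for exceptions instead of ignoring them. A minimal sketch in the same generator-based style, replacing the last line of run above:

    done, pending = yield from asyncio.wait(tasks)
    for task in done:
        if task.exception() is not None:  # the downloader raised instead of finishing
            print('Download failed:', task.exception())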
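
    For readers on current Python: the @asyncio.coroutine / yield from style used here was deprecated in Python 3.8 and removed in 3.11, and aiohttp code today typically uses aiohttp.ClientSession with async/await. The same fan-out pattern is usually written with asyncio.gather. A rough sketch, assuming a placeholder URL and the same JSON shape as in the question:

    import aiohttp
    import asyncio

    async def downloader(file):
        print('Download', file['title'])
        await asyncio.sleep(1.0)  # placeholder for the real download work
        print('OK', file['title'])

    async def run():
        async with aiohttp.ClientSession() as session:
            async with session.get('http://my_url.com') as r:  # placeholder URL
                raw = await r.json()
        # schedule all downloads concurrently and wait until every one finishes
        await asyncio.gather(*(downloader(f) for f in raw['files']))

    if __name__ == '__main__':
        asyncio.run(run())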