
asyncio tasks using aiohttp.ClientSession


I'm using Python 3.7 and trying to build a crawler that can fetch from multiple domains asynchronously. I'm using asyncio and aiohttp for this, but I'm running into problems with aiohttp.ClientSession. This is my reduced code:

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        print(await response.text())

async def main():
    loop = asyncio.get_event_loop()
    async with aiohttp.ClientSession(loop=loop) as session:
        cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
        asyncio.gather(*cwlist)

if __name__ == "__main__":
    asyncio.run(main())

The thrown exception is this:

_GatheringFuture exception was never retrieved future: <_GatheringFuture finished exception=RuntimeError('Session is closed')>

What am I doing wrong here?


Solution

  • You forgot to await the asyncio.gather result:

        async with aiohttp.ClientSession(loop=loop) as session:
            cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
            await asyncio.gather(*cwlist)
    

    If you ever have an async with block containing no await expressions, you should be fairly suspicious: here the session's context manager exits, and closes the session, before the tasks that use it have finished.
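
    The failure mode generalizes beyond aiohttp: tasks created inside an async with block but not awaited there can outlive the resource the block manages. The sketch below illustrates this with plain asyncio and a hypothetical Resource class standing in for aiohttp.ClientSession (no network needed), under the assumption that the resource refuses requests once closed, just like a closed session:

    ```python
    import asyncio

    class Resource:
        """Stand-in for aiohttp.ClientSession: only usable inside its context."""
        def __init__(self):
            self.closed = False

        async def __aenter__(self):
            return self

        async def __aexit__(self, *exc):
            self.closed = True  # like ClientSession, close on context exit

        async def get(self, n):
            await asyncio.sleep(0)  # yield to the loop, like a network round-trip
            if self.closed:
                raise RuntimeError("Session is closed")
            return n * 2

    async def main(await_inside: bool):
        async with Resource() as res:
            tasks = [asyncio.create_task(res.get(n)) for n in (1, 2, 3)]
            if await_inside:
                # Correct: tasks finish while the resource is still open.
                return await asyncio.gather(*tasks)
        # Wrong: the block already exited and closed the resource,
        # so every task now fails with RuntimeError.
        return await asyncio.gather(*tasks, return_exceptions=True)

    print(asyncio.run(main(True)))   # [2, 4, 6]
    print(asyncio.run(main(False)))  # three RuntimeError('Session is closed')
    ```

    The same reasoning explains the original traceback: without the await, main returns, the async with closes the session, and the still-pending fetch tasks fail with "Session is closed" inside a gather future that nobody ever retrieves.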