Tags: python, asynchronous, python-requests, python-asyncio, aiohttp

How to avoid "too many requests" error with aiohttp


Here's a snippet of my parser code. It makes 120 requests asynchronously, but every response comes back as a 429 "Too many requests" error. How do I make it "slower" so the API won't reject me?

def get_tasks(self, session):
    tasks = []
    for url in self.list_of_urls:
        tasks.append(asyncio.create_task(session.get(url, ssl=False)))
    return tasks


async def get_symbols(self):
    print('Parsing started')
    async with aiohttp.ClientSession() as session:
        tasks = self.get_tasks(session)
        responses = await asyncio.gather(*tasks)
        for response in responses:
            response = await response.json()
            print(response)

Error:

{'message': 'Too many requests'}
{'message': 'Too many requests'}
{'message': 'Too many requests'}
{'message': 'Too many requests'}
{'message': 'Too many requests'}
...

Solution

  • Use an asyncio.Semaphore to cap how many requests are in flight at the same time:

    # Initialize a semaphore with a limit of 3 (at most 3 requests in flight at once)
    limit = asyncio.Semaphore(3)


    async def make_one_request(session, url):
        # The semaphore is held while the request is sent, so only 3 requests
        # run concurrently; the remaining tasks wait here for a free slot
        async with limit:
            return await session.get(url, ssl=False)


    def get_tasks(self, session):
        tasks = []
        for url in self.list_of_urls:
            tasks.append(asyncio.create_task(make_one_request(session, url)))
        return tasks
    
    
    async def get_symbols(self):
        print("Parsing started")
        async with aiohttp.ClientSession() as session:
            tasks = self.get_tasks(session)
            responses = await asyncio.gather(*tasks)
            for response in responses:
                response = await response.json()
                print(response)
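
For reference, here is one way the pieces above could fit together as a single, self-contained script. This is only a sketch: the SymbolParser class name, the max_concurrency parameter, and the example URL are illustrative assumptions, not part of the original code. It also reads the JSON body inside the semaphore, a small variation that keeps the entire request/response cycle under the concurrency cap.

    import asyncio

    import aiohttp


    class SymbolParser:
        # Hypothetical class pulling the snippets above together;
        # the name, constructor arguments and example URL are illustrative.

        def __init__(self, list_of_urls, max_concurrency=3):
            self.list_of_urls = list_of_urls
            self.max_concurrency = max_concurrency

        async def fetch_one(self, session, limit, url):
            # Hold the semaphore for the whole request/response cycle so that
            # at most max_concurrency requests are in flight at any moment
            async with limit:
                async with session.get(url, ssl=False) as response:
                    return await response.json()

        async def get_symbols(self):
            print("Parsing started")
            # Create the semaphore inside the running event loop (safer on
            # older Python versions than creating it at import time)
            limit = asyncio.Semaphore(self.max_concurrency)
            async with aiohttp.ClientSession() as session:
                tasks = [asyncio.create_task(self.fetch_one(session, limit, url))
                         for url in self.list_of_urls]
                for payload in await asyncio.gather(*tasks):
                    print(payload)


    # Example usage (replace with the real API endpoints):
    # asyncio.run(SymbolParser(["https://httpbin.org/json"] * 5).get_symbols())

If the API enforces a requests-per-second budget rather than a concurrency limit, a semaphore alone may not be enough, and you may also need to add a small delay or honor the Retry-After header when a 429 comes back.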