python-3.x, concurrent.futures

Python3 Concurrent.Futures with Requests


I'm trying to implement concurrent requests to speed up checking a list of URLs, but it doesn't seem to be working with my code: it's still checking them one by one.

for domain in list:
    try:
        follow_url = requests.head(f'http://{domain}', allow_redirects=True, timeout=60)

        with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
            executor.submit(follow_url)

        with open("alive.txt", "a") as file:
            file.write(f'{domain}\n')

    except Exception as e:
        print(e)

Solution

  • You are not applying it correctly. The requests.head call runs synchronously inside the loop before anything is submitted, and you create a new ThreadPoolExecutor on every iteration (and then submit the already-fetched response rather than a callable), so the URLs are still checked one by one. A correct way could be like this:

    import requests
    from concurrent.futures import ThreadPoolExecutor

    def parallel_req(domain):
        try:
            # HEAD request, following redirects; raises on connection errors and timeouts
            follow_url = requests.head(f'http://{domain}', allow_redirects=True, timeout=60)
            # only reached if the request succeeded
            with open("alive.txt", "a") as file:
                file.write(f'{domain}\n')
        except requests.exceptions.RequestException as e:
            print(e)

    # map() submits one task per domain; the pool's threads run the requests concurrently
    with ThreadPoolExecutor() as e:
        e.map(parallel_req, domains)
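
  • If you prefer to keep all file writes in the main thread instead of writing from the workers, a variant with submit() and as_completed() can hand each live domain back as it finishes. This is a minimal sketch, assuming a check helper and a placeholder domains list (neither is part of the code above):

    import requests
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def check(domain):
        # return the domain if it responds, otherwise None
        try:
            requests.head(f'http://{domain}', allow_redirects=True, timeout=60)
            return domain
        except requests.exceptions.RequestException:
            return None

    domains = ["example.com", "example.org"]  # placeholder list

    with ThreadPoolExecutor(max_workers=20) as executor:
        futures = [executor.submit(check, d) for d in domains]
        with open("alive.txt", "a") as file:
            for future in as_completed(futures):
                domain = future.result()
                if domain is not None:
                    # only the main thread writes, so no locking is needed
                    file.write(f'{domain}\n')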