Tags: parallel-processing, python-requests, python-multiprocessing, python-multithreading

How to speed up requests queries?


I basically want to brute-force multiple links to check whether a file exists at any of them. Is there any way to increase the speed of the requests, perhaps by sending multiple requests at the same time? It's currently too slow.

import requests

for no in range(100000, 500000):
    part1 = "https://url/"
    part2 = "_[12시.jpg"
    url = part1 + str(no) + part2
    response = requests.get(url)
    if response.status_code == 200:
        print(url)
        break
    elif response.status_code == 404:
        print('Not Found @', no)

Solution

  • Without going into the complexities of sending requests concurrently, here's an approach that will already speed up your calls: use requests.Session, which reuses the underlying TCP connection (HTTP keep-alive) instead of opening a new one for every request. A sketch of a concurrent approach follows after the code.

    import requests
    
    
    # A single Session reuses the underlying TCP connection across requests.
    s = requests.Session()
    
    for no in range(100000, 500000):
        url = f"https://url/{no}_[12시.jpg"
        response = s.get(url)
    
        if response.status_code == 200:
            print(url)
            break
    
        elif response.status_code == 404:
            print(f"Not Found @{no}")