Search code examples
python · web-scraping · proxy · google-search

How to implement proxy rotation when scraping urls with google search


Currently I am using the 'googlesearch' Python library and a function to run Google searches and store the resulting websites in a list. This is my code:

from googlesearch import search

def get_urls(tag, n, language):
    urls = [url for url in search(tag, stop=n, lang=language)][:n]
    return urls


google_urls = get_urls('future', 5000, 'it')  # 'future' is the keyword.

When I run this code I get this error:

HTTPError: HTTP Error 429: Too Many Requests

How can I handle it? How can I implement proxy rotation given my code? Thanks. If there is a more efficient approach, I'm happy to hear it.


Solution

  • This question is already answered in googlesearch library for python.

    ==> duplicate.

    You can also have a look at the Documentation.
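Beyond the linked duplicate, a general pattern for avoiding HTTP 429 is to round-robin your requests over a pool of proxies and back off when the server throttles you. The sketch below is a minimal, hedged example of that idea; the proxy addresses are placeholders (you would substitute your own working proxies), and it uses the `requests` library directly rather than the `googlesearch` library, since the rotation logic is the same for any HTTP client:

```python
from itertools import cycle
import time

# Placeholder proxy addresses -- replace with your own working proxies.
PROXIES = [
    "http://111.111.111.111:8080",
    "http://222.222.222.222:8080",
    "http://333.333.333.333:8080",
]

# cycle() yields the proxies round-robin, forever.
proxy_pool = cycle(PROXIES)

def fetch_with_rotation(url, max_attempts=3):
    """Try the URL through successive proxies, backing off on HTTP 429."""
    import requests  # third-party: pip install requests
    for attempt in range(max_attempts):
        proxy = next(proxy_pool)  # rotate to the next proxy
        try:
            resp = requests.get(
                url,
                proxies={"http": proxy, "https": proxy},
                timeout=10,
            )
            if resp.status_code == 429:
                # Throttled: wait with exponential backoff, then retry
                # through the next proxy in the pool.
                time.sleep(2 ** attempt)
                continue
            return resp
        except requests.RequestException:
            continue  # this proxy failed; move on to the next one
    return None  # all attempts exhausted

# The rotation itself is just round-robin over the pool:
print(next(proxy_pool))  # first proxy in the list
print(next(proxy_pool))  # second proxy in the list
```

Note that rotating proxies only spreads the load; adding a pause between queries (the `googlesearch` library exposes a `pause` parameter on `search()` for exactly this) is often enough to stay under Google's rate limit without any proxies at all.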