Tags: python, selenium, webdriver, selenium-chromedriver

How to iterate through Google result pages using Selenium in Python


I'm trying to scrape Google search data using Selenium. I'm able to collect all the data, but when the last page is reached I get an error.

line 23, in <module>
    next_page = driver.find_element_by_link_text('Next')

 line 428, in find_element_by_link_text
    return self.find_element(by=By.LINK_TEXT, value=link_text)

Here is my code; the error appears only after all the data has been collected successfully.

from selenium import webdriver
from time import sleep


driver = webdriver.Chrome('./chromedriver.exe')
driver.maximize_window()
driver.get('https://www.google.com/search?q=florida+time+now')

driver.find_element_by_link_text('Change to English').click()
sleep(2)

print(driver.current_url)
print(driver.title)


while True:
    titles = driver.find_elements_by_xpath('//*[@class="LC20lb DKV0Md"]')
    for title in titles:
        link_title = title.find_element_by_xpath('.//span').text
        print(link_title)

    sleep(3)
    next_page = driver.find_element_by_link_text('Next')
    if next_page:
        next_page.click()
    else:
        break


sleep(2)
driver.close()

Solution

  • find_element_by_link_text throws a NoSuchElementException when no matching element is present. Instead, use driver.find_elements_by_link_text('Next'), which returns a list of matching elements. If the list is non-empty, click the first match; otherwise break out of the loop.

        next_page = driver.find_elements_by_link_text('Next')
        if len(next_page)>0:
            next_page[0].click()
        else:
            break
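Note that the find_element_by_* / find_elements_by_* helpers were removed in Selenium 4, where the same presence check is written with the unified find_elements call, e.g. driver.find_elements(By.LINK_TEXT, 'Next') (the By.LINK_TEXT constant is the string "link text"). Below is a minimal sketch of the pattern; the FakeLink/FakeDriver classes are hypothetical stand-ins so the logic can run without a browser, but click_next_if_present works unchanged against a real WebDriver instance:

```python
def click_next_if_present(driver):
    """Click the 'Next' link if it exists; return True if clicked.

    "link text" is the value of selenium's By.LINK_TEXT locator, so with
    a real driver this is equivalent to:
        driver.find_elements(By.LINK_TEXT, 'Next')
    """
    links = driver.find_elements('link text', 'Next')  # [] when absent
    if links:
        links[0].click()
        return True
    return False


class FakeLink:
    """Hypothetical stand-in for a WebElement; records whether it was clicked."""
    def __init__(self):
        self.clicked = False

    def click(self):
        self.clicked = True


class FakeDriver:
    """Hypothetical stand-in driver: returns its stored links for any locator."""
    def __init__(self, links):
        self._links = links

    def find_elements(self, by, value):
        return list(self._links)
```

With this helper, the pagination loop becomes: while click_next_if_present(driver): scrape the page; the loop ends naturally when no 'Next' link remains.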