python · selenium · selenium-webdriver · web-scraping · staleelementreferenceexception

Why is this code not clicking EVERY element in the list?


I am trying to go into every city one by one here, but after the program comes back from the first page it does not go into the next page and raises a StaleElementReferenceException.

This is my code:

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
from selenium.webdriver.common.action_chains import ActionChains
from webdriver_manager.chrome import ChromeDriverManager

url = "https://www.agoda.com/region/punjab-province-pk.html"

s = Service(ChromeDriverManager().install())

driver = webdriver.Chrome(service=s)
driver.implicitly_wait(5)
driver.get(url)

a = ActionChains(driver)

cities = driver.find_elements(By.CSS_SELECTOR, value='dt[data-selenium="neighbor-name"]')

for city in cities:
    print(city.text)
    a.double_click(city).perform()
    print(driver.current_url)
    driver.back()

When I only print the elements in the list, it displays all of them, but it does not print or click once I add the lines that navigate forward and back.

What am I doing wrong here?
Any help would be appreciated.


Solution

  • When you navigate to another page by clicking a city, all the web elements initially collected in the cities list on the main page become stale. In Selenium, a WebElement is a reference to a physical element in the DOM. When you come back to the main page it is re-rendered, so the previously collected references no longer point to those elements.
    To make your code work, you need to collect the cities list again after getting back to the main page.
    The following should work:

    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    wait = WebDriverWait(driver, 10)
    cities = driver.find_elements(By.CSS_SELECTOR, value='dt[data-selenium="neighbor-name"]')

    for idx, city in enumerate(cities):
        # Take the element from the most recently collected list so it is not stale
        city = cities[idx]
        print(city.text)
        a.double_click(city).perform()
        print(driver.current_url)
        driver.back()
        # Collect the cities list again after returning to the main page
        cities = wait.until(EC.presence_of_all_elements_located((By.CSS_SELECTOR, 'dt[data-selenium="neighbor-name"]')))
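    A slightly more defensive variant of the same idea (my own sketch, not part of the original answer) is to fix the iteration count up front and re-locate the whole list at the start of every pass, so no reference collected before a navigation is ever reused. It assumes the same setup as above (driver, a, wait, and the imports) and that the number and order of cities on the main page do not change between visits:

    # Sketch assuming driver, a, wait, By and EC are already set up as above.
    count = len(driver.find_elements(By.CSS_SELECTOR, 'dt[data-selenium="neighbor-name"]'))

    for idx in range(count):
        # Re-locate the list on every pass so the reference is always fresh
        cities = wait.until(EC.presence_of_all_elements_located(
            (By.CSS_SELECTOR, 'dt[data-selenium="neighbor-name"]')))
        city = cities[idx]
        print(city.text)
        a.double_click(city).perform()
        print(driver.current_url)
        driver.back()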