
Selenium GDPR NoSuchElementException


I want to scrape some data from "https://www.techadvisor.co.uk/review/wearable-tech/". I figured out that looping through the pages with BeautifulSoup does not work, which is why I tried opening the page with Selenium instead. However, I cannot locate the "Accept All" button of the GDPR consent blocker.

I tried:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

browser = webdriver.Chrome()
browser.get("https://www.techadvisor.co.uk/review/wearable-tech/")
# button = browser.find_element_by_xpath('/html/body/div/div[3]/div[5]/button[2]')
# WebDriverWait(browser, 20).until(EC.element_to_be_clickable((By.XPATH, "/html/body/div/div[3]/div[5]/button[2]"))).click()

I always receive a NoSuchElementException.

To be honest, I find the XPath really weird, but that is what Google Chrome's Inspect tool gave me.

Any suggested solution or tip is appreciated :)


Solution

  • The Accept All button is inside an iframe, so you need to switch to that iframe first in order to click it.

    Use WebDriverWait() with frame_to_be_available_and_switch_to_it() and the CSS selector below to enter the frame.

    Then use WebDriverWait() with element_to_be_clickable() and the XPath selector below to click the button.

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC
    
    browser = webdriver.Chrome()
    browser.get("https://www.techadvisor.co.uk/review/wearable-tech/")
    # The consent dialog lives in an iframe whose id starts with "sp_message_iframe",
    # so wait for that frame and switch into it before looking for the button.
    WebDriverWait(browser, 10).until(EC.frame_to_be_available_and_switch_to_it((By.CSS_SELECTOR, "iframe[id^='sp_message_iframe']")))
    # Now the button can be located and clicked inside the frame.
    WebDriverWait(browser, 10).until(EC.element_to_be_clickable((By.XPATH, "//button[text()='Accept All']"))).click()
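
    After the click you are still switched into the consent iframe, so return to the main document before scraping the page itself. The sketch below is only an illustration: the "article h3 a" selector is an assumption about the listing markup, not something taken from the site, so adjust it to what the inspector actually shows.

    # Leave the consent iframe and return to the main page.
    browser.switch_to.default_content()

    # Hypothetical follow-up: collect review links from the listing page.
    # The CSS selector here is an assumption, not the site's real markup.
    links = WebDriverWait(browser, 10).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, "article h3 a"))
    )
    for link in links:
        print(link.get_attribute("href"))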