I was asked to use a Python web-scraping script that was created in Google Colab, but I'm getting the following error:
SessionNotCreatedException: Message: session not created: This version of ChromeDriver only supports Chrome version 90 Current browser version is 117.0.5938.88 with binary path /root/.cache/selenium/chrome/linux64/117.0.5938.88/chrome
Am I correct in thinking that, because Google Colab is a hosted coding platform, my local ChromeDriver version is irrelevant? How can I fix this?
If it helps, I believe this is the function causing the error...
def web_driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--verbose")
    options.add_argument('--no-sandbox')
    options.add_argument('--headless')
    options.add_argument('--disable-gpu')
    options.add_argument("--window-size=1920, 1200")
    options.add_argument('--disable-dev-shm-usage')
    driver = webdriver.Chrome(options=options)
    return driver
You might be able to use a different driver manager, SeleniumBase, to get around the issue, since it downloads a ChromeDriver that matches the installed Chrome version automatically. Install it with pip install seleniumbase, then run your script with python as usual.
Here's a simple example:
from seleniumbase import Driver
driver = Driver(browser="chrome")
driver.get("https://www.google.com")
driver.quit()
Here's your updated script with the options you specified. Other options (such as --no-sandbox and --disable-gpu) are already set by default:
from seleniumbase import Driver
def web_driver():
    driver = Driver(browser="chrome", headless=True)
    driver.set_window_size(1920, 1200)
    return driver