python · selenium · facebook · web-scraping · cookies

Allow facebook cookies to track me through multiple sessions in selenium


I'm scraping data with selenium for academic research that will test how certain user behaviors across facebook and the web affect the ads those users see.

For this, I need a kind of fake user that will first interact with facebook, then visit some sites while carrying facebook cookies (allowing facebook to keep tracking its behavior), and then go back to facebook.

I haven't done much web development, and it seems I'm confused about how exactly to keep and load cookies for this scenario.

I've been trying to save and load cookies using the following code snippets:

import pickle

# saving (the file must be opened in binary mode for pickle)
with open("cookies.pkl", "wb") as cookiesfile:
    pickle.dump(driver.get_cookies(), cookiesfile)

# loading
with open("cookies.pkl", "rb") as cookiesfile:
    cookies = pickle.load(cookiesfile)
for cookie in cookies:
    driver.add_cookie(cookie)

On facebook, this either triggers an error popup telling me to reload or redirects me to the login page. On other sites, even ones that explicitly state they use facebook trackers, it raises an InvalidCookieDomainException.

What am I doing wrong?


Solution

  • Instead of handling cookies yourself, I would recommend using ChromeOptions to persist a browser session. This also preserves local storage and other site data alongside cookies.

    The next time you open a browser session, the Chrome instance will load the previous "profile" and continue maintaining it.

    options = webdriver.ChromeOptions()
    options.add_argument('user-data-dir={}'.format(<path_to_a_folder_reserved_for_browser_data>))
    
    driver = webdriver.Chrome(executable_path=<chromedriver_exe_path>, options=options)
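
    If you still want the cookie-file approach from the question, note that Selenium's `driver.add_cookie()` only accepts a cookie for the domain the browser is currently on, so you must `driver.get(...)` the relevant site *before* loading that site's cookies; adding them while on a different domain is what raises the `InvalidCookieDomainException`. A minimal sketch of the save/load plumbing (the helper names, file path, and sample cookie dict below are illustrative, not from the question):

    ```python
    import pickle


    def save_cookies(cookies, path):
        """Pickle a list of cookie dicts, as returned by driver.get_cookies()."""
        with open(path, "wb") as f:
            pickle.dump(cookies, f)


    def load_cookies(path):
        """Load a previously pickled list of cookie dicts.

        Each dict can then be passed to driver.add_cookie(cookie) -- but only
        after driver.get() has navigated to that cookie's domain.
        """
        with open(path, "rb") as f:
            return pickle.load(f)


    # Round trip with a fake cookie dict standing in for driver.get_cookies():
    cookies = [{"name": "sessionid", "value": "abc123", "domain": ".example.com"}]
    save_cookies(cookies, "cookies.pkl")
    print(load_cookies("cookies.pkl"))
    ```

    Even so, the persistent profile above is usually the more robust option for this experiment, since facebook's session state is not limited to cookies.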