
cURL or wget for measuring page load time?


What's the best/most accurate way to measure a page's load time, including the time to load all of the page's resources? (Basically, I'm trying to get a load time close to what a real end user would experience.)

Is it better to use wget or cURL for this type of task? (The operating system in use will be Windows, due to other dependencies.)


Solution

  • You can download all the resources requested by a page with wget, using the -p (--page-requisites) option:

    wget -p https://www.example.com/
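
    To turn that into a rough end-to-end number, you can simply time the wget run with your shell. A minimal sketch for PowerShell on Windows (assuming GNU wget is installed as wget.exe; PowerShell may alias the bare name wget to Invoke-WebRequest, which has no -p option):

    # Time a full page-requisites download end to end (rough: includes disk writes)
    Measure-Command { wget.exe -p -q https://www.example.com/ }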
    

    curl doesn't parse HTML, so it can't be used for this. It will just fetch the initial page's HTML; if that HTML references images, CSS, or JS files, curl won't know about them and therefore won't download any of them.
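
    For completeness: curl can still time that single request through its --write-out variables, but the number covers only the initial HTML document and none of its subresources. A rough sketch for Windows (NUL is the cmd.exe equivalent of /dev/null):

    curl -s -o NUL -w "%{time_total}\n" https://www.example.com/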

    wget won't be a very accurate measure of the page's load time if the page requests resources through JavaScript, because wget doesn't parse or execute JavaScript. A more accurate way to get user-perceived load times is to open the page in Chrome and see how long it takes; the Network tab in the DevTools gives you a detailed breakdown. If you want to automate it, you can drive headless Chrome through Puppeteer, something like this:

    const puppeteer = require('puppeteer');
    
    (async () => {
      const browser = await puppeteer.launch();
      // Reuse the tab Chrome opens on launch instead of creating a new one
      const page = (await browser.pages())[0];
    
      // page.goto resolves once the load event has fired (the default waitUntil)
      await page.goto('https://example.com/');
    
      // Read the Navigation Timing values from inside the page.
      // page.evaluate returns a Promise, so it needs to be awaited.
      const loadTime = await page.evaluate(
        () => window.performance.timing.loadEventEnd - window.performance.timing.navigationStart
      );
      console.log(loadTime);
    
      await browser.close();
    })();
    

    Even this won't be perfectly accurate across repeat runs, because of caching: DNS results, TLS sessions, and resources can all be reused by the browser or the operating system.
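
    If you want repeat runs to be comparable, Puppeteer can at least turn off the browser's HTTP cache for a page before navigating (a minimal sketch, to go inside the async function above; it doesn't help with OS-level DNS caching or TLS session reuse):

    // Disable the HTTP cache so repeat measurements don't get faster
    // just because resources were cached by an earlier run
    await page.setCacheEnabled(false);
    await page.goto('https://example.com/', { waitUntil: 'load' });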