
How to generate a sitemap on a highly dynamic website?


Should a highly dynamic website that is constantly generating new pages use a sitemap? If so, how does a site like stackoverflow.com regenerate its sitemap? It seems like it would be a drain on precious server resources to regenerate the sitemap every time someone adds a question. Does it regenerate at set intervals (e.g. every four hours)? I'm very curious how large, dynamic websites make this work.


Solution

  • On Stack Overflow (and all Stack Exchange sites), a sitemap.xml file is maintained which contains a link to every question posted on the system. When a new question is posted, another entry is simply appended to the end of the sitemap file. Adding to the end of the file isn't very resource-intensive, even though the file itself is quite large.

    That is the only way search engines like Google can effectively crawl the site.

    Jeff Atwood talks about it in a blog post: The Importance of Sitemaps

    This is from Google's webmaster help page on sitemaps:

    Sitemaps are particularly helpful if:

    • Your site has dynamic content.
    • Your site has pages that aren't easily discovered by Googlebot during the crawl process - for example, pages featuring rich AJAX or Flash.
    • Your site is new and has few links to it. (Googlebot crawls the web by following links from one page to another, so if your site isn't well linked, it may be hard for us to discover it.)
    • Your site has a large archive of content pages that are not well linked to each other, or are not linked at all.
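The append-on-post approach described above can be sketched in a few lines. This is an illustrative example, not Stack Overflow's actual implementation: the file path, tag layout, and helper name are all assumptions. The one wrinkle with appending to XML is the closing `</urlset>` tag, so the sketch rewrites only the last few bytes of the file rather than regenerating the whole thing:

```python
import os

XML_HEADER = '<?xml version="1.0" encoding="UTF-8"?>\n'
OPEN_TAG = '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
CLOSE_TAG = b"</urlset>\n"

def append_url(path: str, loc: str, lastmod: str) -> None:
    """Append one <url> entry to a sitemap file (hypothetical helper).

    Instead of regenerating the entire (potentially huge) sitemap, we
    truncate the file just before the closing </urlset> tag, write the
    new entry, and restore the closing tag -- only the last few bytes
    of the file are ever rewritten.
    """
    # Create a minimal empty sitemap if one doesn't exist yet.
    if not os.path.exists(path):
        with open(path, "w", encoding="utf-8") as f:
            f.write(XML_HEADER + OPEN_TAG + CLOSE_TAG.decode())

    entry = f"  <url><loc>{loc}</loc><lastmod>{lastmod}</lastmod></url>\n"
    with open(path, "rb+") as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()
        # Read just the tail of the file to locate the closing tag.
        back = min(size, 64)
        f.seek(size - back)
        tail = f.read()
        pos = size - back + tail.rindex(b"</urlset>")
        # Truncate at the closing tag, append the entry, re-close.
        f.seek(pos)
        f.truncate()
        f.write(entry.encode("utf-8") + CLOSE_TAG)
```

In a real deployment this write would need to be serialized (e.g. behind a lock or a single writer queue) so concurrent posts don't interleave, and a site at Stack Overflow's scale would also split the sitemap into multiple files behind a sitemap index once it grows large.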