Tags: bots, googlebot, google-crawlers

Cons of not showing graphs if the user agent is Googlebot


We have graphs on the page that load somewhat slowly, which matters for Google's crawling. Since the graphs don't contain any SEO-relevant content, is it advisable to add a user agent check that skips loading the graphs when the visitor is Googlebot?

Our main objective is to speed up crawling by reducing how much the page loads for the bot.
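For illustration, this is only a minimal sketch of the kind of check I mean, assuming the page is assembled server-side in Node/TypeScript; `isGooglebot`, `renderPageWithGraphs`, and `renderPageWithoutGraphs` are hypothetical placeholders for our own code, not anything Google provides.

```typescript
// Minimal sketch of the user agent check in question (server-side).
// renderPageWithGraphs / renderPageWithoutGraphs stand in for however
// the page is actually built.

function isGooglebot(userAgent: string | undefined): boolean {
  // Googlebot identifies itself with "Googlebot" in its User-Agent header.
  return /Googlebot/i.test(userAgent ?? "");
}

function renderPageWithGraphs(): string {
  return "<html><!-- full page, including graph scripts --></html>";
}

function renderPageWithoutGraphs(): string {
  return "<html><!-- same page, graph scripts omitted --></html>";
}

// In a request handler, the User-Agent header would decide which version is served:
function handleRequest(userAgent: string | undefined): string {
  return isGooglebot(userAgent) ? renderPageWithoutGraphs() : renderPageWithGraphs();
}
```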

Are there any cons to this approach?


Solution

  • The main con is that, generally speaking, Google does not like you showing content only to its bot or hiding content from it, because it may judge that as an attempt to 'cheat' the crawler.

    Especially because page speed is nowadays a ranking factor for Google, speeding up the crawl by disabling content for the crawler could be seen as trying to boost your position by serving a dedicated "bot crawl" version of the page. In theory, Google could treat this as black-hat SEO, similar to how it views hidden text and dedicated bot landing pages.

    Google wants to see the page as the user sees it, so that it can rank sites as well as possible (by its own definition of "best"). If users experience a slow-loading page, Google wants to see that too and use it as ranking information, covering both crawl speed and the speed users experience.

    That being said, if you use a script for it, it could also be treated similarly to a noscript tag; so in your specific case it's impossible to say whether it would actually hurt, or whether Google would even notice. There is no definitive answer.

    So the best answer I can give is that the con is that Google might (emphasis on might) view it as trying to cheat the bot, if (again, emphasis on if) it is discovered. Personally, I wouldn't do it; I'd look for ways to optimize the page instead, for example by lazy-loading the graphs as sketched below.
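    One common way to optimize is to lazy-load the graphs, so every visitor (bot or human) gets the same markup while the heavy rendering is deferred until the graphs scroll into view. This is only a minimal client-side sketch under that assumption; `renderGraph` and the `.graph-container` selector are placeholders for whatever graph library the page actually uses.

    ```typescript
    // Minimal lazy-loading sketch: defer graph rendering until the container
    // becomes visible, instead of hiding the graphs from Googlebot entirely.
    // renderGraph() and ".graph-container" are hypothetical placeholders.

    declare function renderGraph(container: HTMLElement): void;

    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          renderGraph(entry.target as HTMLElement); // draw the graph only when it is in view
          obs.unobserve(entry.target);              // render each graph at most once
        }
      }
    });

    document
      .querySelectorAll<HTMLElement>(".graph-container")
      .forEach((el) => observer.observe(el));
    ```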