Tags: web-crawler, google-crawlers

How do I stop the Google crawler from crawling my subdomains?


I would like to know how I can disallow Google from crawling my subdomains.

I made a screenshot of my webspace folders. The awesom media folder is the one where the main site www.awesom-media.de lives.

The other folders are subdomains. What I want is that Google should not crawl those, but I don't know how.

I don't have a robots.txt in the awesom media folder, but as you can see there is one in the / (root) part. The content of that robots.txt is:

    User-agent: *
    Disallow:

and that's it.

How can I tell Google not to crawl the subdomains?


Solution

  • In case all your subdomains route directly to specific folders (e.g. something like automagazin.awesom-media.de uses the folder auto-magazin), just place a robots.txt containing

    User-agent: *
    Disallow: /
    

    in every folder of the subdomains you want to block for Google. I guess these are auto-magazin and future-magazin (and maybe more).

    Currently you put it only into the root folder, which Google probably cannot see at all when it crawls a subdomain. Just try to load [subdomain].awesom-media.de/robots.txt and see whether it serves a robots.txt or not.
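The first step above can be sketched in a few lines of Python; the folder names `auto-magazin` and `future-magazin` are assumptions taken from the screenshot, so adjust them to your actual webspace layout:

```python
from pathlib import Path

# Sketch: drop a blocking robots.txt into each subdomain's document root.
# The folder names here are assumptions from the screenshot, not facts.
SUBDOMAIN_DIRS = ["auto-magazin", "future-magazin"]
ROBOTS_TXT = "User-agent: *\nDisallow: /\n"

for name in SUBDOMAIN_DIRS:
    folder = Path(name)
    folder.mkdir(exist_ok=True)  # on the real webspace these folders already exist
    (folder / "robots.txt").write_text(ROBOTS_TXT)
```

You could just as well create the two files by hand in your FTP client; the point is only that each subdomain's document root gets its own copy.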
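To convince yourself that these two lines really block Googlebot, you can feed them to a robots.txt parser offline, e.g. Python's standard-library `urllib.robotparser` (the subdomain URL below is just an example hostname):

```python
from urllib.robotparser import RobotFileParser

# The exact rules recommended above for the blocked subdomains.
rules = ["User-agent: *", "Disallow: /"]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot matches the wildcard user-agent, so no URL may be fetched.
print(rp.can_fetch("Googlebot", "http://auto-magazin.awesom-media.de/"))  # False
```

Note that `Disallow:` with an empty value (your current root robots.txt) means the opposite: everything is allowed.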