Checking crawl errors in Google Webmaster Tools, I see hundreds of errors on my website, all caused by bad links from Scribd to URLs that have never existed on my site, e.g. URLs like
jwsbook/BookQuote
TR/xhtml1/DTD/xhtml1-transitional.dtd
x3d/specifications/vrml/ISO_IEC_14772-All/
etc...
The number of these bad links keeps increasing over time; it has grown by about 200 over the last three months!
Does this harm my page ranking? And is there anything I can do to resolve this problem?
Google is usually pretty good at catching flotsam like this, so I wouldn't worry too much.
If you are still concerned, Google introduced the disavow links tool last year. Submitting a disavow file tells Google that the listed links should not be counted as incoming links to your site.
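If you go that route, the tool accepts a plain-text file listing individual URLs or whole domains to disavow. A minimal sketch (the Scribd domain matches your case; the individual URL is a placeholder):

```
# Lines starting with "#" are comments and are ignored.
# Disavow every link from an entire domain:
domain:scribd.com

# Or disavow a single linking page:
http://example.com/some-bad-link-page.html
```

You upload the file through the disavow tool in Webmaster Tools; note that Google treats it as a strong suggestion and may take weeks to process it during recrawling.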