My site has been successfully verified in Google Webmaster Tools, and crawler access to my robots.txt returns 200 (Success). However, when I check "Crawl errors", nearly every page is listed as "unreachable", including the domain's main page itself. The only pages that get crawled without errors are attachment/file pages (e.g. PDF, XLS, JPG, etc.). This is really strange.
My site is built with Ruby on Rails and uses a MySQL database.
Do the pages take a long time to render? I suspect Google's crawler gives up if a page takes too long to respond. Consider putting Varnish in front of public pages that are expensive to render and don't contain any user-specific or dynamic content.
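If you go that route, Rails can tell Varnish (or any HTTP cache) which responses are safe to cache by setting standard `Cache-Control` headers with the built-in `expires_in` helper. Here's a minimal sketch; the `PagesController` and `Page` names are hypothetical stand-ins for your own public pages:

```ruby
# app/controllers/pages_controller.rb
# Sketch only: PagesController/Page are hypothetical names.
# expires_in sets Cache-Control headers so a cache like Varnish
# can serve these pages without hitting Rails on every request.
class PagesController < ApplicationController
  def show
    # Publicly cacheable for 10 minutes; slow renders then only
    # happen once per cache expiry instead of on every crawl.
    expires_in 10.minutes, public: true

    @page = Page.find(params[:id])
  end
end
```

You can verify the header is coming through with `curl -I http://yoursite.com/pages/1` and timing a full fetch with `curl -s -o /dev/null -w "%{time_total}\n" http://yoursite.com/` to see whether responses are slow enough for the crawler to give up.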