Tags: reactjs, amazon-s3, react-router, amazon-cloudfront, google-crawlers

Create-react-app with react-router v3 hosted on S3 not working with "fetch as google"


I currently have a React app built with create-react-app, using react-router v3, hosted on S3 through CloudFront. The app is the view for a PHP API hosted elsewhere. react-router is set up to use browserHistory.
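For reference, a minimal sketch of the routing setup in question (react-router v3 with browserHistory; the component names and the `detail/:id` route here are placeholders, not the actual app's routes):

```jsx
import React from 'react';
import { render } from 'react-dom';
import { Router, Route, IndexRoute, browserHistory } from 'react-router';
// App, Home, and Detail are hypothetical components standing in for the real views.
import { App, Home, Detail } from './components';

// browserHistory produces clean URLs like /detail/1. Because there is no
// server rendering those paths, S3/CloudFront must serve index.html for
// every route, or direct navigation (and crawlers) will hit a 404.
render(
  <Router history={browserHistory}>
    <Route path="/" component={App}>
      <IndexRoute component={Home} />
      <Route path="detail/:id" component={Detail} />
    </Route>
  </Router>,
  document.getElementById('root')
);
```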

Currently we are trying to set up the app so that it can be crawled by Google, and are testing this with Google Webmaster Tools and "fetch as google".

The homepage fetches without a problem, but any internal page fails to render at all and returns a "not found".

The site also still shows a 404 error in the console when navigating directly to a route in a new tab (although the page itself loads as expected).

What I've tried so far:

1. Importing babel-polyfill at the entry point for the Googlebot.
2. Setting up CloudFront custom error pages to redirect 404 responses to /index.html with a 200 status.
3. Setting the S3 error document to index.html.

From my reading, Google shouldn't require server-side rendering just to crawl the site (SEO is not a concern for us), but none of the other solutions I've found online seem to solve the problem.

Will I need to make the whole app handle SSR, or do something simple like https://github.com/facebook/create-react-app/blob/master/packages/react-scripts/template/README.md#serving-apps-with-client-side-routing, or are there other things I can try that will make a page crawlable without setting anything up server side?

Any help or direction to further resources would be appreciated!


Solution

  • It turned out the solution was pretty simple. In the CloudFront distribution, set the custom error pages to send 404 errors to the target "/" with an HTTP response code of 200.

    A lot of other answers give the target as "/index.html"; if that doesn't work, try "/" instead.
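For anyone configuring this with the AWS API or CLI rather than the console, the same setting corresponds roughly to the fragment below inside the distribution config (field names follow CloudFront's `CustomErrorResponses` shape; the caching TTL value is just an example):

```json
{
  "CustomErrorResponses": {
    "Quantity": 1,
    "Items": [
      {
        "ErrorCode": 404,
        "ResponsePagePath": "/",
        "ResponseCode": "200",
        "ErrorCachingMinTTL": 300
      }
    ]
  }
}
```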