I'm working on a website with 15,000 visits per day. We run paid ads, and some of them are important (they are for banks) with content that will never change — at some point the advertisers announced new openings with repeated content (the jobs and requirements never change :| ). Now we need to get Googlebot to recrawl those URLs, so we updated the content of the target pages with different wording, but Googlebot doesn't crawl them as fast as we need. Fetch as Google is an option, but there are a lot of ads and we need an automatic way to do this. What is your advice?
Unfortunately, what you want is not directly possible: Google decides how often to recrawl a page based on its popularity, the number of backlinks, and how valuable it thinks the page is to its users. Short of using Fetch as Google, you can't force Google to recrawl a page.
However, there is a workaround. Create a new page with the new content at a new URL (containing a recent date, for example, but otherwise the same path), and have the old page redirect to it with a 301, or set a canonical link from the old page to the new one. This should significantly accelerate the indexing of the new page while dropping the old one fairly quickly, and it passes most of the PageRank from the old page to the new page. Make sure the new page is added to your sitemap.xml too.
That will achieve what you want automatically (pretty much).
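To give an idea of how the automation could look, here is a minimal sketch in Python using Flask, purely as an example — the URL paths, the dated-slug scheme, and the `add_to_sitemap` helper are all hypothetical assumptions, to be adapted to your own stack and ads database:

```python
# Sketch: 301-redirect stale ad URLs to freshly dated copies and
# register the new URLs in sitemap.xml. Paths and helper names are
# illustrative assumptions, not a prescribed implementation.
import xml.etree.ElementTree as ET
from datetime import date

from flask import Flask, redirect

app = Flask(__name__)

# Map each stale ad URL to its refreshed, dated replacement.
# In practice this mapping would be generated from your ads database.
REDIRECTS = {
    "/ads/bank-offer": f"/ads/bank-offer-{date.today():%Y-%m-%d}",
}

@app.route("/ads/<slug>")
def ad(slug):
    old_path = f"/ads/{slug}"
    if old_path in REDIRECTS:
        # A 301 tells Googlebot the page has moved permanently, so it
        # crawls and indexes the new URL and drops the old one.
        return redirect(REDIRECTS[old_path], code=301)
    return f"Ad page for {slug}"  # placeholder for the real ad template

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", SITEMAP_NS)

def add_to_sitemap(sitemap_path, new_url):
    """Append a <url> entry with a fresh <lastmod> to the sitemap."""
    tree = ET.parse(sitemap_path)
    url = ET.SubElement(tree.getroot(), f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = new_url
    ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = date.today().isoformat()
    tree.write(sitemap_path, xml_declaration=True, encoding="utf-8")
```

If you prefer the canonical-link variant instead of a 301, keep the old page serving a 200 and add a `<link rel="canonical" href="...">` in its head pointing at the new URL; the 301 is usually the stronger signal, though.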