From what I know, if you use AJAX or JavaScript links on your website, it hurts SEO because Googlebot has trouble crawling your site; it basically prefers plain anchor tags.
So the general advice is to avoid AJAX or Flash on your website, though some say that Googlebot does know how to read JavaScript links.
Now I believe it's possible to manipulate Googlebot somehow, and by manipulate I DON'T mean anything illegal or black hat; I just want to Ajaxify my website.
My question is in two parts:
Likewise, as far as I know, the only directives you can add for a robots.txt file are noindex and nofollow.
You can't manipulate search engine bots to do things they don't normally do. You either work within their capabilities or you don't. Although search engines are getting better at handling JavaScript, as a general rule dynamic content is not something they're going to handle well, or at all, in most circumstances.
As far as getting search engines to read dynamic content created by JavaScript, you have two options:
1. Build the site the right way from the beginning and use progressive enhancement. Your site should work without JavaScript enabled; in fact, it should be built that way first. Then you can go back and add JavaScript that enhances the experience for users who have JavaScript enabled. That way your content is accessible to everybody (see the first sketch below).
2. Use Google's Crawlable AJAX standard. This allows Google to crawl content generated via AJAX. Keep in mind this only works for Google and leaves out other search engines and users without JavaScript enabled, so it is a bad idea (see the second sketch below).
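To illustrate the first option, here is a minimal progressive-enhancement sketch. The URL /products.html and the element IDs are hypothetical, not from the original answer. The anchor is a normal crawlable link when JavaScript is off; the script upgrades it to an AJAX load when JavaScript is on:

```html
<!-- Plain link that crawlers and no-JS users can follow normally -->
<a id="products-link" href="/products.html">Products</a>
<div id="content"></div>

<script>
  var link = document.getElementById('products-link');
  link.addEventListener('click', function (event) {
    // This handler only runs when JavaScript is available,
    // so the default link behaviour remains the fallback.
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.href, true);
    xhr.onload = function () {
      if (xhr.status === 200) {
        // Swap in the fetched content instead of doing a full page load.
        document.getElementById('content').innerHTML = xhr.responseText;
      }
    };
    xhr.send();
  });
</script>
```

The same content is reachable at the same URL either way, which is the whole point: the AJAX layer is an enhancement, not a requirement.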
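And here is a rough sketch of the second option, assuming a hypothetical URL http://example.com/catalog. Under Google's Crawlable AJAX scheme, Googlebot rewrites hash-bang URLs into an _escaped_fragment_ query parameter and expects your server to return a static HTML snapshot of what the JavaScript would have rendered:

```html
<!-- Hash-bang URL exposed to users (hypothetical example):
       http://example.com/catalog#!page=2
     What Googlebot requests instead, expecting an HTML snapshot:
       http://example.com/catalog?_escaped_fragment_=page=2
     A page with no hash fragment opts into the scheme with this tag: -->
<meta name="fragment" content="!">
```

Serving those snapshots is extra server-side work that only benefits Google, which is part of why progressive enhancement is usually the better route.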