How are search engines able to crawl questions posted on Stack Overflow, Quora, or other forums and show them in search results?
This might be similar to Facebook's user profile visibility in search engines.
Do these sites keep updating their sitemap periodically?
Use case: I'm building a local events website, where events posted dynamically by users should be visible to search engines to crawl and appear in search results.
Any good references for understanding how sitemaps apply to this kind of use case would really help.
Basic SEO.

A combination of pretty URLs (/questions/32728/slug), well-formed HTML with easy-to-parse headings, and canonical information allows Google and other search engines to crawl as many corners of the site as possible simply by following links.
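As a concrete illustration of the canonical part (the site name and URLs are hypothetical), a crawlable event page might carry something like this in its head:

```html
<!-- Hypothetical page served at a pretty URL like /events/123/summer-food-fair -->
<head>
  <title>Summer Food Fair | Local Events</title>
  <!-- rel="canonical" tells crawlers which URL is the authoritative one,
       so /events/123 and /events/123/summer-food-fair aren't indexed as duplicates -->
  <link rel="canonical" href="https://example-events.site/events/123/summer-food-fair">
  <meta name="description" content="A community food fair in the town square, June 1.">
</head>
```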
Google itself doesn't do anything in real time, and neither does any other search engine. Periodically Google sends out its crawl bot to collect new or updated information about websites. This is also where connecting your site to Google Webmaster Tools aids visibility and availability: if the site owner connects Webmaster Tools (from Google, in this case) to their site, they not only open a floodgate of reporting capabilities, but also give Googlebot a higher crawl priority. Inside Webmaster Tools are options and settings that help Googlebot understand where to find content, how to show the listings in results, and how to resolve the links in those listings.
By periodically, I mean every couple of days to a week, sometimes up to a month. If you post a question now, it could easily take a week to show up on Google. Google also raises the priority of results based on how well the content of the query matches the content of the page: the title and body content making sense together matters most, while meta keywords and description are now tertiary and lower priority. For example, you can't have a title that says "How to Make Good Food" and content about PHP configuration. Google will also prioritize results based on your search history if you're logged in, or by the search history of your IP address if you're not.
Sitemaps also really help on sites that are hard to crawl from the front page. Notice how Facebook requires you to log in first; Googlebot doesn't have an account, and it doesn't fill anything in to get anywhere. Sitemaps allow Googlebot to figure out where on the site to start crawling. Without one, Facebook would have only one visible result.
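For the events use case, one common approach is to regenerate the sitemap whenever events are added or updated, so the crawler can discover the new pages. A minimal sketch, assuming each event record has a `slug` and an `updated` date (both field names and the base URL are hypothetical, adapt to your own schema):

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(events, base_url="https://example-events.site"):
    """Build a sitemap XML string from a list of event dicts.

    Each event is assumed to have a 'slug' and an 'updated' date;
    both names are hypothetical placeholders for your own schema.
    """
    # The sitemaps.org namespace is required on the root element.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for event in events:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = f"{base_url}/events/{event['slug']}"
        ET.SubElement(url, "lastmod").text = event["updated"].isoformat()
        ET.SubElement(url, "changefreq").text = "daily"
    return ET.tostring(urlset, encoding="unicode")

events = [
    {"slug": "summer-food-fair", "updated": date(2024, 6, 1)},
    {"slug": "open-mic-night", "updated": date(2024, 6, 3)},
]
print(build_sitemap(events))
```

You would serve the result at /sitemap.xml and point crawlers at it (for example via a Sitemap: line in robots.txt), so freshly posted events get discovered without waiting for link-following alone.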
To check how a website is doing in terms of searchable pages, put "site:" in front of a web address in Google, and you will see all the results Google has for that site (with an estimated count if there are a great many of them). Compare:
site:www.google.com and site:google.com
The visibility and/or requirement of the www. prefix is also an important distinction.