I'm building a SPA (single-page application), so when a browser requests a page from my server, it only receives a small HTML shell and a big JavaScript app that then requests the appropriate data from the server, renders the HTML locally, and generally drives the app from there. Think of apps such as Gmail or Google Maps that never reload the page.
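For context, the client-side flow looks roughly like this (just a minimal sketch; the `/api/page` endpoint and the `app` element are placeholders, not my actual code):

```javascript
// Minimal sketch of the client-side flow; the endpoint and element id are placeholders.
// The server only returns an HTML shell with an empty <div id="app"> and this script;
// everything the user sees is rendered here, in the browser.
document.addEventListener('DOMContentLoaded', async () => {
  const response = await fetch('/api/page?path=' + encodeURIComponent(location.pathname));
  const data = await response.json();

  // A real app would use a view/templating library instead of raw innerHTML.
  document.getElementById('app').innerHTML =
    '<h1>' + data.title + '</h1><div>' + data.body + '</div>';
});
```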
This makes apps feel very snappy, but it means that if the user agent doesn't execute JavaScript, there's no content at all. That's a problem when the site is being indexed by search engines, or when another service needs some of the content (think of posting links to Facebook, Twitter, or LinkedIn, where a snippet of the page is shown).
To work around that problem, I'm pre-rendering pages by running the JavaScript part on the server. It works, but it is rather slow. Since most of the time this server-side execution of JavaScript isn't needed, I'm thinking of whitelisting certain browsers based on their user agent, like Chrome, Safari, Firefox, and even recent versions of IE, so they don't get the pre-rendered version.
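Roughly, the idea is something like this (a sketch using Express; the regex list and `renderOnServer()` are assumptions standing in for my real setup and pre-renderer):

```javascript
// Sketch: serve the plain SPA shell to whitelisted browsers, pre-render for everyone else.
const express = require('express');
const app = express();

// User-agent patterns for browsers known to execute JavaScript.
const browserWhitelist = [
  /Chrome\//, /Safari\//, /Firefox\//,
  /MSIE (9|10)\./, /Trident\/.*rv:11/ // recent versions of IE
];

// Placeholder for my existing (slow) pre-rendering step,
// e.g. running the app in a headless browser on the server.
async function renderOnServer(url) {
  return '<!doctype html><html><body>pre-rendered content for ' + url + '</body></html>';
}

app.get('*', async (req, res) => {
  const ua = req.get('User-Agent') || '';
  const isWhitelistedBrowser = browserWhitelist.some((re) => re.test(ua));

  if (isWhitelistedBrowser) {
    // A real browser: send the small HTML shell and let it run the JavaScript app.
    res.sendFile(__dirname + '/index.html');
  } else {
    // Unknown agent (probably a bot): do the expensive server-side render.
    res.send(await renderOnServer(req.originalUrl));
  }
});

app.listen(3000);
```

The question is whether that whitelist check is safe, i.e. whether the agents that actually matter fall through to the pre-rendered branch.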
Would this work, or do most useful bots identify themselves as browsers? How can I gather this information? Is there any source listing the user agents of good and bad bots?