AngularJS is amazing. You can build apps with very little code, your code is easy to modularize and organize, and two-way data binding takes care of refreshing the views - or better said, you don't have to care about it anymore.
The catch is that AngularJS renders everything in the browser with JavaScript, which search-engine crawlers typically do not execute. Vicigo itself is written in AngularJS, and that would be a major problem: Vicigo is all about content, so we do want to be indexed by search engines and, ideally, appear at the top of the search results!
The solution is to detect who is requesting your page and, in the case of a crawler, handle the request accordingly.
So the first thing to do is to check whether the incoming request comes from one of the crawlers. How do we do that? We look at the request header with the key "user-agent".
If you build your server in Node.js with the Express framework, you can do it as follows:
For the sake of simplicity, we just check whether the user-agent string contains words like facebook, google, spider, etc. Since we also want to test the server-rendered version of the page ourselves, we additionally check for a query parameter - one you would normally append to a page URL. For example, try appending ?crawler_view=true to the current page.
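A minimal sketch of that check - the `isCrawler` helper, the keyword list, and the middleware wiring are illustrative, not Vicigo's actual code:

```javascript
// Keywords that commonly appear in crawler user-agent strings.
var BOT_PATTERN = /facebook|google|bing|yahoo|spider|crawler|bot/i;

// Decide whether a request should get the server-rendered page:
// either the user-agent looks like a crawler, or the request carries
// the ?crawler_view=true parameter we use for testing.
function isCrawler(userAgent, query) {
  if (query && query.crawler_view === 'true') {
    return true;
  }
  return BOT_PATTERN.test(userAgent || '');
}

// In an Express app, this could be attached to every request:
// app.use(function (req, res, next) {
//   req.isCrawler = isCrawler(req.headers['user-agent'], req.query);
//   next();
// });
```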
If the conditions are met, we use EJS templates to render the page.
We dynamically set the meta tags depending on the content of the page:
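A sketch of how that rendering might look, assuming EJS is configured as the Express view engine; `buildMeta`, the post fields, and the template fragment are all illustrative:

```javascript
// Build the values that the EJS template interpolates into the <head>
// of the crawler-facing page. The field names are hypothetical.
function buildMeta(post) {
  return {
    title: post.title,
    description: post.text.slice(0, 160), // short excerpt for the meta description
    image: post.imageUrl
  };
}

// The corresponding EJS head fragment could look like this:
var headTemplate =
  '<title><%= title %></title>\n' +
  '<meta name="description" content="<%= description %>">\n' +
  '<meta property="og:title" content="<%= title %>">\n' +
  '<meta property="og:image" content="<%= image %>">';

// And the crawler branch of a route might render it:
// app.get('/post/:id', function (req, res) {
//   var post = loadPost(req.params.id); // hypothetical data access
//   if (req.isCrawler) {
//     return res.render('crawler/post', buildMeta(post));
//   }
//   res.sendFile(__dirname + '/public/index.html'); // the AngularJS shell
// });
```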
With this technique, we can enjoy all the benefits of AngularJS without suffering from its biggest flaw. You might think that maintaining two versions of a page simultaneously is a lot of work, but actually, it is not. Crawlers do not need a super fancy design; they simply do not care. It's enough to set the meta tags in the head of the document and fill the body with text and links to other pages on your website. All you need is a static page. On Vicigo, we don't even include any CSS. For example, this is what this post looks like to the crawlers: