Bustle actually uses PhantomJS to create static versions of the site, for serving to search spiders.
The advantage of using Ember.js for Bustle is that it's really, really, ridiculously fast. Seriously, try it. Go to bustle.com and click around.
They could make it work without JavaScript, but they're a new company with a long list of technical challenges to solve. They ran the numbers and the percentage of users with JS disabled is so microscopic it just doesn't make sense to spend time on it.
"Bustle actually uses PhantomJS to create static versions of the site, for serving to search spiders."
What value does using PhantomJS offer above what progressive enhancement gives you for free?
This feels like a contradiction: avoiding progressive enhancement, then bolting on a hack for search spiders that is less robust than the very technique you're avoiding.
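To make the "less robust" point concrete, here's roughly what the bolted-on layer has to do before the snapshot is even generated: figure out which requests are spiders at all. A minimal sketch, assuming the usual `_escaped_fragment_` convention and a user-agent list (the function name and bot list are illustrative, not Bustle's actual code):

```javascript
// Hypothetical helper: decide whether a request should get the
// PhantomJS-rendered snapshot instead of the JS app shell.
// The bot list and names are assumptions for illustration.
var BOT_PATTERNS = [/googlebot/i, /bingbot/i, /baiduspider/i, /yandex/i];

function wantsSnapshot(userAgent, query) {
  // Google's AJAX crawling scheme signals itself with an
  // _escaped_fragment_ query parameter.
  if (query && '_escaped_fragment_' in query) return true;
  // Otherwise fall back to sniffing known spider user agents.
  return BOT_PATTERNS.some(function (re) { return re.test(userAgent || ''); });
}

console.log(wantsSnapshot('Mozilla/5.0 (compatible; Googlebot/2.1)', {})); // true
console.log(wantsSnapshot('Mozilla/5.0 (Macintosh) Safari/537.36', {}));   // false
console.log(wantsSnapshot('Mozilla/5.0', { _escaped_fragment_: '' }));     // true
```

Every branch here is a failure point (a spider you didn't list, a headless render that times out) that a progressively enhanced page simply doesn't have, because the HTML it serves everyone is already the static version.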
"The advantage of using Ember.js for Bustle is that it's really, really, ridiculously fast. Seriously, try it. Go to bustle.com and click around."
That's gonna hurt people following links to the page. That's exactly why Twitter walked away from its JavaScript-driven content and went back to progressive enhancement. A substantial portion of new visitors will have a cold cache, and will face an annoying wait for what is effectively a one-page article. (cf. http://statichtml.com/2011/google-ajax-libraries-caching.htm... )
"They could make it work without JavaScript, but they're a new company with a long list of technical challenges to solve. They ran the numbers and the percentage of users with JS disabled is so microscopic it just doesn't make sense to spend time on it."
Another contradiction. Progressive enhancement isn't a technical challenge; it's brain-dead simple.
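For a one-page article, "brain-dead simple" really can look like this: the server sends the full content as plain HTML, and a script tag layers the app behaviour on top. A minimal sketch (element names are illustrative, not Bustle's markup):

```html
<!-- The article is real HTML: spiders, curl, and no-JS browsers all get it. -->
<article>
  <h1>Article headline</h1>
  <p>Full article body, rendered on the server…</p>
  <a href="/next-article">Next article</a>
</article>

<!-- Enhancement is additive: if this script 404s or throws,
     everything above still works. -->
<script src="/assets/enhance.js" async></script>
```

If enhance.js fails to load, the reader still gets the article; in the JavaScript-only version they get a blank page, and the spiders need the PhantomJS bodge.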
So they figured out that the number of users with JavaScript disabled is too small to warrant supporting. Interesting, except progressive enhancement isn't solely about people with JavaScript disabled, right? cf: http://isolani.co.uk/blog/javascript/DisablingJavaScriptAski...
So they ran those numbers under the incorrect assumption that progressive enhancement only affects people with JavaScript disabled. But did they also run the numbers showing that the number of search-spider visits wasn't microscopic, to justify the PhantomJS bodge you initially mentioned? Are you really implicitly asserting that bustle.com gets far more visits from search spiders than from visitors with JavaScript disabled? (I'd love to understand the logic that led to that determination.)
So, what justification was there to spend time building a PhantomJS site scraper to provide static content to search-engine spiders, while failing to appreciate that progressive enhancement would have served that spider audience and the JavaScript audience alike, as well as alleviating the variety of other issues progressive enhancement addresses?
But if bustle.com were a content site, then progressive enhancement would be the way to go, right? But this is an Ember app, so it is not a content site (clearly). Except when search spiders visit, and then there's a need for a static version of each page. It's very confusing. Is bustle.com an app or a website?
Also, how does bustle.com / ember guarantee perfect delivery of assets other than HTML to a visitor's browser? How does it guarantee robustness?
How does bustle.com / ember protect its JavaScript when a third-party chunk of JavaScript (like Google Analytics, Disqus, Facebook, ChartBeat, Quantcast, WebTrends) does something funky, or hiccups and causes a JavaScript error?
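This matters because in a fully client-rendered app, one uncaught error from a vendor snippet can stop your own rendering cold. The usual defence is to wrap every third-party init so its failure is contained; a hedged sketch (`safeInit` and the vendor names are illustrative assumptions, not Bustle's code):

```javascript
// One way a page can keep its own code alive when a third-party
// snippet (analytics, comments, share widgets) throws.
function safeInit(name, initFn) {
  try {
    initFn();
    return true;
  } catch (err) {
    // Log and carry on; one broken vendor script must not take
    // down the whole client-rendered page.
    console.error('third-party "' + name + '" failed:', err.message);
    return false;
  }
}

var results = [
  safeInit('analytics', function () { /* imagine the GA boot call here */ }),
  safeInit('comments',  function () { throw new Error('CDN hiccup'); }),
  safeInit('charts',    function () { /* imagine the ChartBeat boot here */ })
];

console.log(results); // [ true, false, true ]
```

With progressive enhancement the stakes are lower: even if you skip this wrapping and a vendor script blows up, the server-rendered content is already on screen. When the content itself is rendered by JavaScript, this kind of defensive plumbing stops being optional.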