7

As a programmer, and taking into account the overall "funness" of the process, I'm tempted to start a project in Sinatra where the back end's sole concern is the logic and returning a JSON API, and then to write a JavaScript application that interacts with that API to render the actual content to the user.
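Roughly, the kind of back end I'm imagining would look something like this (the route, names, and data are all made up; the logic is kept framework-free here, with the Sinatra wiring shown only in a comment):

```ruby
require 'json'

# Hypothetical data source; in the real app this would come from a database.
POSTS = [
  { id: 1, title: 'Hello Sinatra' },
  { id: 2, title: 'JSON all the things' }
]

# The back end's sole concern: return data as JSON.
# In Sinatra this would be wired up roughly as:
#   get('/api/posts') { content_type :json; posts_json }
def posts_json
  POSTS.to_json
end
```

The JavaScript application would then fetch `/api/posts` and render the list itself.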

I'm fairly new to programming and I've never done anything remotely like this before. What would be the pitfalls, advantages and disadvantages of completely separating logic from presentation in such a way? Any examples of this being done in the wild?

One concern off the top of my head is how search engines would respond to a site whose content is almost entirely served as JSON...

o_o_o--
  • 173
  • 5
  • Regarding your last paragraph: they wouldn't like it. But they also wouldn't use javascript, so you could serve them different pages. – Jer Jun 17 '13 at 22:33
  • http://programmers.stackexchange.com/questions/144717/advantages-and-disadvantages-of-building-a-single-page-web-application – psr Jun 18 '13 at 00:09
  • 1
    Here is how Google recommend you make an AJAX served site crawlable - https://developers.google.com/webmasters/ajax-crawling/ – matt freake Jun 18 '13 at 08:50

4 Answers

5

Most of the web apps I've written for my firm work this way: a single-page, JavaScript-driven application making Ajax calls to a back-end service.

So when I started my own website, my first implementation was done the same way. However, more than halfway through the development, I realized I had made a big mistake.

My website offers content to my users, and this content needs to be reachable via search engines. I also need to monetize my website via ads. The problem is that both of these requirements depend on static content being available from the website.

Static content is required to index keywords for search and to display relevant ads. These keywords are taken from the text found in the underlying HTML. If your content is fetched from a remote source and displayed via JavaScript, the underlying HTML will be mostly blank.

Because they provide a rich client experience, dynamic pages are ideal for web-based applications. However, if you intend to deliver content, it's best to avoid dynamic pages.

jramoyo
  • 610
  • 5
  • 13
3

By all means, it's worth trying out, but know that the learning curve can be a bit steep at times (JavaScript is an easy language to hate). JavaScript web apps have some really nice features; for example, you can port your app to various mobile devices as native apps by wrapping the application with PhoneGap.

There are a handful of options as far as frameworks go. I'd read over this article to get a high level overview of some of the more popular options: http://sporto.github.io/blog/2013/04/12/comparison-angular-backbone-can-ember/

As for search-engine crawlability, I think (though I'm not 100% sure on this) that you can make some of the different "pages" of your app crawlable using what's called a "router", though that may be specific to Backbone.js. You can definitely use the Backbone router to make the different "pages" bookmarkable and linkable, so it makes sense that they'd also be crawlable.
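Separately, the AJAX-crawling scheme linked in the comments on the question works by having the crawler re-request a `#!` URL with an `_escaped_fragment_` query parameter; the server then returns a static HTML snapshot of that application state. A rough, framework-free sketch of the server side (the snapshot content is invented):

```ruby
# If the crawler passed _escaped_fragment_, serve a pre-rendered HTML
# snapshot of that application state instead of the empty JS shell.
# `params` stands in for the request's query parameters, as Sinatra
# would supply them to a route handler.
def page_for(params)
  fragment = params['_escaped_fragment_']
  if fragment
    "<html><body><h1>Snapshot for state: #{fragment}</h1></body></html>"
  else
    # Normal users get the JavaScript application shell.
    "<html><body><div id='app'></div><script src='/app.js'></script></body></html>"
  end
end
```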

dave
  • 31
  • 3
2

Try it. I did a few projects (mostly with Angular and Web API), and it was a good experience overall.

Advantages: you get a very fluid user experience and can create single-page apps with ease; you have a clean separation between business logic and UI; and you can reuse the web service for different apps (e.g. to feed a mobile app; data is data, you can reuse it).

Disadvantages: you will probably end up with more calls to the server, unless you combine multiple calls into one chunk (e.g. pulling the data for 5 different dropdowns in a single call). It can also be somewhat slower to load: first you load an empty page with the JS, and then you make calls to the web services, which obviously takes longer than just serving a complete page. Generally not a big difference, though.
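As a sketch of that "one chunk" idea (the endpoint name and data are invented): one endpoint can return everything the page's dropdowns need in a single round trip, instead of one call per dropdown:

```ruby
require 'json'

# Invented lookup data; in practice these would be database queries.
LOOKUPS = {
  countries:  ['Sweden', 'Japan'],
  currencies: ['SEK', 'JPY'],
  languages:  ['sv', 'ja']
}

# One round trip instead of one call per dropdown.
# In Sinatra, roughly:
#   get('/api/lookups') { content_type :json; lookups_json }
def lookups_json
  LOOKUPS.to_json
end
```

The client fills all its dropdowns from the keys of the single response.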

SEO can be an issue, yes. I guess you could serve simple static pages to robots if you need your data to be indexed.
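A minimal sketch of how that check might look (the crawler list here is invented and far from complete; real deployments use longer lists or reverse-DNS verification):

```ruby
# Very rough crawler check based on User-Agent substrings.
CRAWLER_PATTERN = /googlebot|bingbot|slurp|baiduspider/i

def crawler?(user_agent)
  !!(user_agent =~ CRAWLER_PATTERN)
end

# In Sinatra, a route could then branch on it, e.g.:
#   get('/') { crawler?(request.user_agent) ? erb(:static_index) : erb(:app_shell) }
```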

Evgeni
  • 451
  • 2
  • 7
  • AngularJs: cool! I had heard about it, but I didn't know it was made precisely for this sort of thing. Awesome. -- And the rest of the problems all sound interesting to figure out how to solve... thanks. – o_o_o-- Jun 17 '13 at 23:45
1

The absolute best approach is to have a framework, CMS, or similar where you store and access the same content through both Ajax and full page loads. The same function should run whether the user lands on the site's front page directly or navigates to it through JavaScript.
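A minimal sketch of that idea (the layout and content strings are invented): one rendering function, where Ajax requests get just the content fragment and normal navigation gets the same fragment wrapped in the full layout:

```ruby
# Render the same content for both full page loads and Ajax navigation.
# `ajax` would come from e.g. the X-Requested-With header in a real app.
def render_content(title, body, ajax: false)
  fragment = "<article><h1>#{title}</h1><p>#{body}</p></article>"
  return fragment if ajax
  "<html><body><nav>site nav</nav>#{fragment}</body></html>"
end
```

Because both paths go through the same function, the content can never drift apart between the Ajax view and the crawlable full-page view.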

With clever use of .pushState() and overriding onclick on links through jQuery, you can have a site with correct links that usually loads new pages via Ajax. The fallback for browsers without pushState support is there as well: the user simply navigates to the page normally instead of jQuery and Ajax overriding the onclick behaviour.

This approach is also good for SEO, since crawlers can navigate your content freely as well. As an example, consider my own gallery: you can go to the previous image by clicking on the image's left side. This runs .pushState() and loads the content through Ajax. But you can also copy the "new" URL, open it directly, and it shows the same previous image. Search engines only ever see the correct links and don't have to worry about Ajax at all.

Robin Castlin
  • 1,199
  • 1
  • 8
  • 11