We have a completely AJAX-based website: http://news.swalif.com/. We want all of its 'pages' crawled by Google and other search engines, but we do not want to move to a PHP-based solution or build a separate archive. I would appreciate any ideas on how this could be done.

– Imran Omar Bukhsh

3 Answers


The basic idea is to think about it from this angle: how will someone with JavaScript disabled use my site?

I can't tell exactly what your site is doing since I don't speak Arabic. :-) But basically, links that trigger Ajax updates should be implemented as normal links that go to a URL displaying the desired content, and then overridden with a JavaScript onclick handler that performs the Ajax update instead.

So, e.g., a category link might be:

<a href="/category/stuff" onclick="DoCoolAjax('stuff'); return false;">Stuff</a>

The /category/stuff URL shows the page with that category displayed. But for users with JavaScript, the onclick handler instead makes an Ajax request for that category listing, displays it in the content area, and cancels the normal navigation by returning false.
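Here is a minimal sketch of the same pattern with the handlers attached unobtrusively instead of via inline onclick attributes; the ajax-link class, the content element id, and the assumption that the server returns a bare fragment for Ajax requests are all illustrative, not part of the original answer:

document.addEventListener('DOMContentLoaded', function () {
  // Find every link we want to enhance; without JavaScript they still work.
  var links = document.querySelectorAll('a.ajax-link');
  Array.prototype.forEach.call(links, function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // cancel the full-page navigation
      var xhr = new XMLHttpRequest();
      // Request the same URL the link points to; the server is assumed
      // to return just the content fragment for Ajax requests.
      xhr.open('GET', link.getAttribute('href'));
      xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
      xhr.onload = function () {
        document.getElementById('content').innerHTML = xhr.responseText;
      };
      xhr.send();
    });
  });
});

Because every link still carries a real href, crawlers and no-JavaScript users follow it as a normal page load.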

So as far as Google - or a user without JavaScript - is concerned, it's a perfectly normal non-Ajax, full-page-reload site. But the majority of users see your glorious smooth Ajax updates.

Anyway, those are some ideas to think about.

– Carson63000
  • Strange coincidence, as I don't speak Arabic either. The website is a news site: basically it just fetches different data from the database based on what you select, e.g. subject, timeline, etc. – Imran Omar Bukhsh Mar 15 '11 at 11:01
  • Thanks for your reply. How sure are you that Google will crawl it fine? – Imran Omar Bukhsh Mar 15 '11 at 11:02
  • 1
    @Imran get Firefox and the NoScript plugin. Enable it, then try to use your website. If you can use it and get to everything, you are safe in this respect at least. – Matthew Scharley Mar 15 '11 at 11:15
  • I use the Web Developer toolbar addon for Firefox: amongst many features it allows you to disable JavaScript, CSS styles, images... once that's done, you're basically seeing the website the way a search engine will see it. If the information is there and the navigation works, you can be confident that Google will crawl it OK. And then you can use Google's Webmaster Tools to verify that your site has indeed been indexed. – Carson63000 Mar 15 '11 at 23:05

Google has some advice on how to make these pages searchable:

Making AJAX Applications Crawlable

If you're running an AJAX application with content that you'd like to appear in search results, we have a new process that, when implemented, can help Google (and potentially other search engines) crawl and index your content. Historically, AJAX applications have been difficult for search engines to process because AJAX content is produced dynamically by the browser and thus not visible to crawlers. While there are existing methods for dealing with this problem, they involve regular manual maintenance to keep the content up-to-date...
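At the time, the process Google described was the 'hashbang' scheme: each Ajax state gets a #! URL, and the crawler re-requests it with an _escaped_fragment_ query parameter, expecting the server to answer with a plain-HTML snapshot of that state. A minimal server-side sketch, assuming a Node.js/Express app; the renderSnapshot helper is a made-up placeholder, not a real API:

var express = require('express'); // assumed dependency
var app = express();

// Hypothetical helper: render a static HTML snapshot of one Ajax state.
function renderSnapshot(fragment) {
  return '<html><body><h1>' + fragment + '</h1></body></html>';
}

// Googlebot rewrites http://example.com/#!category/stuff
// into http://example.com/?_escaped_fragment_=category/stuff
app.get('/', function (req, res) {
  var fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    res.send(renderSnapshot(fragment)); // crawler gets static HTML
  } else {
    res.sendFile(__dirname + '/index.html'); // browsers get the Ajax app
  }
});

app.listen(3000);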

– lfx

I'm not an SEO person by any means, but I did once work with an SEO professional on a site much like the one in your question.

It came down to two things: can the site be used without JavaScript, and does every page have an actual path? We used MVC on our site and had hard URLs for every page, even if the user never saw them because the content was loaded dynamically. We also had some buried pages, intended just for crawlers, that contained nothing but links, so that the pages they linked to would get indexed.
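A minimal sketch of that 'hard URL for every page' idea, again assuming a Node.js/Express-style app; the route, lookup function, and template helpers are made up for illustration:

var express = require('express'); // assumed dependency
var app = express();

// Hypothetical stand-ins for a real database lookup and templates.
function findArticles(subject) { return ['Article about ' + subject]; }
function renderList(articles) { return '<ul><li>' + articles.join('</li><li>') + '</li></ul>'; }
function renderFullPage(articles) { return '<html><body>' + renderList(articles) + '</body></html>'; }

// Every dynamic view gets a real, crawlable URL.
app.get('/subject/:name', function (req, res) {
  var articles = findArticles(req.params.name);
  if (req.xhr) {
    res.send(renderList(articles)); // our own Ajax script gets just the fragment
  } else {
    res.send(renderFullPage(articles)); // crawlers and no-JS browsers get the full page
  }
});

app.listen(3000);

The req.xhr check pairs naturally with the X-Requested-With header from the first answer's approach: one URL serves both the crawlable full page and the Ajax fragment.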


– Tyanna