3

I'm considering implementing slugs in my blog. My blog uses MongoDB. One of the side-effects of using MongoDB is that it uses relatively long hex string IDs.

Example

before: http://lastyearswishes.com/blog/view/5070f025d1f1a5760fdfafac

after: http://lastyearswishes.com/blog/view/5070f025d1f1a5760fdfafac/improvements-on-barelymvc

Of course, that's a relatively short title. I have some longer ones, but I intend to cap slug length at something reasonable.

At what point does a URL become so long that it hurts SEO instead of improves it? In this case, should I leave my URLs alone, or add slugs?
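For context, the kind of slug generation I have in mind is roughly this (a minimal sketch; the `max_words` and `max_length` limits are the tunable caps I'm asking about, and the values shown are just placeholders):

```python
import re

def slugify(title, max_words=5, max_length=60):
    # Lowercase the title and collapse runs of non-alphanumerics into hyphens
    slug = re.sub(r'[^a-z0-9]+', '-', title.lower()).strip('-')
    # Cap the number of words in the slug
    slug = '-'.join(slug.split('-')[:max_words])
    # Hard character cap as a safety net, trimming back to a word boundary
    if len(slug) > max_length:
        slug = slug[:max_length].rsplit('-', 1)[0]
    return slug

print(slugify("Improvements on BarelyMVC"))  # improvements-on-barelymvc
```

The resulting slug would then be appended after the MongoDB ObjectId, as in the "after" URL above.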

Earlz
  • 22,658
  • 7
  • 46
  • 60
  • 1
    Not sure if this is on topic. The usability/user friendliness aspects would be more appropriate for [User Experience](http://ux.stackexchange.com/), and SEO is more of black magic than science, not really a topic that has definitive answers. – yannis Oct 09 '12 at 03:00
  • 1
    [This question](http://webmasters.stackexchange.com/questions/24182/friendly-urls-is-there-a-max-length-for-search-engines) at [Pro Webmasters](http://webmasters.stackexchange.com) covers URL length and SEO. – John Conde Oct 10 '12 at 19:31

1 Answer

3

Matt Cutts, who fights spam at Google1, has commented on the number of words used in slugs:

Next question: what is excessive in the length of a keyword-rich URL? We have seen clients use keyword URLs that have 10 to 15 words strung together with hyphens; or blogs – we have seen them even longer there. A typical WordPress blog will use the title of the post as the post slug, unless you defined something different and you can just go on and on and on. Can you give any guidelines or recommendations in that regard?

Matt Cutts: Certainly. If you can make your title four or five words long, that is pretty natural. If you have got three, four or five words in your URL, that can be perfectly normal. As it gets a little longer, then it starts to look a little worse. Now, our algorithms typically will just weight those words less and just not give you as much credit.

The thing to be aware of is, ask yourself: “How does this look to a regular user?” – because if, at any time, somebody comes to your page or, maybe, a competitor does a search and finds 15 words all strung together like variants of the same word, then that does look like spam, and they often will send a spam report. Then somebody will go and check that out.

So, I would not make it a big habit of having tons and tons of words stuffed in there, because there are plenty of places on a page, where you can have relevant words and have them be helpful to users – and not have it come across as keyword stuffing.

Stephan Spencer: So, would something like 10 words be a bit much then?

Matt Cutts: It is a little abnormal. I know that when I hit something like that – even a blog post – with 10 words, I raise my eyebrows a little bit and, maybe, read with a little more skepticism. So, if just a regular savvy user has that sort of reaction, then you can imagine how that might look to some competitors and others.

URLs, and by extension slugs, are important for indexing. Using a few actual words in them that also appear in the page will help with indexing, but overdoing it may prove counterproductive. Keep in mind that you can, to a point, teach Google how to treat your URLs through their Webmaster Tools.

Anecdotally, I've noticed that the Google bot seems to really appreciate sitemaps; a solid sitemap is perhaps a better investment of your time than worrying (too much) about the slugs.
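A basic sitemap is cheap to generate from your post URLs. This is a minimal sketch using only the standard library; the `posts` list here is hypothetical and would come from your MongoDB collection:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # Root <urlset> element with the sitemaps.org protocol namespace
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical post URLs for illustration
posts = [
    "http://lastyearswishes.com/blog/view/5070f025d1f1a5760fdfafac/improvements-on-barelymvc",
]
print(build_sitemap(posts))
```

Serve the output at `/sitemap.xml` and submit it through Webmaster Tools so the crawler picks up new posts promptly.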


1 ...awesome job title: Head of webspam!

yannis
  • 39,547
  • 40
  • 183
  • 216
  • Hmmm.. with this in mind it makes me wonder if I should limit either number of words or raw number of characters in the slug – Earlz Oct 09 '12 at 03:24
  • @Earlz I have limited experience with SEO and coincidentally fired every SEO expert I've had the "pleasure" of working with to date, even if I wasn't the one who hired them. My advice would be to design your URLs to be as human friendly as possible; search engine algorithms, especially Google's, are tuned every now and then to better judge what a human would think of as friendly and useful (sometimes, though, it seems that their version of a human is a brain-damaged one, but...). Anyway, I'd probably remove the hash from the URL, it looks ugly and the raw number of characters _might_ be important. – yannis Oct 09 '12 at 03:37
  • Heh, having the hash is the easiest method, though I could turn to the familiar `/yearmonthday/time` type method... Either way, if I'm updating URLs, I might as well do them all at once. I hate implementing more redirects than I actually need. – Earlz Oct 09 '12 at 03:47
  • 2
    My guess is that most titles converted to 3-6 word URLs will be unique. In the rare cases where it isn't you can always force it with a differentiator (...-2, ...-3). The hash is for you and your DB, not for any (human) user. – Peter Rowell Oct 09 '12 at 05:18