Bad for SEO?
GHengeveld
I don't think there's ever been a discussion about this before. I see the problem you're having, and you're right that we should do something about it. First, though, we need to clarify what the problem actually is.
The problem doesn't lie with the URLs themselves. The content behind the different URLs is different: for example, the forums URL with q=1234 references a forum topic, while id=1234 references a single post.

The page-listing URLs for filters/sorting are a somewhat special case. The content is duplicated; only the order differs. In my opinion that still counts as duplicate content, so it shouldn't be indexed by Google.

The solution is to not let Google index the duplicate pages, which can be achieved in a couple of ways. I think the easiest is to use rel="nofollow" on the URLs that shouldn't be indexed (such as the filter/sorting URLs); see the sketch below. Another solution is to use JavaScript for filtering and sorting content, but then you should consider the chance that people don't have JavaScript enabled. A third option could be robots.txt, but that's something I haven't really looked into. This article may be interesting (it outlines the problem foxhound is describing).
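To illustrate the rel="nofollow" approach, here's a minimal sketch. The sort parameter and link labels are made up for the example, not taken from the actual site:

```html
<!-- Hypothetical sort links on a listing page.
     rel="nofollow" hints to crawlers that these links
     shouldn't be followed when crawling the page. -->
<a href="/forums?sort=date" rel="nofollow">Sort by date</a>
<a href="/forums?sort=votes" rel="nofollow">Sort by votes</a>
```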
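And if robots.txt turns out to be the way to go, it could look something like this (again, the parameter names are hypothetical; note that the * wildcard in Disallow patterns is a Googlebot extension, not part of the original robots.txt standard):

```
User-agent: *
# Block crawling of filter/sort variants of listing pages
Disallow: /*?sort=
Disallow: /*?filter=
```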