How to prevent duplication of content on a page with too many filters?

Posted by Vikas Gulati on Pro Webmasters. Published on 2013-09-18.


I have a webpage where a user can search for items based on around 6 filters. Currently the page is implemented with one filter as the base filter (part of the URL that gets indexed) and the other filters in the form of hash URLs (which won't get indexed). This way there is less duplication.

Something like this

example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value

As you can see, only one filter is within reach of the search engine while the rest are hashed. This way I could have 6 pages.
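The scheme above can be sketched as follows. This is a minimal illustration of the described URL pattern, not the author's actual code; the helper name and the alphabetical ordering of the fragment filters are my own assumptions.

```python
# Sketch of the URL scheme from the post: one base filter in the
# indexable path, remaining filters in the (non-indexed) hash fragment.
# build_url and the sorted fragment order are illustrative assumptions.

def build_url(base_value, extra_filters):
    """base_value: value of the base filter (indexed path segment);
    extra_filters: dict mapping remaining filter names to values."""
    path = f"example.com/{base_value}-items"
    if not extra_filters:
        return path
    # Fragment mirrors the post's pattern: by-<name>-<value>-<name>-<value>...
    # Sorting gives each filter combination a single, stable fragment.
    parts = [f"{name}-{value}" for name, value in sorted(extra_filters.items())]
    return path + "#by-" + "-".join(parts)

print(build_url("filter1value",
                {"filter3": "filter3value", "filter2": "filter2value"}))
# → example.com/filter1value-items#by-filter2-filter2value-filter3-filter3value
```

Since crawlers ignore everything after `#`, only the path portion is ever indexed.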

Now the problem is that I expect users to combine two filters at times while searching. According to my analysis with the Google Keyword Analyzer, a fair number of users might use two filters in conjunction when searching.

So how should I go about it?

Having all the filters as part of the URL would simply explode the number of pages, while sticking to the current approach wouldn't let me target those users.
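A quick back-of-the-envelope calculation shows why the page count explodes. The post doesn't state how many values each filter has, so 10 values per filter is an assumed figure purely for illustration:

```python
# Rough page counts when k of the 6 filters live in the indexable URL.
# values_per_filter = 10 is an assumption; the post gives no value counts.
from math import comb

values_per_filter = 10
n_filters = 6

for k in (1, 2, 6):
    # Choose which k filters appear, times the value combinations for them.
    pages = comb(n_filters, k) * values_per_filter ** k
    print(f"{k} indexable filter(s): {pages} pages")
# → 1 indexable filter(s): 60 pages
# → 2 indexable filter(s): 1500 pages
# → 6 indexable filter(s): 1000000 pages
```

Even at two indexed filters the count grows by a factor of 25 here; at six it is a million pages, most of them thin or near-duplicate.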

I was thinking of going with at most 2 base filters and the rest as part of the hash URL. But the only thing stopping me is that it could lead to duplication of content, as per Google Webmaster Tools' suggestions on URL structure.
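One common mitigation for the two-base-filter case, sketched below under my own assumptions (the filter names and fixed ordering are illustrative): the same filter pair can appear in either order (filter2+filter3 vs. filter3+filter2), which search engines treat as duplicate pages, so pick a single canonical order and 301-redirect (or `rel=canonical`) every other ordering to it.

```python
# Collapse every ordering of a filter pair onto one canonical URL path.
# FILTER_ORDER and the path format are illustrative assumptions,
# not from the original post.

FILTER_ORDER = ["filter1", "filter2", "filter3",
                "filter4", "filter5", "filter6"]

def canonical_path(filters):
    """filters: dict of up to two base filter name -> value pairs.
    Returns one stable path regardless of the order filters were given."""
    ordered = sorted(filters.items(),
                     key=lambda kv: FILTER_ORDER.index(kv[0]))
    return "/".join(f"{name}-{value}" for name, value in ordered) + "-items"

# Both orderings collapse to a single indexable URL:
print(canonical_path({"filter3": "b", "filter2": "a"}))  # filter2-a/filter3-b-items
print(canonical_path({"filter2": "a", "filter3": "b"}))  # filter2-a/filter3-b-items
```

With a fixed ordering plus a `<link rel="canonical">` tag pointing at the canonical form, the duplication Google Webmaster Tools warns about is largely avoided even with two indexed filters.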

