5 Tips For Dealing With Duplicate Content

With the recent Panda 3.9 refresh (not to mention a series of other algorithm updates that have swept through the web), the term “duplicate content” has been cropping up more and more frequently. For those unsure of exactly what it means, the phrase refers to text that appears online at more than one URL.

Whilst this may sound harmless enough, duplicate content actually poses a serious threat to your site. When search engines are presented with more than one identical piece of content, they are forced to decide which is the original, and that guesswork more often than not leads to a dip in rankings and traffic.

There are a number of reasons your site may contain duplicate content, from the website copy itself to analytics code and URL parameters. The duplication may not have been intentional, but with such severe consequences it is an issue that needs resolving as soon as possible. There are several ways to tackle the problem:
  • Perform a 301 redirect. This is the most suitable technique in many situations: it redirects the page containing duplicate content back to the original page, telling both human readers and search bots that the page has permanently moved.
  • Return a 404 error. Probably the quickest way to deal with the problem is to remove the duplicate page altogether and let it return a 404 error. This is fine if the content has no real value and the page attracts no significant inbound links or traffic; in many other situations, however, this course of action would not be advised.
  • Revise your content. First of all, think about who writes your content. If you use an SEO company or outsource your copy in any way, be sure to check and approve the work before it is uploaded. If you write the copy yourself, there are ways to minimise the risk of duplication. For example, if you run an online retail site selling men’s and women’s clothes of the same brand, combine those near-identical pages into one rather than producing two sets of similar text; just be sure the combined page still has enough content to support your SEO campaign.
  • Use the “meta robots” tag. Placed in a page’s HTML, this tag tells search engines not to index that page, and it is often considered a better approach than blocking the page in robots.txt.
  • Finally, as a last resort, Google Webmaster Tools allows individual pages to be removed from Google’s index. It is worth noting, however, that because removal is at Google’s discretion, this option is best avoided if one of the alternatives above will do.
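
For sites running on an Apache server, the 301 redirect described above can be set up with a single line in the site’s .htaccess file. The paths below are placeholders; substitute your own URLs:

```
# Permanently (301) redirect the duplicate page to the original
# (hypothetical paths; requires Apache with mod_alias enabled)
Redirect 301 /duplicate-page/ http://www.example.com/original-page/
```

Other servers use different syntax (nginx, for instance, handles this with a `return 301` directive in its configuration), but the effect is the same: visitors and search bots alike are sent straight to the original page.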
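
If you simply delete a duplicate page, most servers will return a 404 automatically. On Apache (2.4 or later) you can also return the status explicitly; again, the path below is a placeholder:

```
# Explicitly return 404 for the removed duplicate page
# (hypothetical path; Apache 2.4+ with mod_alias)
Redirect 404 /duplicate-page/
```

Where the page is gone for good, `Redirect gone /duplicate-page/` returns a 410 instead, which signals permanent removal rather than a page that merely cannot be found.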
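
The “meta robots” tag mentioned above goes in the `<head>` section of the page you want kept out of the index. A typical form looks like this:

```
<!-- Tell search engine bots not to index this page,
     but still follow the links it contains -->
<meta name="robots" content="noindex, follow">
```

Note that, unlike blocking a page in robots.txt, this approach relies on bots still being able to crawl the page: they must fetch it in order to see the tag.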

SEO Positive are a leading SEO company specialising in organic and paid search, as well as copywriting services, reputation management and web design.
