As the vast majority of our SEO clients’ websites are built on Magento, I’ve faced a huge number of technical issues with the platform, most of which come down to duplicate content.
Here are five of the issues I’ve faced, along with ways to resolve them and prevent them from recurring:
Dynamic filter pages are one of the most common technical issues that people face with Magento, and site owners often find that thousands of them have been indexed by search engines.
Example of a dynamic filter page:
These pages are used to filter product results on category pages.
Resolving the issue:
Having tried and tested a number of alternative fixes, we’ve found that using meta robots rules is by far the most effective way to eliminate this issue. This is the process we follow to get the pages removed from the Google index:
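As a sketch of the kind of meta robots rule described above, a noindex, follow tag can be output in the head of each filtered URL (how the tag is generated, and the exact directive you choose, will depend on your theme or module):

```html
<!-- Output in the <head> of each filtered URL. "noindex" keeps the page
     out of the index; "follow" still lets crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```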
Once you’ve submitted a few removal requests, you should start to see Google take note and drop the remaining pages more quickly.
You can use our SEOpack Magento module to apply the meta robots rules automatically. This module also has several other useful features. Find out more here.
Search pages finding their way into Google’s index is a very common issue for Magento users—the vast majority of Magento websites that I’ve looked at have this issue.
Resolving this issue:
Resolving the issue is actually really simple: you just need to disallow the directory in the robots.txt file. The URL for search pages is usually ‘/catalogsearch/result/?q=test’, so if you disallow the catalogsearch directory and then submit a removal request in Google Webmaster Tools, the issue should be fixed within around 8 hours.
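Before submitting the removal request, it's worth sanity-checking that the rule blocks search pages without blocking anything else. A minimal sketch using Python's standard-library robots.txt parser (the domain and URLs here are placeholders):

```python
# Sketch: check which URLs a "Disallow: /catalogsearch/" rule blocks,
# using Python's built-in robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /catalogsearch/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages are blocked from crawling...
print(rp.can_fetch("*", "https://example.com/catalogsearch/result/?q=test"))  # False
# ...while ordinary category pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/shoes.html"))  # True
```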
Review pages are interesting, because the issues vary depending on how you structure your website and the plugin you’re using. We faced a duplicate content issue because we were displaying review content on product pages while also having a separate page for the same content, though I know this is not always the case.
So, for the scenario that we faced, we removed the extra pages (one for each product), which looked like this:
In order to remove these pages, we simply disallowed the /review/ directory in the robots.txt file and then submitted a removal request for the folder.
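A minimal sketch of the robots.txt rule described above (your review URL structure may differ depending on the plugin you use):

```text
User-agent: *
Disallow: /review/
```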
Once again, review pages will only cause duplicate content issues if you’ve structured your website the way we did, with a dedicated page for each product’s reviews in addition to the reviews shown on the product pages themselves.
The discussion of whether paginated versions of pages should be accessible to search engines has been going on for a number of years, but last year Google announced support for rel=”next” and rel=”prev” (along with the view-all option), allowing site owners to signal to search engines that these pages are in fact pagination.
So, my answer to eliminating the duplicate content issues caused by pagination is to implement these tags, which can also be simplified by using our SEOpack module.
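As a sketch, here is what the tags look like in the head of a middle page of a paginated series; the category URL is illustrative, and ?p= is Magento's default page-number parameter:

```html
<!-- On page 2 of a paginated category: point back to page 1 and on to page 3 -->
<link rel="prev" href="https://example.com/shoes.html?p=1" />
<link rel="next" href="https://example.com/shoes.html?p=3" />
```

The first page of the series carries only a rel=”next” tag, and the last page only a rel=”prev” tag.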
In previous versions, Magento employed non-search-friendly URLs by default, and they often seem to pop up in newer versions, too. URLs like the example below are commonly indexed by search engines and cause duplicate content issues.
These pages are unlikely to rank for anything or appear whilst navigating through the website; they’re caused by issues with rewrite rules.
I would recommend disallowing these pages in your robots.txt file and then doing a removal request for the folder in Google Webmaster Tools.
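Assuming the non-rewritten URLs follow Magento's default /catalog/product/view/id/ pattern (check the URLs actually indexed for your own site first), the robots.txt rules would look something like this:

```text
User-agent: *
Disallow: /catalog/product/view/
Disallow: /catalog/category/view/
```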
Article Source: http://www.searchenginejournal.com/