Duplicate content through product variants
-
Hi,
Before you shout at me for not searching - I did, and there are indeed lots of threads and articles on this problem, so I realise it is not exactly new or unique.
The situation: I am dealing with a website that has 1 to N variants of a product (N being between 1 and 6 so far). There is no dropdown for variants - that is not technically possible short of a complete redesign, which is not on the table right now. The product variants are not linked to each other, yet they share about 99% of their content (obvious problem here), and in the "search all" they show up individually. Each product variant is a different page, unconnected in the backend as well as the frontend. The system is quite limited in what can be added and entered - I may have some opportunity to influence smaller things, such as enabling canonicals.
In my opinion, the optimal choice would be to retain one page per product (the base variant) and then add dropdowns to select extras/other variants.
As that is not possible, I feel the best solution is to canonicalise all versions to one version (either the base variant or the best-selling variant?) and to offer customers a list on each product page giving them a direct path to the other variants of the product.
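To illustrate with made-up URLs (my real ones follow the same pattern), each variant page would then carry a canonical pointing at the chosen main variant, plus visible links to its siblings - something like:

<!-- hypothetical variant page: http://www.example.de/category/subcategory1/subcategory2/product_name-variant-b -->
<link rel="canonical" href="http://www.example.de/category/subcategory1/subcategory2/product_name-variant-a" />

<!-- visible list so customers (and crawlers) can still reach the other variants -->
<ul>
  <li><a href="http://www.example.de/category/subcategory1/subcategory2/product_name-variant-a">Variant A (main version)</a></li>
  <li><a href="http://www.example.de/category/subcategory1/subcategory2/product_name-variant-c">Variant C</a></li>
</ul>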
I'd be thankful for opinions, advice or completely new approaches I have not even thought of!
Kind Regards,
Nico
-
Hehehe yes we do usually!
-
Thanks for the hint!
Personally, I am a big fan of schema.org, and marking up all the products is already on my to-do list.
-
Hi Martijn,
Thanks for your reply. I'll have to check with the responsible developer - but I fear that this option is not on the table. Then again, it has been hinted that a complete redesign might eventually be. As I said below: nobody who does SEO seems to have been around when the site was created. And we all know what happens in such a case, don't we?
-
Hi Matt,
If it were only that easy... I have since learnt that, way back when the client had the website developed, he specifically asked NOT to have an ecommerce website. (Neither I nor anybody advising on SEO was around back then, AFAIK.)
The products are not connected. They are literally independently created pages using the same template. The URLs are not parameter-based, but look like:
http://www.example.de/category/subcategory1/subcategory2/product_name-further_description_1
http://www.example.de/category/subcategory1/subcategory2/product_name-further_description_2
So they are identical apart from the last bit, which is NOT a parameter - it might be "750-kg" or "Alu" or "with-brakes". Thanks for the advice; I agree that it is generally a good starting point, but sadly it is not possible in this case.
-
Just implemented something similar to this and used canonicals. Also, if you're able to add more than just canonicals, it is possibly worth looking at microdata. We used schema.org's isVariantOf for color and size variants. I'm not sure how much this influences Google's understanding or search display, but it's widely recommended and seems unlikely to hurt. Implementing it took a little trial and error; this helped, as did Google's structured data testing tool.
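As a rough sketch (simplified, with placeholder product names and URLs rather than our actual markup), the variant-to-base relationship in microdata can look like this:

<!-- variant page: marked up as a ProductModel that points at its base model -->
<div itemscope itemtype="http://schema.org/ProductModel">
  <span itemprop="name">Example Widget - Green, Size M</span>
  <link itemprop="url" href="http://www.example.com/products/example-widget-green-m" />
  <div itemprop="isVariantOf" itemscope itemtype="http://schema.org/ProductModel">
    <span itemprop="name">Example Widget</span>
    <link itemprop="url" href="http://www.example.com/products/example-widget" />
  </div>
</div>

The testing tool is handy for catching nesting mistakes in this kind of structure.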
-
What do the duplicate content URLs look like? In a lot of ecommerce systems you end up with parameter-based URLs such as:
http://www.example.com/products/women/dresses/green.htm
http://www.example.com/products/women?category=dresses&color=green
According to Google: "When Google detects duplicate content, such as the pages in the example above, a Google algorithm groups the duplicate URLs into one cluster and selects what the algorithm thinks is the best URL to represent the cluster (and) tries to consolidate what we know about the URLs in the cluster, such as link popularity, to the one representative URL. However, when Google can't find all the URLs in a cluster or is unable to select the representative URL that you prefer, you can use the URL Parameters tool to give Google information about how to handle URLs containing specific parameters." (see more at Google Support)
If your URLs are parameter-based, I would suggest handling them at that level in Search Console as well, or in robots.txt as a last resort. Either way, I'd start with canonicals and parameter handling if possible.
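Purely as an illustration using the example URLs above, the parameter version could declare the clean version as its canonical:

<!-- placed in the <head> of http://www.example.com/products/women?category=dresses&color=green -->
<link rel="canonical" href="http://www.example.com/products/women/dresses/green.htm" />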
-
Hi Nico,
As you said, it's far from perfect, but I would indeed go with using a canonical on the pages that have duplicate variants. If you're doing that already, then it might not be that much more effort to also link the variants together in the back-end of your site, so you can do more advanced things.