Duplicate Content on Event Pages
-
My client runs a pretty popular event-listings service and, in the hope of gathering more events, they opened the platform up so users can add events themselves. This works really well for them, and they're able to gather a lot more events this way. The major problem I'm finding is that many event coordinators and site owners take the copy from their own websites and paste it straight in, duplicating a lot of the content. We have editor picks that contain a lot of unique content, but the duplicate content scares me. It hasn't hurt our rankings (we have a PageRank of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we see a significant advantage over people posting the same event?
-
A penalty is something Google has to remove manually, and you'll be able to see it in Webmaster Tools. A devaluation is when the algorithm adjusts you and you drop as a result: each thing Google doesn't like counts as points against you, but you can quickly make changes and see your results return. Does that make sense?
-
We decided it was worth a large investment, as we would own the content ourselves and not have to worry in the future about anyone claiming ownership of it as Google gets stricter. So we rewrote half a million words!
-
Also, could you fully explain the difference between a devaluation and a penalty?
-
Do you mind if I ask how much of the content you rewrote? My main fear is the amount of work this would take, since a lot of content goes up on the site daily. When you rewrote your office space listings, did you keep the same amount of content, or did you rewrite them with less?
-
This is a Panda issue.
Google has said many times, about affiliate sites that use the same content, that if they do a better job than the original site it will rank them. So it's not all bad when you look at it from that point of view.
However, Google loves unique content and will do its best to rank the sites that have unique content first. I have a business in the office space industry, and a few years back we used to aggregate office space listings that were shared among 30+ sites. The listings were displayed differently for many searches, but the content was the same as on all the other sites. This slowly put us into a Panda devaluation (there is no Panda penalty).
After rewriting the listings with our clients, we saw a significant change once the content had been re-crawled.
So it can have a great effect. If Google starts to see that large parts of your site are duplicate content, it will start to question the authority you have in your industry.
Could you offer an incentive to your customers to write something unique? You could also inform your users not to copy and paste their own content onto your site, as this could affect them negatively in Google.
If you are an authority, could you tell users that listings must be unique to be accepted? Or, if it's a paid service, offer an add-on for a few bucks where you write a professional description for them? That might become a nice additional income stream.
Just a few ideas.
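If it helps to picture the "don't copy and paste" suggestion in practice, below is a minimal sketch of a submission-time duplicate check. It's illustrative only: the function and field names are hypothetical, and the 0.8 threshold is an assumption you'd want to tune, but it shows how an event platform could flag descriptions that closely match existing listings so editors know which copy to prioritise for rewriting.

```python
import difflib
import re


def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so cosmetic tweaks don't hide copy-pasting."""
    return re.sub(r"\s+", " ", text.lower()).strip()


def similarity(a: str, b: str) -> float:
    """Rough similarity ratio between two descriptions, from 0.0 to 1.0."""
    return difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()


def flag_duplicates(new_description: str, existing_descriptions: list[str],
                    threshold: float = 0.8) -> list[tuple[int, float]]:
    """Return (index, score) pairs for existing listings the new copy closely matches."""
    matches = []
    for i, existing in enumerate(existing_descriptions):
        score = similarity(new_description, existing)
        if score >= threshold:
            matches.append((i, score))
    return matches


if __name__ == "__main__":
    # Hypothetical data: one pasted-in description with only cosmetic edits.
    existing = [
        "Join us for the annual spring craft fair at the town hall, 10am to 4pm.",
        "A full day of talks on local history, with lunch provided.",
    ]
    new = "Join us for the annual Spring Craft Fair at the Town Hall, 10 am to 4 pm."
    for idx, score in flag_duplicates(new, existing):
        print(f"New description is {score:.0%} similar to existing listing #{idx}")
```

Comparing every new submission against every existing listing this way won't scale to tens of thousands of events; shingling or MinHash would be a better fit at that size. The point is simply that catching the copy-paste at submission time is far cheaper than rewriting it afterwards.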

Related Questions
-
Duplicate content in sidebar
Hi guys. So I have a few sentences (about 50 words) of duplicate content across all pages of my website (a repeated block of text in the sidebar). Each page of my website contains about 1,300 words of unique content in total, plus the 50 words of duplicate content in the sidebar. Does having duplicate content of this length in the sidebar affect the rankings of my website in any way? Thank you so much for your replies.
On-Page Optimization | AslanBarselinov
-
Thoughts on archiving content on an event site?
I have a few sites that are used exclusively to promote live events (e.g. trade shows, conferences). In most cases these sites contain fewer than 100 pages and include information for the upcoming event with links to register. Some time after the event has ended, we redesign the site and start promoting next year's event, essentially starting over with a new site on the same domain. We understand the value that many of these past event pages have for users who are looking for information from the past event, and we're looking for advice on how best to archive this content to preserve it for SEO. We tend to use concise URLs for pages on these sites, e.g. www.event.com/agenda or www.event.com/speakers. What are your thoughts on archiving the content from these pages so we can reuse the URL with content for the new event? My first thought is to put these pages into an archive, like www.event.com/2015/speakers. Is there a better way to do this and preserve the SEO value of this content?
On-Page Optimization | accessintel
-
How to Handle Duplicate Pages/Titles in WordPress
The WordPress blog causes problems with page titles. If you go to the second page of blog posts, there's a different URL but the same page title. For example: page 1 is site/blog and page 2 is site/blog/page/2. Each page gets flagged for duplicate page titles. Thanks in advance for your thoughts.
On-Page Optimization | heymarshall
-
Duplicate Content with ?Page ID's in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem that I have due to the page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem is I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not the issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
On-Page Optimization | SpaMedica
-
Duplicate Content for Men's and Women's Version of Site
So, we're a service where you can book different hairdressing services from a number of different salons (the site is being worked on). We're doing both a male and a female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative and have a bit of fun, and also to have dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female version taking precedence when there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, so does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!
On-Page Optimization | LeahHutcheon
-
Does schema.org assist with duplicate content concerns?
The issue of duplicate content has been well documented, and there are lots of articles suggesting we noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is whether noindexing archive (category/tag) pages is still relevant when considering duplicate content. These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances; I'm just interested in whether the search engines can handle this appropriately.
On-Page Optimization | MarkCA
-
Quick and easy Joomla 1.5 Duplicate content fix?
www.massduitrialalwyers.com has a TON of duplicate content based on the way Joomla 1.5 uses articles. Do you have a tried and true method to eliminate the issues (automated would be preferred)? If not, might you suggest a plugin that takes care of the rel canonical? Cheers
On-Page Optimization | Gaveltek-173238
-
Would it be bad to change the canonical URL to the most recent page that has duplicate content, or should we just 301 redirect to the new page?
Is it bad to change the canonical URL in the tag, meaning does it lose its stats? If we add a new page that may have duplicate content, but we want that page to be indexed over the older pages, should we just change the canonical page or redirect from the original canonical page? Thanks so much! -Amy
On-Page Optimization | MeghanPrudencio