Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Should I redirect or add content, to 47 Pages?
-
We have an insurance agency website with 47 pages that have duplicate/low content warnings. What's the best way to handle this?
Am I right in thinking I have two options? Either add new content or redirect the pages?
Thanks in advance

-
Whether you should redirect or add content to the 47 pages depends on the specific circumstances and goals for those pages.
Redirecting:
When to Redirect: If the 47 pages have low-quality content, are outdated, or are duplicating other content on your site, redirecting might be the best option. Redirecting these pages to more relevant, high-quality pages can help consolidate your site’s authority and improve user experience. Additionally, if any of these pages are receiving low traffic and you have no plans to update them, redirecting can prevent them from dragging down your overall site performance.
SEO Consideration: 301 redirects are ideal if the content is permanently moving. This allows you to preserve most of the SEO value from the old pages.
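At the HTTP level, a permanent move is just a 301 status code with a Location header pointing at the new page. Here's a minimal sketch using Python's standard library — the paths in the mapping are hypothetical, stand-ins for whatever thin pages you retire:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping: retired thin pages -> the stronger pages replacing them.
REDIRECTS = {
    "/old-auto-quotes": "/auto-insurance",
    "/duplicate-home-coverage": "/home-insurance",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 signals a permanent move, so search engines pass on link equity.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

# To serve locally:
# HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()
```

In practice you'd configure this in your web server or CMS rather than hand-rolling it, but the mechanics are the same: one old URL, one 301, one new destination.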
Adding Content:
When to Add Content: If the pages in question have potential but lack sufficient depth or relevance, enhancing them with additional content can be beneficial. By updating these pages with more comprehensive, valuable information, you can improve their ranking potential and better serve your audience’s needs.
SEO Consideration: Ensure that the new content is well-researched, relevant, and optimized for the target keywords. This approach helps maintain or even improve the rankings of these pages.
Recommendation:
Evaluate the current performance of each of the 47 pages. If a significant portion has strong existing backlinks or decent traffic, it may be worth investing in content updates. On the other hand, if the pages are weak and have little SEO value, redirecting could be a smarter strategy. It might even be a mix of both approaches, depending on what you find during your evaluation.
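That evaluation step can be sketched as a simple triage. Assuming you can export monthly visits and backlink counts per page from your analytics (the URLs, figures, and thresholds below are made-up placeholders, not real data):

```python
# Illustrative triage of the 47 flagged pages. The traffic and backlink
# figures are placeholders standing in for a real analytics export.
def triage(pages, min_visits=50, min_links=5):
    """Bucket each page: 'update' if it shows existing value, else 'redirect'."""
    plan = {"update": [], "redirect": []}
    for page in pages:
        if page["monthly_visits"] >= min_visits or page["backlinks"] >= min_links:
            plan["update"].append(page["url"])
        else:
            plan["redirect"].append(page["url"])
    return plan

pages = [
    {"url": "/life-insurance-faq", "monthly_visits": 120, "backlinks": 8},
    {"url": "/old-rate-sheet-2019", "monthly_visits": 2, "backlinks": 0},
]
print(triage(pages))
```

The thresholds are judgment calls, not magic numbers — the point is simply to make the update-vs-redirect decision explicit and repeatable across all 47 pages.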
-
@liamjordan193 thanks
-
@ww4686101 said in Should I redirect or add content, to 47 Pages?:
Hi Laurent,
You're absolutely right that you have two main options: adding new content or redirecting the pages. Here’s how you can decide which approach is best:
Add New Content: If the pages have potential value and could offer useful information to your audience, then updating them with fresh, high-quality content is the way to go. Focus on making each page unique and valuable to your users. This will not only address the duplicate/low content warnings but also improve your site's overall SEO.
Redirect: If the pages are redundant or don’t serve a specific purpose anymore, a 301 redirect to a more relevant page might be a better option. This helps consolidate your content, avoid potential penalties, and preserve any link equity those pages might have.
When to Choose Each Option:
Add Content if the pages cover topics that are still relevant, have potential for traffic, or could be expanded into something more comprehensive.
Redirect if the pages are outdated, nearly identical to other pages, or if the content isn’t worth expanding.
In some cases, a mix of both strategies might be ideal. You could add content to some pages and redirect others that are less useful.
Hope this helps!
Brilliant answer, thank you!
-
@laurentjb, The best solution is to consolidate duplicate content by merging similar pages and redirecting outdated or redundant pages to relevant ones using 301 redirects. For low-content pages, either expand the content to add value or combine them with related pages. This improves SEO and enhances the user experience.
-
@laurentjb said in Should I redirect or add content, to 47 Pages?:
We have an insurance agency website with 47 pages that have duplicate/low content warnings. What's the best way to handle this?
Am I right in thinking I have two options? Either add new content or redirect the pages?
Thanks in advance

You're correct that you have two main options: either add more valuable content to improve the quality of those pages or redirect them to more relevant, higher-quality pages. Adding new content is ideal if the pages have unique value, while redirects are better for pages with little to no potential for improvement. Both approaches help address duplicate/low content warnings and improve your site's SEO.
-
@laurentjb said in Should I redirect or add content, to 47 Pages?:
We have an insurance agency website with 47 pages that have duplicate/low content warnings. What's the best way to handle this?
Am I right in thinking I have two options? Either add new content or redirect the pages?

To handle duplicate/low content warnings on your insurance agency website:
Add New Content:
Expand and enhance content to make it unique and valuable.
Use targeted keywords and structured data to improve SEO.
Redirect Pages:
Use 301 redirects for pages with minimal value, consolidating content to stronger, related pages.
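On an Apache server, that consolidation is often expressed as one `Redirect 301` rule per retired page in `.htaccess`. A small sketch that generates those rules from a hypothetical URL map (the page paths are illustrative only):

```python
# Hypothetical consolidation map: each thin or duplicate page points at the
# stronger page its content is merged into.
redirect_map = {
    "/car-insurance-ajax": "/car-insurance",
    "/car-insurance-whitby": "/car-insurance",
}

def htaccess_rules(mapping):
    """Emit one Apache 'Redirect 301' line per consolidated page."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(htaccess_rules(redirect_map))
```

With 47 pages, generating the rules from a spreadsheet-style map like this is less error-prone than typing each redirect by hand, and it gives you a single file to review before deploying.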
-
@laurentjb The action required depends on the type of page triggering these warnings. If these are blog category/tag results pages, you can add unique identifiers to keep them distinct.
If they are unique pages, I would recommend adding content if you feel that this will provide extra value to users. If they are pages that get no traffic, do not rank for anything on Google/Bing, and do not provide value to users, you can deprecate them and 301 the links to the most relevant page without harming your website.
I hope that helps!