How does removing zombie pages affect domain authority?
-
Hi.
Recently I got a project (removing zombie pages here: https://www.alamto.com/).
As you can see, this site has about 20k pages indexed on Google, and it seems I should remove about 6,000 useless indexed pages. Does removing (noindexing) these pages affect the site's metrics?
Which metrics would be affected, and how?
Thanks.
-
Hi there,
Your focus should be on removing pages which are not adding value to the website or to your users. For example, if you have pages which contain old and inaccurate content that is no longer useful, then you may want to consider either updating the page with better content or removing the page completely.
Adding a noindex tag may make Google drop a page from its index, but users may still be able to find the page on your website, so consider whether you want to remove the page completely (serving a 404) or redirect it to another relevant page with a 301.
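For illustration (this is an editorial sketch, not part of Paddy's answer), here is roughly how you might audit a list of candidate zombie URLs before deciding between noindex, 404, and 301. It assumes a plain-text file called zombie-urls.txt and the third-party requests and beautifulsoup4 packages; adapt it to your own setup.

```python
# Hypothetical audit script: for each candidate URL, report the HTTP status
# and whether a robots "noindex" directive is already present on the page.
import requests
from bs4 import BeautifulSoup

with open("zombie-urls.txt") as f:          # assumed: one URL per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots and "noindex" in robots.get("content", "").lower())
    print(f"{url}\tstatus={resp.status_code}\tnoindex={noindex}")
```

Pages that already return 404/410 or carry noindex can be set aside; the rest are the ones where you still have a decision to make.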
In terms of metrics, it really comes down to what you want your users to do when they visit your website. If you want them to spend time reading your content and then clicking through to other pages, then these are important metrics to measure. If your pages aren't achieving this, you either need to improve them or consider removing them if they aren't good quality.
I hope that helps!
Paddy
-
You could export the Webmaster analytics and make a summary of which pages tend to perform well and which don't. Based on that data, remove the "old" pages and watch the effect. While you're at it, perhaps throw some new content at the site. Doesn't hurt.
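As a rough illustration of that summary step (my sketch, not the answerer's), here is how you might flag zombie candidates from a Search Console (formerly Webmaster Tools) pages export with pandas. The file name and the "Top pages", "Clicks" and "Impressions" column names are assumptions based on a typical export, so rename them to match your file.

```python
# Hypothetical sketch: flag pages with little or no search traffic as zombie candidates.
import pandas as pd

df = pd.read_csv("search-console-pages.csv")   # assumed export file name
# Column names assumed from a typical Search Console "Pages" export.
zombies = df[(df["Clicks"] == 0) & (df["Impressions"] < 10)]

print(f"{len(zombies)} of {len(df)} pages look like zombie candidates")
print(zombies[["Top pages", "Impressions"]].head(20))
```

The thresholds are arbitrary; the point is to rank pages by how little they contribute before deciding what to cut.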
Related Questions
-
On-page SEO
This is a question for the organic SEO experts: once you have added the main keyword you want to rank for to the homepage title, meta title and meta description, and perhaps once or twice in the text on the homepage, how often do you then use it in your content marketing, say blog posts? We want to rank higher on Google for "SEO agencies Cardiff", but if we mention this in blog posts too much, say once a week, could this lead to over-optimisation issues?
On-Page Optimization | sarahwalsh1
-
Page Title Length
Hi Gurus, I understand that it is good practice to use 50-60 characters for a page title. Google appends my brand name (15 characters including spaces) to the end of each title it indexes. Do I need to count what Google adds as part of the maximum recommended length? i.e. is the maximum 50-60 characters plus the 15-character brand name Google adds to the end of the title, or 50-60 including the addition? Many thanks! Lev
On-Page Optimization | SunnyMay
-
Is it better to keep a glossary or terms on one page or break it up into multiple pages?
We have a very large glossary of over 1000 industry terms on our site with links to reference material, embedded video, etc. Is it better for SEO purposes to keep this on one page or should we break it up into multiple pages, a different page for each letter for example? Thanks.
On-Page Optimization | KenW0
-
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain, because we didn't want to change too much, too quickly by moving to https. Only our shopping cart is using https protocol. We noticed however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled though, is when I use open site explorer to look at domain/page authority values, I get different scores for the http vs. https version. And the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks,
On-Page Optimization | Aquatell1
-
Home page and category page target same keyword
Hi there, Several of our websites have a common problem: our main target keyword for the homepage is also the name of a product category within the website. There are seemingly two solutions to this problem, neither of which is ideal:
1. Do not target the keyword with the homepage. However, the homepage has the most authority and is our best shot at getting ranked for the main keyword.
2. Reword and "de-optimise" the category page so it doesn't target the keyword. This doesn't work well from a UX point of view, as the category needs to describe what it is and enable visitors to navigate to it.
Has anybody else gone through a similar conundrum? How did you end up going about it? Thanks Julian
On-Page Optimization | tprg0
-
Page rank check
Hello everyone, How long should I wait to see if the page rank for optimized pages has improved? Cheers
On-Page Optimization | PremioOscar0
-
Would it be bad to change the canonical URL to the most recent page that has duplicate content, or should we just 301 redirect to the new page?
Is it bad to change the canonical URL in the tag, meaning does it lose its stats? If we add a new page that may have duplicate content, but we want that page to be indexed over the older pages, should we just change the canonical page or redirect from the original canonical page? Thanks so much! -Amy
On-Page Optimization | MeghanPrudencio0
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30
There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmasters to tell Google to ignore them
Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie
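By way of illustration (an editorial sketch, not an answer from the thread), one common way to handle this kind of faceted search is to derive the title and canonical URL from a whitelist of search parameters while ignoring pagination. The parameter and file names below are taken from the URLs in the question; everything else is a hypothetical Python sketch.

```python
# Hypothetical sketch: derive a page title and canonical URL from whitelisted
# search facets, ignoring the "start" pagination parameter so that paginated
# result pages share one canonical URL and title.
from urllib.parse import urlencode

ALLOWED_FACETS = ("course", "cooking-method", "preperation-time")  # names as used in the question's URLs

def title_and_canonical(params: dict) -> tuple[str, str]:
    facets = {name: params[name] for name in ALLOWED_FACETS if name in params}
    canonical = "/find-a-recipe.php"
    if facets:
        canonical += "?" + urlencode(sorted(facets.items()))
    title = "Recipes"
    if facets:
        title += " - " + ", ".join(facets[name] for name in ALLOWED_FACETS if name in facets)
    return title, canonical

# Example: a paginated salad search collapses to the un-paginated canonical.
print(title_and_canonical({"course": "salad", "start": "30"}))
# ('Recipes - salad', '/find-a-recipe.php?course=salad')
```

This keeps each distinct facet combination as its own title while pointing all of its paginated variations at a single canonical URL.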