Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Backlink quality vs quantity: Should I keep spammy backlinks?
-
Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation:
- Option 1: More backlinks including a lot of spammy links
- Option 2: Fewer backlinks but only reliable, non-spam links
I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other.
For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA.
Thank you!
-
Backlinks are always about quality, not quantity. Google does not like excessive backlinks, and especially spammy backlinks. I would suggest you go with quality backlinks if you want long-term, sustainable results; otherwise there will always be a threat of getting penalized by Google if you focus on spammy backlinks.
-
It's a myth that your DA drops because you put links in a disavow file. Disavow is a Google-only (or Bing) tool for the case where, say, you get spammy links from a rogue domain and there's no way to get them removed.
Moz can't read the disavow file you submit to Google, either, so I'm not sure how the connection is being made here. Moz, like any other tool, just counts your incoming follow links and derives your DA from its own formula. That's all there is to it. Again, PA/DA has nothing in common with Google, as Google maintains its own algorithm.
-
Hello again,
Thanks for the clarification and the link. I've read through that and a few other sources across the web, but none of them answered my question the way you did, so thanks! Our backlink profile is fairly well balanced between spammy and definitely-not-spammy links, so I'm not overly concerned about it, but I appreciate the reminder.
-
I should also clarify: these may hurt you if they are your only links. If you have very few links carrying real equity, Google and other search engines may falsely flag you as spam. So just be careful and be on the lookout for especially suspicious spam links. The balanced approach is the best approach: don't worry, but stay aware!
Here is a more technical write-up from Moz that I recommend: https://moz.rainyclouds.online/help/link-explorer/link-building/spam-score
-
No problem Liana.
- That is correct. Google understands that you don't have control over third-party sites, so instead of penalizing you, they minimize or remove the effect the spam site links have.
- Yes, but only sort of. It may or may not increase PA/DA, but according to Google it shouldn't hurt you.
But yeah, that's the gist of it! Instead of taking the time to investigate and disavow links, you could spend that time cultivating relationships with other websites and businesses that could give you nice quality links.
Hope this answer works for you.

-
Hello Advanced Air Ambulance SEO!
Thanks for the quick and thorough response. Please confirm if I understand you correctly:
- I can leave spammy backlinks alone (not spend time disavowing them) _unless_ I see a manual action in Search Console, which would indicate that Google sees an issue and is penalizing my site until I disavow the links. Without this manual action, there's no indication that the spam links are hurting my rankings or DA.
- Leaving spammy backlinks that don't incur a manual action may actually increase DA since leaving them maintains a higher volume of backlinks (albeit some spammy), and backlink quantity is a contributor to DA.
Thank you!
-
Hi Liana,
As for spammy links, Google has gotten good at detecting whether or not they are intentional, a.k.a. black hat. If they aren't, Google does not penalize you for these links, so it's best to leave them.
As for a strategy for generating links to your website, you should always favor high quality over high quantity. High-quality links give you exponentially more return than a high quantity of bad links.
I recommend this article from Google for understanding when and how to disavow links:
https://support.google.com/webmasters/answer/2648487?hl=en
In short, you rarely ever need to disavow links, even if they have a high spam score. You are only hurt when Google senses you are gaming the system; if they detect or suspect unethical backlinking, you will be penalized with a "manual action". You can check whether you've been penalized, and disavow flagged backlinks, in Google Search Console.
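To make the disavow step concrete: the disavow file itself is just plain text, one entry per line, either a full URL or a `domain:` prefix, with `#` lines as comments, per Google's documentation. Here is a minimal sketch (Python, with purely hypothetical domains and URLs) of assembling one from an audit:

```python
# A minimal sketch (not a Moz or Google tool) of building a disavow
# file in the format Google documents: one entry per line, either a
# full URL or "domain:example.com"; lines starting with "#" are comments.
# All domains and URLs below are hypothetical placeholders.

def build_disavow_file(spam_domains, spam_urls):
    """Return disavow-file text covering whole domains and single URLs."""
    lines = ["# Disavow file generated from a manual backlink audit"]
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    lines += sorted(set(spam_urls))
    return "\n".join(lines) + "\n"

disavow_text = build_disavow_file(
    spam_domains=["link-farm.example", "spammy-directory.example"],
    spam_urls=["http://thin-blog.example/paid-links.html"],
)
print(disavow_text)
```

You would then upload the resulting file through the disavow tool in Search Console; it applies only to the property you submit it for, and only to Google.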
Related Questions
-
What is the difference between 301 redirects and backlinks?
I have seen some 301 redirects on my site billsonline. Can anyone please explain the difference between backlinks and 301 redirects? I have read some articles where the writer stated that 301s are not good for a website.
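For anyone comparing the two: a backlink lives on someone else's site, while a 301 is an instruction your own server sends telling clients and crawlers that a URL has permanently moved. A minimal sketch (hypothetical paths and domain, assuming an Apache server with mod_alias enabled):

```apache
# .htaccess — permanently move one URL to its replacement.
# The old URL's ranking signals are generally consolidated
# onto the target over time.
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

A well-implemented 301 is not harmful in itself; problems usually come from long redirect chains or from redirecting many pages to an irrelevant target.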
Technical SEO | aliho0
-
Personalized Content Vs. Cloaking
Hi Moz Community, I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there should be no SEO impact, since personalization will not happen for anyone until there is some interaction? Thanks,
Technical SEO | znotes0
-
422 vs 404 Status Codes
We work with an automotive industry platform provider, and whenever a vehicle is removed from inventory, a 404 error is returned. Because inventory moves so quickly, we have a host of 404 errors in Search Console. The fix the platform provider proposed was to return a 422 status code instead of a 404. I'm not familiar with how a 422 may impact our optimization efforts. Is this a good approach, given that there is no scalable way to 301 redirect all of those dead inventory pages?
Technical SEO | AfroSEO0
-
403s vs 404s
Hey all, Recently launched a new site on S3, and old pages that I haven't been able to redirect yet are showing up as 403s instead of 404s. Is a 403 worse than a 404? They're both just basically dead-ends, right? (I have read the status code guides, yes.)
Technical SEO | danny.wood1
-
Backlinks that we have if they are 404?
Hi All, What about backlinks we have whose source pages are now 404s? Open Site Explorer shows 1,000s of links, and when I check, many are 404s; those are spammy links we had, but now the sites are gone. I am doing a link profile check and cleaning up all spammy links. Should I take any action on them, since Open Site Explorer and Google still show these links in searches? Should we mention these URLs in the disavow file in Google Webmaster Tools? Thanks
Technical SEO | mtthompsons0
-
Div tags vs. Tables
Is there any reason NOT to code in tables (other than it being outdated) for SEO reasons?
Technical SEO | EileenCleary0
-
Multilingual Website - Sub-domain VS Sub-directory
Hi Folks - Need your advice on the pros and cons of going with a sub-domain vs. a sub-directory approach for a multilingual website. The best option would be a ccTLD, but that is not possible now, so I would be more interested in your take on these two options. I have gone through http://www.stateofsearch.com/international-multilingual-sites-criteria-to-establish-seo-friendly-structure/, which somewhat vouches for a sub-directory, but what would you say?
Technical SEO | RanjeetP0
-
Internal search : rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking whether they should be indexed or not; I know they should not be, according to Google's guidelines. And they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics! I thought of blocking them in robots.txt. That solution would not keep them out of the index, but the pages appearing in Google SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose a bit of traffic too. The last idea I had was to use a rel=canonical tag pointing to the original search page (which is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (I've never tried it, so I'm not sure.) Of course I did some research on the subject, but each of my findings recommended only one of the three methods! One even recommended noindex plus a robots.txt block, which makes no sense because the noindex would then never be seen. Is there somebody who can tell me which option is best to keep this traffic? Thanks a million
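For reference, the three options weighed above look like this in practice (hypothetical site and paths; each snippet lives in a different place):

```text
# robots.txt — blocks crawling; already-indexed URLs can remain
# in the index and show as title-less, description-less snippets
User-agent: *
Disallow: /search

<!-- meta robots tag in the page <head> — page is crawled but
     dropped from the index; its links can still be followed -->
<meta name="robots" content="noindex, follow">

<!-- rel=canonical in the page <head> — consolidates signals to
     the base search page rather than excluding the URL outright -->
<link rel="canonical" href="https://www.example.com/search">
```

Note that for a meta noindex to take effect, the page must not be blocked in robots.txt, since the crawler has to fetch the page to see the tag — which is exactly the conflict described in the question.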
Technical SEO | JohannCR0