Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Too many on page links
-
Hi
I know it was previously recommended to stick to under 100 links per page, but I've run a crawl and mine are now over this, at 130+.
How important is this now? I've read a few articles to say it's not as crucial as before.
Thanks!
-
Hi Becky!
First, I would like to say it is great that you are being proactive in making sure your webpage doesn't have too many links on it! Luckily for you, though, this is not something you need to worry about. 100 is a suggested number, not a threshold that will get you penalized if you go over it.
Google’s Matt Cutts posted a video explaining why Google no longer has that 100-links-per-page Webmaster guideline, so be sure to check that out! It's commonly thought that having too many links will negatively impact your SEO results, but that hasn't been the case since 2008. However, Google has said that if a site looks spammy and has far too many links on a single page, it reserves the right to take action on the site. So avoid links that could be seen as spammy and you should be fine.
For more information, check out this Moz blog post that discusses how many links are too many!
-
Thank you for the advice, I'll take a look at the articles

Brilliant, the round table sounds great - I'll sign up for this
-
I honestly wouldn't worry, Becky. The page looks fine, the links look fine, and it is certainly not what you would call spammy.
Link crafting was a 'thing' a number of years ago, but today Google pretty much ignores this, as has been shown many times in testing.
However, you can benefit from internal links, but that is a different discussion. Read this if you are interested.
If you are interested, there is a round-table discussion on eCommerce SEO hosted by SEMrush on Thursday that could be useful to you. Two others and I will be talking about a number of issues.
-Andy
-
Thanks for the advice, I've looked into this before.
We have menu links and product links as it's an ecommerce site, so I wouldn't be able to remove any of these.
I've struggled to find a way to reduce these links any further on primary pages. For example, http://www.key.co.uk/en/key/aluminium-sack-truck has 130 links.
Any advice would be appreciated.
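For reference, here is a minimal sketch of how a link count like this can be reproduced (it assumes Python with the requests and beautifulsoup4 packages installed; only the URL comes from the thread, everything else is illustrative):

```python
# Rough link audit: fetch the page and bucket its <a href> targets.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

url = "http://www.key.co.uk/en/key/aluminium-sack-truck"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

internal, external = [], []
for a in soup.find_all("a", href=True):
    href = urljoin(url, a["href"])  # resolve relative links
    if urlparse(href).netloc == urlparse(url).netloc:
        internal.append(href)
    else:
        external.append(href)

print(f"Total links: {len(internal) + len(external)}")
print(f"Internal: {len(internal)}, external: {len(external)}")
print(f"Unique internal targets: {len(set(internal))}")
```

Comparing the total against the unique internal targets is a quick way to see how much of the count is menu links repeated on every page versus distinct product links.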

-
There is confirmation from Google here that you should limit the links on a page to 3,000:
https://www.deepcrawl.com/knowledge/news/google-webmaster-hangout-notes-friday-8th-july-2016/
I would consider that to be a lot, though.

-Andy
-
Brilliant, thank you!
-
In the "old days" (yup, I go back that far), Google's search index crawler wasn't all that powerful. So it would ration itself on each page and simply quit trying to process all the content on the page after a certain number of links and certain character count. (That's also why it used to be VERY important that your content was close to the top of your page code, not buried at the bottom of the code).
The crawler has been beefed up to the point where this hasn't been a limiting factor per page for a long time, so the crawler will traverse pretty well any links you feed it. But I +1 both Andy and Mike's advice about considering the usability and link power dilution of having extensive numbers of links on a page. (This is especially important to consider for your site's primary pages, since one of their main jobs is to help flow their ranking authority down to important/valuable second-level pages.)
Paul
-
Hi Becky,
Beyond the hypothetical limit, there is the consideration that a really large number of links divides the page's link authority, and therefore decreases the relative value each of those links passes to the pages they point to.
Depending on the page holding all these links, the user experience, the purpose of the linked-to pages, and so on, this may or may not be a concern, but it is worth thinking about.
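As a rough back-of-the-envelope illustration of that dilution idea (a simplified model only, not how Google actually computes link value):

```python
# Simplified link equity split: assume each outgoing link receives an
# equal share of a page's passable authority. The numbers are made up.
page_authority = 1.0  # hypothetical authority available to pass on

for link_count in (100, 130, 400, 3000):
    per_link = page_authority / link_count
    print(f"{link_count:>5} links -> {per_link:.5f} authority per link")
```

In this simplified view, the difference between 100 and 130 links is tiny, which is why the practical advice in this thread is not to obsess over the exact count.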
Good luck!
- Mike
-
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 links per page and no problems with their positions in Google.
That doesn't mean it will be the same for everyone though - each site is different, and sometimes you can have too many. Just think it through, and if you conclude that most of the links aren't needed and are only there to stuff in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy
Related Questions
-
Is a page with links to all posts okay?
Hi folks. Instead of an archive page template in my theme (I have my reasons), I am thinking of simply typing the post title as and when I publish a post, and linking to the post from there. Any SEO issues that you can think of? Thanks in advance!
Intermediate & Advanced SEO | Nobody16165422281340
-
Does Disavowing Links Negate Anchor Text, or Just Negate Link Juice?
I'm not so sure that disavowing links also discounts the anchor texts from those links. Because nofollow links absolutely still pass anchor text values. And disavowing links is supposed to be akin to nofollowing the links. I wonder because there's a potential client I'm working on an RFP for, and they have tons of spammy directory links all using keyword-rich anchor texts, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into. And if I just disavow those links, I'm thinking that it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | MiguelSalcido0
-
How long does it take for a page to show up in Google results after removing noindex from the page?
Hi folks, A client of mine created a new page and used meta robots noindex to keep the page out of the index while they were not ready to launch it. The problem is that somehow Google "crawled" the page and now, after removing the meta robots noindex, the page does not show up in the results. We've tried to crawl it using Fetch as Googlebot, and then submit it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit new page URL https://www.google.com/webmasters/tools/submit-url. Does anyone know how long it will take for Google to show the page AFTER removing meta robots noindex from the page? Any reliable references for this? I did not find any Google video/post about this. I know that it will appear in a few days, but I'd like to have a good reference for the future. Thanks.
Intermediate & Advanced SEO | fabioricotta-840380
-
Effect of Removing Footer Links In all Pages Except Home Page
Dear MOZ Community: In an effort to improve the user interface of our business website (a New York City commercial real estate agency), my designer eliminated a standardized footer containing links to about 20 pages. The new design maintains this footer on the home page, but all other pages (about 600) eliminate the footer. The new design does a very good job eliminating non-essential items. Most of the changes remove or reduce the size of unnecessary design elements. The footer removal is the only change that really affects the link structure. The new design is not launched yet. Hoping to receive some good advice from the MOZ community before proceeding. My concern is that removing these links could have an adverse or unpredictable effect on ranking. Last summer we launched a completely redesigned version of the site and our ranking collapsed for 3 months. However, unlike the previous upgrade, this modification does not change URL names, tags, text or any major element. The only major change is the footer removal. Some of the footer pages provide good (not critical) info for visitors. Note the footer will still appear on the home page but will be removed on the interior pages. Are we risking any detrimental ranking effect by removing this footer? Can we compensate by adding text links to these pages if the links from the footer are removed? It seems irregular to have a home page footer but no footer on the other pages. Are we inviting any downgrade, penalty, or adverse SEO effect by implementing this? I very much like the new design but do not want to risk a fall in rank and traffic. Thanks for your input!!!
Alan
Intermediate & Advanced SEO | Kingalan1
-
Do search engines crawl links on 404 pages?
I'm currently in the process of redesigning my site's 404 page. I know there's all sorts of best practices from UX standpoint but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl. Does putting links to "recent posts" or something along those lines allow the bot to continue on its way or does the crawl stop at that point because the 404 HTTP status code is thrown in the header response?
Intermediate & Advanced SEO | brad-causes0
-
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural SEs profile? Or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | fablau0
-
Links from new sites with no link juice
Hi Guys, Do backlinks from a bunch of new sites pass any value to our site? I've heard some "SEO experts" say that it is an effective link building strategy to build a bunch of new sites and link them to our main site. I highly doubt that... To me, a new site is a new site, which means it won't have any backlinks in the beginning (most likely), so a backlink from this site won't pass much link juice. Right? In my humble opinion this is not a good strategy any more, if you are building new sites purely for the sake of getting links. This is just wrong. But if you do have some unique content and you want to share it with others on a particular topic, then you can definitely create a blog, write content, and start getting links. And over time, the domain authority will increase, so a backlink from this site will become more valuable. I am not an SEO expert myself, so I am eager to hear your thoughts. Thanks.
Intermediate & Advanced SEO | witmartmarketing0
-
Should the sitemap include just menu pages or all pages site wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. We're redoing a few things and will need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and that's it, but post-Panda, should I do anything different?
Intermediate & Advanced SEO | EricPacifico0