Is there a suggested limit to the number of links on a sitemap?
-
Currently, I have an error on my Moz dashboard indicating there are too many links on one of my pages. That page is the sitemap. It was my understanding that a sitemap should link to all internal pages.
Can any Mozzers help clarify the best practice here?
Thanks,
Clayton
-
Your HTML sitemap is best for website visitors, so best practice is to list the most important sections/pages. Google can use your HTML sitemap page to crawl the rest of your site, as long as the structure can be followed.
If you have lots of pages, then it's best to use an XML sitemap submitted through Google Webmaster Tools. Once your XML sitemap is in the root directory of your website, you can also let search engines know its location through your robots.txt file like this:
User-agent: *
Sitemap: http://www.SomeDomain.com/sitemap.xml
If your site changes over time, it's a good idea to create fresh sitemaps - just set reminders for yourself in a calendar.
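For reference, a minimal XML sitemap under the sitemaps.org protocol looks like the sketch below - the domain, date, and values are placeholders, not anything from this thread:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.SomeDomain.com/</loc>
    <lastmod>2012-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
Only the <loc> element is required for each URL; <lastmod>, <changefreq>, and <priority> are optional hints.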
-
That makes sense. Thanks.
-
Thanks for the help BrewSEO, Darin, and Zora.
-
I believe he is referring to an actual HTML sitemap page, not an XML file to submit to Google.
-
Don't worry about it. The "too many links" message is based on Google's suggestion to have fewer than 100 links per page. Obviously sitemaps are going to be an exception to this rule, and with good reason. You are fine.
-
The answer is "technically" 50,000.
However, the size for sitemap matters too (no bigger than 50MB).
If you have more than these numbers allow for in Google's Guidelines then you break them up and have multiple sitemaps on your site.
Here is Google's Guidelines on Sitemaps:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=183668
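To illustrate, the sitemaps.org protocol defines a sitemap index file for exactly this case - a minimal sketch, with placeholder filenames:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.SomeDomain.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.SomeDomain.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
Each sitemap listed in the index must itself stay within the 50,000-URL and 50MB limits, and the index file is what you submit in place of a single sitemap.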
Related Questions
-
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc., and the URL used in the robots.txt should indicate that it's a compressed file. For example, "sitemap.xml.gz" -- thanks!
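(For illustration only: in that case the robots.txt directive would simply point at the compressed file itself, e.g. Sitemap: http://www.example.com/sitemap.xml.gz, where example.com stands in for the real domain.)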
Technical SEO -
Resubmit sitemaps on every change?
Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we regenerate the XML sitemap. My question is: should we be resubmitting the sitemaps every time their content changes, or, since they were submitted once, can we assume that the crawlers will re-download the sitemaps by themselves (I don't like to assume)? What are the best practices here? Thanks!
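(One period-appropriate option: both Google and Bing supported an HTTP ping to request a sitemap re-download, e.g. fetching http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml, with the sitemap URL here being a placeholder, so the resubmission step could be scripted into the regeneration job rather than done by hand.)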
Technical SEO -
Automate XML Sitemaps
Quick question: what is the best method people have for automating sitemaps? We publish around 200 times a day, and I would like to make sure that as soon as we publish, the sitemap gets updated. What is the best method of updating a sitemap so it is refreshed immediately after a piece is published?
Technical SEO -
Referencing links in Articles and Blogs
Hi, I am wondering if the <sup> tag in HTML is picked up by Google as a reference point. I.e., when you put a superscript in Word, it puts a small number next to your sentence, and then you have a list of references at the end of the blog/article. Does Google recognise this?
Technical SEO -
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this, at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when a sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since this comes with both of the above-mentioned tools. I know about DeepCrawl.com also, but that one is paid, and it would be very expensive with this number of pages and websites (5 million URLs is $1750 per month; I could get a better deal on multiple websites, but that obviously does not make sense for me - it needs to be free, more or less). Screaming Frog's SEO Spider is not good for large websites. So, in general, what is the best way to work on something like this that is also time-efficient? Are there any other options? Thanks.
Technical SEO -
Fake links indexing in Google
Hello everyone, I have an interesting situation occurring here, and I'm hoping maybe someone has seen something of this nature or can offer some advice. We recently installed WordPress on a subdomain for our business and have been blogging through it. We added the Google Webmaster Tools meta tag, and I've noticed an increase in 404 links. I brought this up to our server admin, and he verified that there were a lot of IPs pinging our server looking for these links that don't exist. We've combed through our server files and nothing seems to be compromised. Today, we noticed that when you do site:ourdomain.com in Google, the subdomain with WordPress shows hundreds of these fake links that, when you visit them, return a 404 page. Just curious if anyone has seen anything like this, what it may be, how we can stop it, and whether it could negatively impact us in any way. Should we even worry about it? Here's the link to the Google results: https://www.google.com/search?q=site%3Amshowells.com&oq=site%3A&aqs=chrome.0.69i59j69i57j69i58.1905j0j1&sourceid=chrome&es_sm=91&ie=UTF-8 (odd links show up on pages 2-3+)
Technical SEO -
International Site Links In Footer
We have several international sites, and we have them linked in the footer of our main .com site. Should we add "nofollow" to these links? Our concern is that Google could see these sites as a network.
Technical SEO -
Link Volume - calculate what you need?
Hi everyone, an interesting question here. How do you determine what link volume you should try to get into your website? What analysis do you do to determine the number of links you feel is right to go into a backlink profile every month? Obviously there is no magic number, but it's interesting to know what others do. Obviously you don't want to build too many or too few. If you have been penalised for bad links in the past and are now back on track, how do you calculate the volume? Do you take links dropping out into consideration?
Technical SEO