How would you create and then segment a large sitemap?
-
I have a site with around 17,000 pages and would like to create a sitemap and then segment it into product categories.
Is it best to create one map and then edit it in something like XMLSpy, or is there a way to silo sitemap creation from the outset?
-
Thanks Saijo,
We are trying to silo product types/categories and break them into different sitemaps. I'm familiar with SF but I don't think it will create sitemaps with the granularity that we are looking for.
I'm using XMLSpy but I'm finding it hard to break out blocks of content.
-
To my knowledge, Screaming Frog doesn't allow you to create an XML sitemap. Perhaps Excel allows you to format the output from SF, but I'm not sure. I did find a utility called XMLSpy which, though pricey, allows me to do some of the sorting I was looking for. Once sorted, I can manually pull out sections to segment my sitemap. It is a pain in the neck because I can't determine a silo and split it out automatically. That being said, I think I can develop a sitemap template and have our new web programmer develop a way to auto-generate a group of segmented sitemaps.
Anyone know if there is a canned solution that works with IIS?
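Editor's note: since segmented sitemaps are just static XML files that IIS can serve like any other, a script can generate them from a URL export on any stack. A minimal sketch in Python, assuming a urls.txt file with one URL per line and that the product category is the first path segment; the domain and file names are hypothetical:

```python
# Minimal sketch: split a flat URL list into per-category sitemaps
# plus a sitemap index. Assumes urls.txt holds one URL per line and
# that the category is the first path segment (e.g. /widgets/...).
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
BASE = "http://www.example.com"  # hypothetical domain for the index entries

groups = defaultdict(list)
with open("urls.txt") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        path = urlparse(url).path.strip("/")
        groups[path.split("/")[0] if path else "root"].append(url)

index_entries = []
for category, urls in sorted(groups.items()):
    filename = f"sitemap-{category}.xml"
    with open(filename, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write(f'<urlset xmlns="{NS}">\n')
        for url in urls:
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        out.write("</urlset>\n")
    index_entries.append(filename)

# A sitemap index ties the per-category files together.
with open("sitemap_index.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write(f'<sitemapindex xmlns="{NS}">\n')
    for filename in index_entries:
        out.write(f"  <sitemap><loc>{BASE}/{filename}</loc></sitemap>\n")
    out.write("</sitemapindex>\n")
```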
-
If your site is structured so that the URLs contain the categories you wish to sort by, you can use something like Screaming Frog ( http://www.screamingfrog.co.uk/seo-spider/ ) to crawl the site, export all the URLs, and sort them into categories in Excel.
NOTE: the free version has a 500 URL limit, so you might want to look at the paid version (ask them whether it can handle 17,000 URLs before getting it), or look at http://home.snafu.de/tilman/xenulink.html (I haven't used it myself, so I don't know whether you can export to Excel from there).
Good luck, mate; sounds like you have a big job ahead of you.
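Editor's note: if Excel becomes unwieldy at 17,000 rows, the same sorting step can be scripted. A minimal sketch, assuming a Screaming Frog "Internal All" CSV export whose URL column is named Address (check the header in your version's export):

```python
# Sketch of the Excel sorting step in code: read a Screaming Frog
# CSV export and count URLs per top-level category.
import csv
from collections import Counter
from urllib.parse import urlparse

counts = Counter()
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        path = urlparse(row["Address"]).path.strip("/")
        counts[path.split("/")[0] if path else "root"] += 1

for category, n in counts.most_common():
    print(f"{category}: {n} URLs")
```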
Related Questions
-
Sitemap use for very large forum-based community site
I work on a very large site with two main types of content: static landing pages for products, and a forum and blogs (user-created) under each product. The site has maybe 500k to 1 million pages. We do not have a sitemap at this time.
Currently our SEO discoverability is generally good; Google is indexing new forum threads within roughly 1-5 days. Some of the "static" landing pages for our smaller, less-visited products, however, do not have great SEO.
The question is: could our SEO be improved by creating a sitemap, and if so, how could it be implemented? I see a few ways to go about it:
1. The sitemap includes "static" product category landing pages only, i.e. the product home pages, the forum landing pages, and the blog list pages. This would probably end up being 100-200 URLs.
2. The sitemap contains the above but is also dynamically updated with new threads and blog posts.
Option 2 seems like it would mean the sitemap is unmanageably long (hundreds of thousands of forum URLs). Would a crawler even parse something that size? Or, with option 1, could it cause our organically ranked pages to change ranking due to Google re-prioritizing the pages within the sitemap?
Not a lot of information out there on this topic; appreciate any input. Thanks in advance.
Technical SEO | CommManager
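Editor's note on the size question: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs, so a site this size would need multiple files tied together by a sitemap index, which crawlers handle routinely. A minimal chunking sketch in Python (file names and domain are hypothetical):

```python
# Sketch: split a large URL list into sitemap files of at most
# 50,000 URLs each (the sitemaps.org per-file cap), plus an index.
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50_000

def write_chunk(urls, name):
    with open(name, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write(f'<urlset xmlns="{NS}">\n')
        for u in urls:
            out.write(f"  <url><loc>{escape(u)}</loc></url>\n")
        out.write("</urlset>\n")

def write_sitemaps(urls, base="http://www.example.com"):
    files = []
    for i in range(0, len(urls), LIMIT):
        name = f"sitemap-{i // LIMIT + 1}.xml"
        write_chunk(urls[i:i + LIMIT], name)
        files.append(name)
    with open("sitemap_index.xml", "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write(f'<sitemapindex xmlns="{NS}">\n')
        for name in files:
            out.write(f"  <sitemap><loc>{base}/{name}</loc></sitemap>\n")
        out.write("</sitemapindex>\n")
```
-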
Google Search Console says 'sitemap is blocked by robots'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix it?
Technical SEO | Extima-Christian
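Editor's note: the robots.txt above blocks nothing, since an empty Disallow rule allows everything, so the warning is often stale or caused by a cached copy of an older robots.txt. The check can be reproduced locally; a minimal sketch using only the Python standard library (the domain is the placeholder from the question, and for a sitemap index you would repeat the loop for each child sitemap):

```python
# Sketch: test each <loc> in a sitemap against the live robots.txt,
# mirroring the Search Console warning.
import urllib.request
import xml.etree.ElementTree as ET
from urllib import robotparser

SITE = "http://www.website.com"  # placeholder domain from the question

rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(f"{SITE}/sitemap_index.xml") as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if not rp.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)
```
-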
Which sitemap to keep - HTTP or HTTPS (or both)?
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good.
Now the question is: should I add the HTTPS version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing HTTP one? Ideally, switching over completely to the HTTPS version by adding a new sitemap would make more sense, as the HTTP version of the sitemap would now be redirected to HTTPS anyway. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the HTTPS sitemap to the new site, should I delete the old HTTP one, or is there no harm in retaining it?
Technical SEO | ashishb01
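Editor's note: before retiring the HTTP sitemap, it is worth spot-checking that the 301s are actually in place. A quick sketch using the third-party requests package (the sample URLs are placeholders):

```python
# Sketch: confirm old http:// URLs return a 301 to their https://
# equivalents before dropping the HTTP sitemap.
import requests  # third-party: pip install requests

urls = [
    "http://www.example.com/",
    "http://www.example.com/some-page/",
]

for url in urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target.startswith("https://")
    print(f"{url} -> {r.status_code} {target} {'OK' if ok else 'CHECK'}")
```
-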
Is it important to include image files in your sitemap?
I run an ecommerce business that has over 4,000 product pages which, as you can imagine, branch off into thousands of image files. Is it necessary to include those in my sitemap for faster indexing? Thanks for your help! -Reed
Technical SEO | IceIcebaby
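Editor's note: if you do include them, images ride along inside the regular sitemap via Google's image sitemap extension rather than as separate entries of their own. A minimal sketch of one product URL carrying two of its images (all URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/products/blue-widget</loc>
    <image:image>
      <image:loc>http://www.example.com/images/blue-widget-front.jpg</image:loc>
    </image:image>
    <image:image>
      <image:loc>http://www.example.com/images/blue-widget-side.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```
-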
How do I find which pages are being deindexed on a large site?
Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!
Technical SEO | DA2013
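Editor's note, one low-tech approach: diff the URLs you expect to be indexed (from your sitemap or a crawl) against whatever list of indexed URLs you can assemble from Search Console exports or log files. A minimal sketch, assuming two hypothetical one-URL-per-line text files:

```python
# Sketch: set difference between expected and known-indexed URLs.
def load(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

expected = load("all_pages.txt")     # e.g. from your sitemap or a crawl
indexed = load("indexed_pages.txt")  # e.g. from a Search Console export

for url in sorted(expected - indexed):
    print("Possibly deindexed:", url)
```
-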
How to create unique content for businesses with multiple locations?
I have a client that owns one franchise location of a franchise company with multiple locations. They have one large site where each location has its own page, which I feel is the best route. The problem is that each location page has basically the same content, resulting in around 80 pages of duplicate content. I'm looking for advice on how to create unique content for each location page. What types of information can we write about to make each page unique? You can only twist sentences and content around so much before it all sounds cookie-cutter and therefore offers little value.
Technical SEO | RonMedlin
-
Best Dynamic Sitemap Generator
Hello Mozers, could you please share the best dynamic sitemap generator you are using? I have found this one: http://www.seotools.kreationstudio.com/xml-sitemap-generator/free_dynamic_xml_sitemap_generator.php Thanks in advance for your help.
Technical SEO | SEOPractices
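Editor's note: if nothing canned fits, a dynamic sitemap is only a small amount of code, generating the XML on request from your own data store. A minimal sketch using Flask (a third-party package), where get_all_urls() is a hypothetical stand-in for your database query:

```python
# Sketch: serve a sitemap generated on request from live data.
from flask import Flask, Response  # third-party: pip install flask
from xml.sax.saxutils import escape

app = Flask(__name__)

def get_all_urls():
    # Placeholder: replace with a database query or CMS API call.
    return ["http://www.example.com/", "http://www.example.com/about/"]

@app.route("/sitemap.xml")
def sitemap():
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    body = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in get_all_urls())
    xml = f'<?xml version="1.0" encoding="UTF-8"?><urlset xmlns="{ns}">{body}</urlset>'
    return Response(xml, mimetype="application/xml")
```
-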
How do I create a Video Sitemap for Youtube Embedded Videos?
I've been seeing a lot of people recommend creating a video sitemap or Media RSS feed (mRSS) and submitting it to Google. We have videos hosted on Brightcove and most on YouTube. Brightcove can generate the sitemap for us, but does anyone know how to generate a YouTube video sitemap for the videos embedded on our pages? Note: I realize I could manually assemble the video sitemap; however, manually assembling it is probably not an option for us due to the volume of videos we've published.
Technical SEO | LDS-SEO
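Editor's note: for YouTube embeds, the entries to assemble look like the sketch below, using Google's video sitemap extension with player_loc pointing at the embed URL. A script looping over the video IDs on your pages could emit these, and the resulting file can sit alongside the Brightcove-generated one in a sitemap index. The page URL and VIDEO_ID are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/page-with-embedded-video</loc>
    <video:video>
      <video:thumbnail_loc>http://img.youtube.com/vi/VIDEO_ID/0.jpg</video:thumbnail_loc>
      <video:title>Title of the embedded video</video:title>
      <video:description>Short description of the video.</video:description>
      <video:player_loc>http://www.youtube.com/embed/VIDEO_ID</video:player_loc>
    </video:video>
  </url>
</urlset>
```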