Exclude Child URLs from XML Sitemap Generator (WordPress)
-
Hi all,
I was recommended the XML Sitemap Generator for WordPress by the very helpful Keith Bloemendaal and John Pring - however, I can't seem to exclude child URLs.
There is a section called "Exclude items" with a subsection "Exclude posts". I tried entering the URLs of the pages I don't want in the sitemap, but that didn't work. Then I read that you have to enter a list of "IDs" - not sure where on earth to find those, so I tried the page name and the post= number from the URL, but neither worked.
I hope somebody can point me in the right direction - and apologies, I am a WordPress novice, and I got no answers from the WordPress forums, so I turned right back to SEOmoz!
Cheers.
-
AH! You did it Keith - I thought clicking 'update' at the bottom would do it, but there's a little link hidden in some text at the top saying "rebuild the sitemap manually".
Finally it's done, thanks so much for your help!
Mark
-
Did you try generating a new sitemap after clicking Update Options, and then submitting it to Webmaster Tools?
Generally it will only update on its own when you add or delete pages.
-
I'm just trying to exclude these child URLs from the sitemap - in future I may block them entirely, but I certainly don't want to submit a sitemap with these URLs and then contradict that in robots.txt.
I have used the Post ID numbers to exclude the pages from the sitemap, however they remain in place.
Thanks once again for your assistance and quick responses!
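For anyone wanting to sanity-check the sitemap against robots.txt later, here is a minimal Python sketch (the domain is a placeholder) that flags any sitemap entry robots.txt would block:

```python
# Minimal sketch: flag sitemap URLs that robots.txt would disallow.
# Assumes a standard sitemap.xml and robots.txt at the site root (placeholder domain).
import urllib.robotparser
import urllib.request
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Load and parse robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Load the sitemap and pull out every <loc> entry
with urllib.request.urlopen(SITE + "/sitemap.xml") as resp:
    tree = ET.parse(resp)

for loc in tree.iter(SITEMAP_NS + "loc"):
    url = loc.text.strip()
    if not rp.can_fetch("*", url):
        print("Listed in sitemap but blocked by robots.txt:", url)
```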
-
It may take some time for it to propagate to Google if that is what you are asking. Are you trying to block the pages/posts completely from search engines?
-
Hi Keith,
Thanks once again for a quick response. I have actually tried that method; however, when I check the live sitemap I can still see those pages. Very frustrating! Is it that the sitemap doesn't update live straight away? And just to confirm, I am clicking "Update Options" at the bottom - quite often it'll be something stupid like that!

Thanks,
Mark
-
Great question, and WP really should make this easier!
This article explains one way to see it: http://businessaccent.com/2009/06/08/what-is-my-wordpress-post-id-number-and-how-can-i-find-it/. Alternatively, if you open the post/page in the admin panel to edit it, the post ID is right there in your browser's address bar, e.g. www.yoursite.com/wp-admin/post.php?post=615&action=edit (615 is the post ID).
Hope that helped
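And if you'd rather pull the ID out of that edit URL programmatically, here is a quick Python sketch (the URL below is just the example above, and the "Exclude posts" field is the plugin setting mentioned in the question):

```python
# Quick sketch: extract the WordPress post ID from an admin edit URL.
# The URL here is illustrative.
from urllib.parse import urlparse, parse_qs

edit_url = "https://www.yoursite.com/wp-admin/post.php?post=615&action=edit"
query = parse_qs(urlparse(edit_url).query)
post_id = int(query["post"][0])
print(post_id)  # 615 -- the number to enter in the plugin's "Exclude posts" field
```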

Related Questions
-
Sitemap.xml strategy for site with thousands of pages
I have a client that has a HUGE website with thousands of product pages. We don't currently have a sitemap.xml because generating it would take so much processing power. I have thought about creating a sitemap for just the key pages on the website, but I didn't want to hurt the SEO on the thousands of product pages. If you have a sitemap.xml that only includes some of the pages on your site, will it negatively impact the other pages that Google has indexed but that are not listed in the sitemap.xml?
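One common approach, sketched below in Python with placeholder URLs, is to split the URL list across several child sitemaps (the sitemaps.org protocol caps each file at 50,000 URLs) and reference them from a single sitemap index, so the files can be regenerated in manageable chunks:

```python
# Sketch: split a large URL list into child sitemaps plus a sitemap index.
# The domain and URL list are placeholders for real product URLs.
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
SITE = "https://www.example.com"
urls = [f"{SITE}/product-{i}" for i in range(1, 120001)]  # stand-in list
CHUNK = 50000  # protocol limit per sitemap file

index_entries = []
for n, start in enumerate(range(0, len(urls), CHUNK), start=1):
    chunk = urls[start:start + CHUNK]
    body = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
    with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
        f.write(f'<?xml version="1.0" encoding="UTF-8"?><urlset xmlns="{NS}">{body}</urlset>')
    index_entries.append(f"<sitemap><loc>{SITE}/sitemap-{n}.xml</loc></sitemap>")

with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write(f'<?xml version="1.0" encoding="UTF-8"?><sitemapindex xmlns="{NS}">'
            + "".join(index_entries) + "</sitemapindex>")
```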
Technical SEO | jerrico10
-
.xml sitemap showing in SERP
Our sitemap is showing in Google's SERP. It only shows for very specific queries that don't seem to have much value (it's a healthcare website, and when a doctor who isn't with us is searched alongside the brand name, so 'John Smith Brand', the sitemap shows if there's a first or last name that matches the query). Still, is there a way to keep the sitemap from being indexed so it doesn't show in the SERP? I've seen "x-robots-tag: noindex" suggested as a possible option, but before taking any action I wanted to see if this is still true and if it would work.
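One quick way to confirm the header is actually being returned on the sitemap URL (placeholder below) is a HEAD request, for example in Python:

```python
# Sketch: check whether the sitemap URL returns an X-Robots-Tag: noindex header.
# The URL is a placeholder.
import urllib.request

req = urllib.request.Request("https://www.example.com/sitemap.xml", method="HEAD")
with urllib.request.urlopen(req) as resp:
    tag = resp.headers.get("X-Robots-Tag")

print("X-Robots-Tag:", tag or "not set")
```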
Technical SEO | Kyleroe950
-
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content as there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to at our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input:
1. Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
3. Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
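A quick way to spot-check what those old URLs currently return before picking an option: a short Python sketch using the third-party requests package, with placeholder URLs:

```python
# Sketch: report the immediate status code (301/404/410) for each old URL
# without following redirects. The URL list is a placeholder.
import requests

old_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    print(url, resp.status_code, target)
```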
Technical SEO | RosemaryB3
-
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much if not most of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI, both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
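One way to audit whether the old hyphenated URLs really 301 straight to their new equivalents is to follow each redirect chain - a Python sketch using the third-party requests package; both domains and the path list below are placeholders:

```python
# Sketch: follow the redirect chain from old (hyphenated) URLs and check that
# each one lands on the new domain in a single 301 hop. Placeholders throughout.
import requests

OLD = "https://www.example-site.com"    # old, hyphenated domain (placeholder)
NEW = "https://www.examplesite.com"     # new, non-hyphenated domain (placeholder)
paths = ["/", "/services/", "/about/"]  # stand-ins for real paths

for path in paths:
    resp = requests.get(OLD + path, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]  # status code of each redirect hop
    ok = resp.url.startswith(NEW) and hops == [301]
    print("OK " if ok else "CHECK", OLD + path, "->", resp.url, hops)
```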
Technical SEO | kaje0
-
.%E2%80%9d breaking the URL in wordpress
My WordPress URLs are breaking and there are 5000 not-found URLs in Webmaster Tools due to some code being added: %E2%80%9d. This code stands for a double quotation mark ("). Now the question is, where has my site gone wrong? These are the changes I have made:
1. Deleted a vBulletin forum - half of the errors are due to the forum being deleted directly.
2. Upgraded to WordPress 3.3 (crawl errors did not show on the same day, but much later).
3. Upgraded to Bluehost Pro (crawl errors did not show on the same day, but much later).
These are some of my speculations, but nonetheless I have no idea why this is happening. To give a further hint, the home page URL is also being appended to the original URL:
http://www.marketing91.com/article/http://www.marketing91.com
http://www.marketing91.com/article/http://www.wrodpress.org
So this is the list of URL problems I am facing. I can account for the deletion of the vBulletin forum, but that accounts for only half of the crawl errors. Any replies or answers?
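For what it's worth, %E2%80%9D URL-decodes to a curly right double quotation mark, which is easy to confirm and to scan for - a small Python sketch with illustrative URLs:

```python
# Sketch: decode %E2%80%9D and flag URLs carrying the encoded smart quote.
# The example URLs are illustrative.
from urllib.parse import unquote

print(unquote("%E2%80%9D"))  # prints the curly right double quote character

urls = [
    "http://www.marketing91.com/article/some-post/%E2%80%9D",  # illustrative
    "http://www.marketing91.com/article/some-post/",           # illustrative
]
for url in urls:
    if "%e2%80%9d" in url.lower():
        print("Smart-quote artifact in:", url)
```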
Technical SEO | hith2340
-
Should XML sitemaps include *all* pages or just the deeper ones?
Hi guys, OK, this is a bit of a sitemap 101 question but I can't find a definitive answer: when we're running out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
Technical SEO | timwills0
-
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap indicating that every URL on the site was last modified at exactly the same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap generators I use for other client sites can be set to use the server's last-modified response or not. What is the best way to generate the sitemap: lastmod from the actual time modified, or everything set to one date and time?
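If the script were changed to read each file's real modification time, the lastmod values would differ naturally - a minimal Python sketch of that approach (the URL-to-file mapping is a placeholder):

```python
# Sketch: write <lastmod> from each file's real modification time instead of
# one shared timestamp. The URL-to-file mapping below is a placeholder.
import os
from datetime import datetime, timezone
from xml.sax.saxutils import escape

pages = {
    "https://www.example.com/": "public_html/index.html",
    "https://www.example.com/about/": "public_html/about/index.html",
}

entries = []
for url, path in pages.items():
    mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
    lastmod = mtime.strftime("%Y-%m-%dT%H:%M:%S+00:00")  # W3C datetime format
    entries.append(f"<url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>")

xml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
       + "".join(entries) + "</urlset>")
print(xml)
```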
Technical SEO | ShaMenz0