Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Best Practices for Moving a Sub-Domain to a Sub-Folder
-
One of my clients is moving their subdomain to a subfolder on their main domain. (ie. blog.example.com to example.com/blog)
I just wanted to get everyone's thoughts on best practices: things we should be doing or looking out for when making this move (e.g. Webmaster Tools, .htaccess, 301 redirects, etc.)?
Thanks.
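Since the question mentions .htaccess and 301s: the server-side half of a move like this is usually a host-based rewrite. A minimal Apache sketch, assuming mod_rewrite is enabled and using the example hostnames from the question (adapt to the real hostnames and paths):

```apache
# Served for blog.example.com: permanently redirect every path to the
# same path under example.com/blog/ (hostnames are the question's examples).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/blog/$1 [R=301,L]
```

Each old URL should answer with a single 301 hop straight to its final destination; chains of redirects slow crawling and dilute the signal.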
-
Glad it helped!
-
These are really good. I'm going to dive into them tonight.
On top of moving the subdomains (there are 3), we've also built a new site on a new platform and are moving all of those pages as well. The old site used page IDs rather than search-engine-friendly URLs. It has been online since 2002, so a TON of work has gone into it, and I just want to make the move as smooth as possible.
Thanks again for your help and I'll keep you posted.
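For the page-ID-to-friendly-URL part of a replatforming like the one described above, the usual artifact is an explicit one-to-one redirect map. A minimal Python sketch; the paths below are hypothetical placeholders, not the client's real URLs:

```python
# Hypothetical page-by-page redirect map: legacy page-ID URLs on the
# old platform mapped to new search-engine-friendly paths.
REDIRECT_MAP = {
    "/page.php?id=101": "/blog/choosing-inline-skates",
    "/page.php?id=102": "/blog/skate-maintenance",
}

def redirect_target(old_path):
    """Return the new path for a legacy path, or None if unmapped."""
    return REDIRECT_MAP.get(old_path)

print(redirect_target("/page.php?id=101"))  # /blog/choosing-inline-skates
print(redirect_target("/unknown"))          # None
```

Keeping the map as data (rather than hand-edited server rules) makes it easy to diff against crawl exports and 404 logs to catch legacy URLs that never got a redirect.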
-
Moving a subdomain permanently can be a huge project! This question is probably better answered through a blog post that includes a walkthrough of what to check - and double-check - before finalizing anything. Below are some of the blog posts I've found that best detail this process. If you come across any questions, feel free to reach out! I'd be happy to help. But I'm sure the blog posts will do the trick (including one here at Moz!).
Good luck!
Related Questions
-
Move to new domain using Canonical Tag
At the moment, I am moving from olddomain.com (a niche site) to newdomain.com (a multi-niche site). For certain reasons, I do not want to use a 301 right now and am planning to use a canonical tag pointing to the new domain instead. Would Google rank the new site instead of the old site? From what I have learnt, the canonical tag lets Google know which is the main source of the content. Thank you very much!
Intermediate & Advanced SEO | india-morocco
-
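For reference, the cross-domain canonical being considered above is a single link element in the head of each old page. One caveat worth stating plainly: unlike a 301, rel=canonical is a hint that Google may ignore, so consolidation tends to be slower and less certain. A sketch using the question's hostnames (the path is illustrative):

```html
<!-- In the <head> of each page on olddomain.com; path is illustrative -->
<link rel="canonical" href="https://newdomain.com/equivalent-page" />
```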
Mass URL changes and redirecting those old URLs to the new. What is the SEO risk and what are best practices?
Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up was to make them unique by including the date in the URL, which was done a few years ago and can make evergreen content now seem dated. The new URLs would follow a better folder-path-style naming convention and would be way better URLs overall. Some examples of the **old** URLs:

https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html

The new URLs would look like this, which would be a great improvement:

https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html

My worry is that we rank fairly well organically for some of the content, and I don't want to anger the Google machine. The process would be to change the URLs to the new layout, set up the redirects, and push live. Is there a great SEO risk in doing this? Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do one URL at a time in the Webmaster backend. Is there anything else I am missing? I believe this change would be good in the long run, but I don't want to take a huge hit initially by doing something incorrectly. This would be done on 5 to a couple hundred links across the various sites I manage. Thanks in advance,
Chris Gorski

Intermediate & Advanced SEO | kirin44355
-
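The old-to-new pattern in the examples above is regular enough that the redirect map can be generated mechanically rather than by hand. A Python sketch derived only from the four example URLs (verify against the full URL list before relying on it):

```python
import re

# Dated buying-guide URL -> new /Learn/ URL, per the examples above.
OLD_PATTERN = re.compile(
    r"^https://www\.inlineskates\.com/"
    r"(?P<slug>Buying-Guide-[^/]+)/"
    r"buying-guide-\d{1,2}-\d{1,2}-\d{4},default,pg\.html$"
)

def new_url(old_url):
    """Return the new URL for a dated buying-guide URL, else None."""
    m = OLD_PATTERN.match(old_url)
    if m is None:
        return None
    return f"https://www.inlineskates.com/Learn/{m.group('slug')},default,pg.html"

old = ("https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/"
       "buying-guide-9-17-2012,default,pg.html")
print(new_url(old))
# https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
```

Generating the map this way also gives a built-in safety check: any URL the pattern fails to match (returning None) needs a hand-written redirect instead of silently 404ing.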
SEO Best Practices regarding Robots.txt disallow
I cannot find hard and fast direction on the following issue: the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site (Disallow: /Account/ and Disallow: /?search=), so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt. Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked ("Sitemap contains URLs which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked? Thank you!!
Intermediate & Advanced SEO | jamiegriz
-
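The two Disallow rules quoted above can be sanity-checked locally before changing anything, for example with Python's standard-library robots.txt parser (example.com stands in for the real host):

```python
from urllib.robotparser import RobotFileParser

# The rules quoted in the question, parsed as a crawler would read them.
rules = [
    "User-agent: *",
    "Disallow: /Account/",
    "Disallow: /?search=",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/Account/login"))    # False (blocked)
print(rp.can_fetch("*", "https://example.com/?search=widgets"))  # False (blocked)
print(rp.can_fetch("*", "https://example.com/products/"))        # True
```

Note that "blocked from crawling" is not the same as "removed from the index": URLs blocked this way can still appear in results if they are linked to, which is exactly why the sitemap warning is worth resolving (blocked URLs generally shouldn't be listed in the sitemap).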
Legacy domains
Hi all, A couple of years ago we amalgamated five separate domains into one, and set up 301 redirects from all the pages on the old domains to their equivalent pages on the new site. We were a bit tardy in using the "change of address" tool in Search Console, but that was done nearly 8 months ago now as well.

Two years after implementing all the redirects, the old domains still have significant authority (DAs of between 20-35) and some strong inbound links. I expected to see the DA of the legacy domains taper off during this period and (hopefully!) the DA of the new domain increase. The latter has happened, although not as much as I'd hoped, but the DA of the legacy domains is more or less as good as it ever was.

Google is still indexing a handful of links from the legacy sites, strangely even when it is picking up the redirects correctly. So, for example, if you do a site:legacydomain1.com query, it will give a list of results which includes pages where it shows the title and snippet of the page on newdomain.com, but the link is to the page on legacydomain1.com.

What has prompted me to finally try and resolve this is that the server which hosted the original 5 domains is now due to be decommissioned, which obviously means the 301 redirects for the original pages will no longer be served. I can set up web forwarding for each of the legacy domains at the hosting level, but to maintain the page-by-page redirects I'd have to actually host the websites somewhere. I'd like to know the best way forward, both in terms of the redirect issue and in terms of the indexing of the legacy domains. Many thanks, Dan
Intermediate & Advanced SEO | clarkovitch
-
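On the decommissioning point above: keeping page-by-page 301s alive only requires something answering HTTP for the legacy hostnames, not a full copy of the old sites. A minimal Apache sketch (legacydomain1.com and newdomain.com are the question's placeholders; the specific paths are illustrative):

```apache
<VirtualHost *:80>
    ServerName legacydomain1.com
    # Exact page-to-page mappings first (mod_alias applies rules in order)...
    Redirect permanent /some-old-page https://newdomain.com/equivalent-page
    # ...then a catch-all so nothing 404s once the old server is gone.
    RedirectMatch permanent ^/(.*)$ https://newdomain.com/
</VirtualHost>
```

A redirect-only vhost like this is cheap to host anywhere, which sidesteps the "I'd have to actually host the websites somewhere" problem.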
Domain Alias SEO
We have 5 domain aliases of our existing site.
All 5 are aliases of our main site, which means every alias serves exactly the same site and content.

Like main domain: www.mywebsite.com
Domain aliases: www.myproduct.com, www.myproduct2.com, www.myproduct3.com

If anybody opens www.myproduct.com, it will open the same website I have on the primary site. What can I do to rank all the websites without any penalty? Is there any way? (This is what a "domain alias" means in the hosting industry.) Thanks

Intermediate & Advanced SEO | unibiz
-
Why does a site have no domain authority?
A website was built and launched eight months ago, and their domain authority is 1. When a site has been live for a while and has such a low DA, what's causing it?
Intermediate & Advanced SEO | optimalwebinc
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%).

I would like to block Google from indexing the search pages via the meta noindex,follow tag because:

- Google Guidelines: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.” http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster notification: “Googlebot found an extremely high number of URLs on your site”, with links to our internal search results

I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I’m looking forward to your answer!

Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
-
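One detail worth making explicit for the noindex route discussed above: a crawler has to be able to fetch a page to see the tag, so the search URLs must not also be blocked in robots.txt while the deindexing happens. The tag itself is one line:

```html
<!-- In the <head> of each internal search results page -->
<meta name="robots" content="noindex, follow">
```

Once the pages have dropped out of the index, a robots.txt disallow can be added afterwards to stop the crawl-budget waste.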
Buying a banned domain
Hello all, I've found an exact-match keyword domain that I'm able to buy. The problem is that I'm under the impression it might have been banned by Google; currently it only shows AdSense without content. The site can't be found using the cache: or site: operators in Google, and the PR is 0. What are your experiences with buying a banned domain, and how can I double-check whether the domain is banned? This blog post suggests I should not buy it; any other opinions? Thanks. Hellemans
Intermediate & Advanced SEO | hellemans