Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
-
Hey Moz,
New client has a site that uses:
subdomains ("third-level" stuff like location.business.com) and;
"fourth-level" subdomains (location.parent.business.com)
Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when crawling business.com, except in the External tab. But maybe I'm just configuring the crawls incorrectly.
These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
If you check out Rand's Intro to SEO SlideShare (http://www.slideshare.net/randfish/introduction-to-seo-5003433), slides 46 and 47 talk about URL structure and, specifically, sub-domains.
As Rob said, you do want to use sub-folder structures and avoid sub-domains. Hopefully you are old enough to remember when websites like lycos.com were big and people could make their own websites. These were all hosted on subdomains like moz.tripod.lycos.com, and because of this structure search engines needed to treat subdomains as separate websites. For this reason they are graded separately, they change the flow of link juice, and they can easily count as duplicate content.
Sub-domains are best utilized for information that is distinct enough. Like in the Moz example, Rand's personal blog could theoretically sit at rand.moz.rainyclouds.online as it's a separate theme with different content, etc.; it would just lose out on the flow of value.
Once again, Rob is right about using 301 redirects to move your subdomains into folders.
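To make that concrete, here is a minimal, purely illustrative sketch of what such a redirect could look like in an Apache .htaccess file (the hostnames and folder are placeholders, and it assumes the subdomain resolves to the same server with mod_rewrite enabled):

```apache
# Hypothetical sketch: 301-redirect an entire third-level subdomain
# (location.business.com) into a folder on the main site
# (www.business.com/location/). Assumes mod_rewrite is on and the
# subdomain points at this server; adjust names to your actual setup.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^location\.business\.com$ [NC]
RewriteRule ^(.*)$ https://www.business.com/location/$1 [R=301,L]
```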
Now, moving on to the more specific nature of your question ("Are fourth-level sub-domains any worse than third-level sub-domains?"), I am going to suggest that when asking such a question you've already lost a big chunk of the SEO/inbound marketing battle.
The question you are framing is "I know it isn't good, but is it any worse?" Well, even if it's not any worse, you already know that it's not great, and you should be taking structural steps to build on a site's accessibility, user functionality and its SEO. If you find yourself asking "Is X any worse?", "How bad is Y?" or "Can I get away with Z?", then you should immediately stop pursuing that idea and try to find a different method.
In this case that method is sub-folders and a 301 migration, but remember: the framing of your questions and your overall directional strategy need to change to really drive home your campaigns!
-
HAHA. Great. Thanks for the props. Going 4th and 5th level deep with sub-domains can also impede the user experience when someone wants to reach a page directly (typing it manually is a pain!).

Thanks anyways, glad I could be of some help.
-
Again - thanks a lot. I totally agree. Next client meeting I'll stress that not only do I feel strongly about the subfolder issue, but the good people at SimplifySEO feel the same. :) And they know their ish. Or something.
-
Stay away as much as possible from 4th-, 5th- and 6th-level sub-domains, although I have never seen it go beyond 5. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, thus making link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops for each section of the domain that extends - hence the reason sub-folders are the way to go
(as you already know)... Good luck with the client and site. Sounds like a tough call. All the best, and I hope it works out.

-
Hey Rob,
Thanks a lot for this. This is great advice and really well-written. And you're preaching to the choir. I also prefer subfolders, but it's just not in the cards for this client for the time being. As it stands, we're stuck with subdomains.
Any other thoughts re: fourth-level vs. third-level subdomains, folks?
-
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain; then different strategies can be put into place. As I don't know if that's the route you need to take, I am going to give you an alternate option. :)
1. You could always use sub-folders, which, in a nutshell, would allow you to build links to the domain on many fronts and have them all count.
NOTE: any links built to sub-domains don't flow link 'juice' into the rest of the site. Those links, built for whatever reason, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/), and 301 and rel-canonical all the sub-domain pages and structure to the new locations. That way, all the link juice, value, etc. already established is kept intact, and you simply redirect all that value, trust and back-links to pages within the domain.
This, to me, is the best option to relocate the content, improve the domain structure using sub-folders instead of sub-domains, and maintain the back-link profile already built (or existing) on the site/domain URL.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
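Since this thread is specifically about fourth-level addresses, here is a hedged sketch of how the same 301 approach would extend to those, again with placeholder hostnames and folder names; the rel-canonical tags would be set separately in the HTML of the destination pages:

```apache
# Hypothetical sketch extending the subdomain-to-subfolder 301s to a
# fourth-level address: location.parent.business.com/* is mapped to
# www.business.com/parent/location/*. Placeholder names throughout.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^location\.parent\.business\.com$ [NC]
RewriteRule ^(.*)$ https://www.business.com/parent/location/$1 [R=301,L]
```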

Cheers!