Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Indexing process for a multi-language website for different countries
-
We are in charge of a website with 7 languages for 16 countries. There are only slight content differences between countries (google.de | google.co.uk). The website is set up with the correct language and country annotations, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE. All unwanted annotations are blocked by robots.txt, and the hreflang alternate tags are also set.
The objective is to make the website visible in the local search engines. We have therefore submitted an overview sitemap (a sitemap index) that links to a sitemap per country. The sitemaps have been submitted for quite a while now, but Google has indexed only 10% of the content.
We are looking for suggestions to speed up the indexing process.
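For reference, the hreflang alternates in the per-country sitemaps follow this general pattern (a simplified, illustrative sketch with placeholder URLs, not our actual files):

```python
# Illustrative sketch only: builds one per-country sitemap <url> entry that
# lists every locale as an hreflang alternate. Domain, paths, and locales
# are placeholders for the real site structure.
LOCALES = [("de", "DE"), ("de", "CH"), ("en", "GB"), ("en", "IE")]
BASE = "https://www.example.com"

def page_url(lang: str, country: str, page: str) -> str:
    return f"{BASE}/{lang}/{country}/{page}"

def sitemap_entry(lang: str, country: str, page: str) -> str:
    """One <url> element: its own <loc> plus an hreflang alternate per locale."""
    alternates = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{l}-{c}" href="{page_url(l, c, page)}"/>'
        for l, c in LOCALES
    )
    return f"  <url>\n    <loc>{page_url(lang, country, page)}</loc>\n{alternates}\n  </url>"

# Entry for the German sitemap; the enclosing <urlset> must also declare
# xmlns:xhtml="http://www.w3.org/1999/xhtml" for the xhtml:link elements.
print(sitemap_entry("de", "DE", "products/"))
```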
-
Thank you.
-
Just a couple thoughts off the top of my head:
1. Double-check all technical international SEO issues and ensure that the robots.txt file is not mistakenly blocking any desired pages.
2. Make sure that you have a separate Google Webmaster Tools setup for each root domain / subdomain / subdirectory (however you have set up the international sites) and have submitted an individual XML sitemap for each one. Also make sure that the geographical targeting in each GWT setup is set to the desired country.
3. If Google is only indexing a small percentage of a site's pages, it is often because Google thinks (accurately or not) that the site has duplicate content. "Duplicate content" is not a penalty per se; it is when Google, for example, sees two pages that are very similar and indexes only one of them so as not to return redundant pages in search results.
Example: Say that you have an e-commerce product that has ten variations (such as color). The content of each variation page would often be very similar except for the listed color. In that case, you would want to use a rel=canonical tag on all variation pages that points to the main page for that product, as sketched below. (In other words, you don't want all of those pages to be indexed, and Google often would not index them anyway.)
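A rough sketch of what that canonical setup looks like, using a hypothetical product with made-up URLs:

```python
# Hypothetical example: every colour-variation URL declares the main product
# page as its canonical, so only one version is a candidate for indexing.
MAIN_PRODUCT = "https://www.example.com/products/widget"
VARIATIONS = [f"{MAIN_PRODUCT}?color={c}" for c in ("red", "blue", "green")]

def canonical_tag(target: str) -> str:
    """The <link> element each variation page would carry in its <head>."""
    return f'<link rel="canonical" href="{target}"/>'

for page in VARIATIONS:
    print(f"{page}  ->  {canonical_tag(MAIN_PRODUCT)}")
```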
Most likely, I would use a tool such as Moz or any other SEO software to crawl the site and see if any duplicate-content issues are present. Once these are addressed (if the problem exists), then Google will likely crawl and index your sites more thoroughly and accurately.
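For a quick sanity check before (or alongside) a full crawl, you can compare the visible text of two suspect URLs yourself; here is a rough standard-library sketch, assuming the pages are publicly fetchable:

```python
# Rough near-duplicate check between two URLs using only the standard library.
# A dedicated crawler is far more thorough; this just flags obviously similar pages.
import re
import urllib.request
from difflib import SequenceMatcher

def visible_text(url: str) -> str:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    return re.sub(r"(?s)<[^>]+>", " ", html)                    # strip remaining tags

def similarity(url_a: str, url_b: str) -> float:
    """0.0 = completely different, 1.0 = identical visible text."""
    return SequenceMatcher(None, visible_text(url_a), visible_text(url_b)).ratio()

# Placeholder URLs; a ratio close to 1.0 suggests Google may treat the pages as duplicates.
# print(similarity("https://www.example.com/de/DE/", "https://www.example.com/de/CH/"))
```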
I hope this helps -- good luck!
Related Questions
-
Difference between Hummingbird and RankBrain
From my understanding, Hummingbird means that Google is able to parse sentences and link entities to understand the meaning of content better than with keywords alone, while RankBrain is about user intent: Google understands that there are various ways to express the same thing. Is my understanding correct? Thank you,
Intermediate & Advanced SEO | | seoanalytics0 -
Google not Indexing images on CDN.
My URL is: http://bit.ly/1H2TArH We have set up a CDN on our own domain: http://bit.ly/292GkZC We have an image sitemap: http://bit.ly/29ca5s3 The image sitemap uses the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: http://bit.ly/29eNSXv. We used to have a disallow to /thumb/, which had a 301 redirect to our CDN, but we removed both the disallow in the robots.txt and the 301. Yet, GWT still reports that none of our images on the CDN are indexed.
The above screenshot is from the GWT of our main domain. The GWT of the CDN subdomain just shows 0. We did not submit a sitemap to the verified subdomain property because we already have a sitemap submitted to the property on the main domain name. A search for images indexed from our CDN turns up nothing: http://bit.ly/293ZbC1 In the GWT of the CDN subdomain, I have been getting crawl errors, mainly 500-level errors, though not that many in comparison to the number of images and the traffic that we get on our website. Google is crawling, but it seems like it just doesn't index the pictures!?
Can anyone help? I have followed all the information I was able to find on the web, yet our images on the CDN still can't seem to get indexed.
Intermediate & Advanced SEO | | alphonseha0 -
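For anyone hitting the same issue, an image-sitemap entry that references CDN-hosted images generally takes this shape (an illustrative sketch with placeholder URLs, not the poster's actual setup):

```python
# Illustrative sketch: an image-sitemap <url> entry where the page lives on the
# main domain while the image files are served from a CDN subdomain. The
# enclosing <urlset> must declare
# xmlns:image="http://www.google.com/schemas/sitemap-image/1.1". URLs are placeholders.
PAGE = "https://www.example.com/photos/sunset"
CDN_IMAGES = [
    "https://cdn.example.com/photos/sunset-large.jpg",
    "https://cdn.example.com/photos/sunset-thumb.jpg",
]

images = "\n".join(
    f"    <image:image>\n      <image:loc>{img}</image:loc>\n    </image:image>"
    for img in CDN_IMAGES
)
print(f"  <url>\n    <loc>{PAGE}</loc>\n{images}\n  </url>")
```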
How to de-index old URLs after redesigning the website?
Thank you for reading. After redesigning my website (5 months ago), I still get tons of 404 pages in my crawl reports (Moz, Search Console), which all seem to be URLs from my previous website (same root domain). It would be nonsense to 301 redirect them, as there are too many URLs (or would it be nonsense?). What is the best way to deal with this issue?
Intermediate & Advanced SEO | | Chemometec0 -
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company; they have a main site on the top-level domain (TLD) and 400+ agency subdomains! company.com agency1.company.com agency2.company.com... I recently found that the web development team have a demo domain per site, which is found on a subdomain of the original domain - mirroring the site. The problem is that they have all been found and indexed by Google: demo.company.com demo.agency1.company.com demo.agency2.company.com... Obviously this is a problem, as it is duplicate content and so on, so my question is... what is the best way to remove the demo domains / subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution! Within this file I had intended to disallow all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain / subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | | iam-sold0 -
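With hundreds of demo subdomains, it is worth scripting a check that the noindex signal is actually being served once it is rolled out; a rough sketch with placeholder hostnames:

```python
# Rough sketch: confirm each demo subdomain serves a noindex signal, either as an
# X-Robots-Tag response header or a meta robots tag. Hostnames are placeholders.
import urllib.request

DEMO_HOSTS = ["demo.company.com", "demo.agency1.company.com", "demo.agency2.company.com"]

def has_noindex(host: str) -> bool:
    req = urllib.request.Request(f"https://{host}/", headers={"User-Agent": "noindex-check"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        body = resp.read(200_000).decode("utf-8", "ignore").lower()
    return "noindex" in header.lower() or ("robots" in body and "noindex" in body)

for host in DEMO_HOSTS:
    print(host, "OK (noindex found)" if has_noindex(host) else "MISSING noindex")
```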
301s being indexed
A client website was moved to a new domain about six months ago. At the time of the move, 301 redirects were set up from the pages on the old domain to point to the same pages on the new domain. New pages were set up on the old domain for a different purpose. Now, almost six months later, when I do a query in Google on the old domain like site:example.com, 80% of the pages returned are 301 redirects to the new domain. I would have expected this to go away by now. I tried removing these URLs in Webmaster Tools, but the removal requests expire and the URLs come back. Is this something we should be concerned about?
Intermediate & Advanced SEO | | IrvCo_Interactive0 -
How important is the optional <priority> tag in an XML sitemap of your website? Can it help search engines understand the hierarchy of a website?
Can the <priority> tag be used to tell search engines the hierarchy of a site, or should it be used to let search engines know in which order of priority we want pages to be indexed?
Intermediate & Advanced SEO | | mycity4kids0 -
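For reference, the element in question is just an optional per-URL value between 0.0 and 1.0 in the sitemap; a minimal illustrative entry with a placeholder URL:

```python
# Minimal illustrative sitemap <url> entry showing the optional <priority> value
# (a number from 0.0 to 1.0). The URL and the value chosen are placeholders.
print(
    "  <url>\n"
    "    <loc>https://www.example.com/category/page</loc>\n"
    "    <priority>0.8</priority>\n"
    "  </url>"
)
```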
DNS or 301 Website Redirect
We are running a marketplace site, so we have thousands of vendors selling their products on our site. Each vendor has a profile page, and we are soon to launch a premium storefront that is white label. Many of these vendors will want to point a custom URL to their premium storefront (which is a subdomain of the marketplace), and we are trying to understand how we should instruct them to point their URL in a way that will pass the SEO value to the main marketplace site. We also want to understand what will show up in the address bar: will it be their URL or our subdomain? And will any of the marketplace's SEO value boost the local listing status of their URL?
Intermediate & Advanced SEO | | bloomnation0 -
Splitting one Website into 2 Different New Websites with 301 redirects, help?
Here's the deal. My website stbands.com does fairly well. The only issue is that it is facing a long-term branding crisis. It sells custom products and sporting goods. We decided that we want to make a sporting goods website for the retail stuff and then a custom site focusing only on the custom stuff. One website transformed and broken into two new ones, with two new brand names. The way we are thinking about doing this is with a lot of 301 redirects, but what do we do with the homepage (stbands.com), and what is the best practice to make sure we don't lose traffic to the categories, etc.? Which new website do we 301 the homepage to? It's rough, because for some keywords we rank 3 or 4 times on the first page. Scary times, but something must be done for the long term. Any advice is greatly appreciated. Thank you in advance. We are set for a busy next few months 🙂
Intermediate & Advanced SEO | | Hyrule0