Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japanese Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, without detecting the difference due to our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/) and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page but geotargeted to users in different locations (a rough sketch follows below this list). I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
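For illustration only, here's a rough sketch of what those hreflang annotations could look like for the two example URLs in the question, assuming both regional sections live on the same domain (the www.example.com host below is just a placeholder):
<!-- In the <head> of the Canadian page (/ca/en/...): -->
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<!-- The US page (/us/en/...) carries the same pair of tags, so the annotations are reciprocal. -->
Each page lists itself and its alternates, and every page it lists needs to point back for the annotation to be honoured.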
Setting nofollow just stops PageRank from flowing; bots can still follow those links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You can't rely on Google being barred from crawling anything on your site, no matter how well you code it; even marking those URLs nofollow would not stop the bot.
Another factor is whether all of your content is in English (as your URL structure suggests it is). Google does a terrible job of telling international content apart when it is all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across the sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to be even better off, you'd host that content on a server in that country.
Related Questions
-
Why Doesn't Google Use My Title Tag and Meta Description?
International SEO | Verz
Hi fellow Moz SEOs, I need your urgent help! We set optimised titles & meta descriptions for our client websites, and these titles are approved by our clients. A few days ago, the clients checked on Google and noticed that the title & meta description shown there were not the same, and they notified me about the issue right away. The title & meta description look fine when I check the source code, so why does Google display a different title & meta description? For example:
Title approved by client: Top Specialist Divorce & Family Lawyer - Yeo & Associates LLC
Google set our title: Yeo & Associates LLC: Top Specialist Divorce & Family Lawyer
Title approved by client: Filing For Divorce Online in Singapore | DivorceBureau®
Google set our title: DivorceBureau®: Filing For Divorce Online in Singapore
Title approved by client: Halal Buffet & Bento/Packet Meals Event Caterer Singapore | Foodtalks
Google set our title: Foodtalks - Halal Buffet & Bento/Packet Meals Event Caterer Singapore
Title approved by client: Child Care Centre in Singapore | Top Preschool | Carpe Diem
Google set our title: Carpe Diem: Child care Centre in Singapore | Top Preschool
Every day they ask me to update Google's title with their approved title, and they keep asking me these questions: Why did this happen? Why didn't Google use the recommended title? Is there any way to force our approved titles to show? Please help me find a solution ASAP. Thanks in advance!
-
How to best set up an international XML sitemap?
International SEO | DocdataCommerce
Hi everyone, I've been searching around a problem but haven't been able to find an answer. We would like to generate an XML sitemap for an international web shop. This shop has one domain for Dutch visitors (.nl) and another domain for visitors from other countries (Germany, France, Belgium, etc.) (.com). The website on the two domains looks the same, has the same template and the same pages, but as it is targeted at other countries, the pages are in different languages and the URLs are also in different languages (see the example below for a "bags" category).
Example Netherlands:
Dutch domain: www.client.nl
Example Dutch bags category page: www.client.nl/tassen
Example France:
International domain: www.client.com
Example French bags category page: www.client.com/sacs
When a visitor is on the Dutch domain (.nl), which shows the Dutch content, he can switch country to, for example, France in the country switcher and is then redirected to the other, international .com domain, and the other way round. Now we want to generate an XML sitemap for these two domains. As it is the same site on two domains, development wants to make one sitemap, where we take the Dutch version on the Dutch domain as the basis and specify the other language versions on the other domain in the alternates (see example below).
<loc>http://www.client.nl/tassen</loc>
<xhtml:link rel="alternate" hreflang="fr" href="http://www.client.com/sacs" />
Is this the best way to do this? Or would we need to make two sitemaps, as they are two domains?
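For reference only: a complete entry of the kind described above would also need the xhtml namespace declared on the urlset element, and Google's documentation recommends a self-referencing alternate for each URL. A minimal, hypothetical sketch using the URLs from this question:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.client.nl/tassen</loc>
    <!-- self-referencing alternate for the Dutch page -->
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.client.nl/tassen" />
    <!-- French alternate on the .com domain -->
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.client.com/sacs" />
  </url>
</urlset>
Note that the sitemap protocol generally expects <loc> URLs to live on the same host as the sitemap file itself, so in practice the .com domain would usually need its own sitemap with the mirrored entries rather than sharing a single file.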
-
Google.ie returning more and more UK-based results, why?
International SEO | Secrets
I have discovered the most infuriating issue with Google Search for Irish users, and it seems to have been getting worse over the last 2 years or so. This is not only frustrating as a business owner (in fact, it could bring a business to its knees), it is rage-inducing as a consumer.
Google knows the location I am searching from and I'm using google.ie, yet I still get just a small number of Irish websites, usually followed by eBay and Amazon results, then a never-ending list of websites based in the United Kingdom. Now, I know the one thing we all have in common is the use of the English language; however, what we don't have in common is shipping costs. To slightly increase the number of Irish-based companies I need to add the phrase 'Ireland' to my search (on google.ie, in Ireland), and even this makes only a small difference. In fact, oftentimes Google seems to throw in the odd American or Australian site just to really wind me up.
It's completely absurd that Google rarely returns results for .ie or Irish-based websites when searching in Ireland. Many UK companies don't ship to Ireland (including many of the eBay and Amazon results). This is killing Irish businesses that have the products and cheaper or free shipping; many who are working damn hard on their SEO are still being passed over for companies that have nothing to do with our economy. Why oh why is this happening?
-
Problems with the Google cache version of different domains.
International SEO | Humix
We have problems with the Google cache version of different domains. For the ".nl" domain we have a ".be" cache. Enter "cache:www.dmlights.nl" in your browser to see this result. The following points have already been addressed:
- The sitemap contains the hreflang tags
- The sitemap has been moved to www.dmlights.nl/sitemap.xml
- We checked the DNS configuration
- We changed the Content-Language response header to: Content-Language: nl-NL
- We removed the cache with Webmaster Tools
- We resolved server request errors
Can anyone provide a solution to fix this problem? Thanks, Pieter
-
What is the proper way to set up hreflang tags on my English and Spanish site?
International SEO | peteboyd
I have a full English website at http://www.example.com and a Spanish version of the website at http://spanish.example.com, but only about half of the English pages were translated and exist on the Spanish site. Should I just add a sitemap to both sites with hreflang tags that point to the correct version of each page? Is this a proper way to set it up? I was going to repeat the same process for all of the applicable URLs that exist on both versions of the website (English and Spanish). Is it okay to have hreflang="es", or do I need a country code attached as well? There are many Spanish-speaking countries and I don't know if I need to list them all out, for example hreflang="es-bo" (Bolivia), hreflang="es-cl" (Chile), hreflang="es-co" (Colombia), etc.
Sitemap example for the English website URL:
<url><loc>http://www.example.com/</loc></url>
Sitemap example for the Spanish website URL:
<url><loc>http://spanish.example.com/</loc></url>
Thanks in advance for your feedback and help!
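For illustration, a minimal sketch of the sitemap entry being described, using a language-only hreflang value and the example hosts from this question:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- English original -->
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <!-- Spanish translation, no country code: targets Spanish speakers everywhere -->
    <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  </url>
</urlset>
A bare hreflang="es" is valid; country-specific values such as es-cl are only needed when different Spanish-speaking markets get genuinely different pages.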
-
Subdomains or subfolders for language-specific sites?
International SEO | SOS_Children
We're launching an .org.hk site with English and Traditional Chinese variants. As the local population speaks both languages, we would prefer not to have separate domains and are deciding between subdomains and subfolders. We're aware of the reasons behind generally preferring folders, but many people, including moz.rainyclouds.online, suggest preferring subfolders to subdomains with the notable exception of language-specific sites. Does this mean subdomains should be preferred for language-specific sites, or just that they are okay? I can't find any rationale for this other than administrative simplification (e.g. it's easier to set up different analytics or hosting), which in our case is not an issue. Can anyone point me in the right direction?
-
Ranking issues for UK vs US spelling - advice please
International SEO | StevenHowe
Hi guys, I'm reaching out here for what may seem to be a very simple and obvious issue, but it's not something I can find a good answer for. We have a .com site hosted in Germany that serves our worldwide audience. The site is in English, but our business language is British (UK) English. This means that we rank very well for (e.g.) "optimisation software", but "optimization software" is nowhere to be found. The cause of this seems obvious to me: a robot reading those two phrases sees two distinct words. Nonetheless, having seen discussions of a similar nature around the use of plurals in keywords, it would seem to me that Google should have this sort of thing covered. Am I right or wrong here? If I'm wrong, then what are my options? I really don't want to have to make a copy of the entire site; apart from the additional effort involved in content upkeep, I see this path as fraught with duplicate content issues. Any help is very much appreciated, thanks.
-
IP Redirection vs. cloaking: no clear directives from Google
International SEO | H-FARM
Hi there, here is our situation: we need to force an IP redirection for our US users to www.domain.com, and at the same time we have different country-specific subfolders in their own languages, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which crawls from a US IP) from crawling our country-specific subfolders. I haven't found any clear directives from Google representatives on the matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand, in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help! Matteo