Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japan Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US content because it can't detect the region from our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international SEO that often, so the full list of methods isn't always top of mind!
-
Here are the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page geotargeted to users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here; a minimal sketch of the markup follows this list.
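For reference, here is what that markup could look like in the head of the US product page, reusing the example paths from the question. The www.example.com domain is a placeholder (the real domain isn't shown above), and the content-language meta tag is one common form of the language/country hint for Bing mentioned in the second bullet:
<!-- on the US page: a self-referencing alternate plus the Canadian alternate -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
<!-- language/country hint for Bing -->
<meta http-equiv="content-language" content="en-us" />
The Canadian page would carry the mirror-image set (both link elements, plus content="en-ca"), since hreflang annotations only count when the pages reference each other.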
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You cannot rely on Googlebot being barred from crawling anything on your site, no matter how well you code it. Even if you marked the URL with nofollow, it would not stop the bot.
Another factor is that all your content is in English (as your URL structure suggests). Google does a terrible job of separating international content when it is all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to be even better off, you'd host that content on a server in that country.
Related Questions
-
Multilang site: Auto redirect 301 or 302?
We need to establish whether a 301 or a 302 response code should be used for our auto redirects based on the Accept-Language header:
https://domain.com > 30x > https://domain.com/en
https://domain.com > 30x > https://domain.com/ru
https://domain.com > 30x > https://domain.com/de
The site architecture is set up with proper inline hreflang. We have read different opinions about this; Ahrefs says 302 is the correct one (https://ahrefs.com/blog/301-vs-302-redirects/):
302 redirect: "You want to redirect users to the right version of the site for them (based on location/language)."
You could argue that the root redirect is never permanent, as it varies based on the user's language settings (302). On the other hand, the language-specific redirects are permanent per language:
IF Accept-Language header = en: https://domain.com > 301 > https://domain.com/en
IF Accept-Language header = ru: https://domain.com > 301 > https://domain.com/ru
So each of these is 'permanent'. Which one is correct?
International SEO | fJ66doneOIdDpj
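For illustration, here is one way that root redirect could be wired up, sketched as an nginx fragment; nginx itself and the variable name are assumptions for the sketch (not necessarily the poster's stack), and it uses the 302 that the Ahrefs article argues for:
# Map the Accept-Language header to a language folder (this map block lives in the http {} context); default to en
map $http_accept_language $site_lang {
    default    en;
    ~*^ru      ru;
    ~*^de      de;
}
server {
    server_name domain.com;
    # Only the bare root is language-routed; 302 because the target varies per visitor
    location = / {
        return 302 /$site_lang/;
    }
}
-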
Hreflang tags and canonical tags - might be causing indexing and duplicate content issues
Hi, let's say I have a site located at https://www.example.com, and also have subdirectories set up for different languages. For example:
https://www.example.com/es_ES/
https://www.example.com/fr_FR/
https://www.example.com/it_IT/
My Spanish version currently has the following hreflang tags and canonical tag implemented:
My robots.txt file is blocking all of my language subdirectories. For example:
User-agent: *
Disallow: /es_ES/
Disallow: /fr_FR/
Disallow: /it_IT/
This setup doesn't seem right. I don't think I should be blocking the language-specific subdirectories via robots.txt. What are your thoughts? Does my hreflang tag and canonical tag implementation look correct to you? Should I be doing this differently? I would greatly appreciate your feedback and/or suggestions.
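For reference, a commonly recommended shape for the head of the Spanish version is a self-referencing canonical plus reciprocal hreflang annotations; this is purely illustrative (the poster's actual tags aren't shown above), and the x-default entry is optional:
<!-- the Spanish page canonicalises to itself -->
<link rel="canonical" href="https://www.example.com/es_ES/" />
<!-- every language version lists itself and each alternate -->
<link rel="alternate" hreflang="es-ES" href="https://www.example.com/es_ES/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.example.com/fr_FR/" />
<link rel="alternate" hreflang="it-IT" href="https://www.example.com/it_IT/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
Note that a URL disallowed in robots.txt can't be crawled, so search engines never get to read tags like these on the blocked pages.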
International SEO | Avid_Demand
-
Is there any reason to get a massive decrease in indexed pages?
Hi, I'm helping with SEO for a big e-commerce site in LatAm, and one thing we've experienced during the last months is that our search traffic has dropped and the number of indexed pages has decreased dramatically. The site had over 2 million indexed pages (which was way too much, since we believe that around 10k would be more than enough to hold the 6K+ SKUs), but now this number has decreased to less than 3K in less than 2 months. I've also noticed that most of the results in which the site still appears are .pdf or .doc files, not actual content on the website. I've checked the following:
Robots (there is no block, you can see that on the image as well)
Webmaster Tools
Penalties
Duplicated content
I don't know where else to look. Can anyone help? Thanks in advance!
International SEO | mat-relevance
-
Blocking domestic Googles in robots.txt
Hey, I want to block Google.co.uk from crawling a site but want Google.de to crawl it. I know how to configure robots.txt to block Google and other engines - is there a way to block only certain domestic crawlers? Any ideas? Thanks, B
International SEO | Bush_JSM
-
Are non-French companies allowed to own domains in France?
Hi, I was wondering if anyone knows whether the French government has changed its stance in recent years on the ownership of domains in their country. My understanding is that it can be pretty difficult to own a domain there if you do not reside there. In the past I have had people register domains using their passport as identification to prove their domicile in that country. We, like many others, have sites with .com/fr etc., and we do have one domain that is a .fr and seriously outperforms the .com version. Many thanks for any input on this question. David
*** UPDATE - Sorry, no need for a response. I've just been informed that businesses located in a Member State of the European Union (EU) are allowed to own .fr domains, a rule the French government needs to comply with. Best, David
International SEO | David-E-Carey
-
Google US vs Google UK
I could have posted this somewhere else, but I cannot find it. So, I have keywords that rank well in Google US and many that do well in Google UK too. I thought all of my keywords ranking well in the US would also rank well in the UK. I have figured out today that this is not the case. Why would I rank in the top 3 in the US and not even show up in the top 50 in the UK? It is very strange. Thanks for your help! I am not super new to SEO or web business; I have had a very good company that has been ranking well since 2004.
International SEO | journeybeyondtravel
-
What countries does Google crawl from? Is it only the US, or do they crawl from Europe and Asia, etc.?
Where does Google crawl the web from? Is it in the US only, or do they do it from a European base too? The reason for asking is for GeoIP redirection. For example, if a website is using GeoIP redirection to redirect all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
International SEO | Envoke-Marketing
-
Do non-English (localized) URLs help local SEO and user experience?
Hi everyone, This question is about URL best practice for multilingual websites. We have www.example.com in English and we are building an exact replica of the English site in German at www.example.de. On the German site, we are considering translating some portions of the URLs, for example the last folder and the file name, as seen below:
example.de/folder1-in-english/folder2-in-english/folder3-in-german/filename-in-german.html
Is this a good idea? Will this help both SEO and user experience, or will the mixed languages in the URL confuse users? Google guidelines say that this should be OK. Would love to get feedback from the SEOMOZ community! Thanks, Supriya.
International SEO | Amjath