Has anyone seen negative SEO effects from using the Google Translate API?
-
We have a site currently in development that uses the Google Translate API. I am having a massive issue getting Screaming Frog to crawl it, and all of our non-native English-speaking employees who have read through the translated copy in their native languages agree that it reads at a fifth-grade level at best. My questions to the community are: has anyone implemented this API on a site, and has it a) helped with gaining traffic from other languages/countries, and b) hurt their site from an SEO standpoint?
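For context, a minimal sketch of what a server-side call to the Google Cloud Translation API (the Basic v2 REST endpoint) typically looks like; the environment variable, sample text, and target language below are placeholder assumptions, not details from the original site.

import os
import requests

# Basic (v2) Cloud Translation REST endpoint.
TRANSLATE_URL = "https://translation.googleapis.com/language/translate/v2"

def translate_copy(text, target_lang, source_lang="en"):
    """Send one block of page copy to the Translation API and return the machine-translated string."""
    params = {
        "key": os.environ["GOOGLE_API_KEY"],  # assumed to hold a valid API key
        "q": text,
        "source": source_lang,
        "target": target_lang,
        "format": "text",
    }
    resp = requests.get(TRANSLATE_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["data"]["translations"][0]["translatedText"]

if __name__ == "__main__":
    print(translate_copy("Welcome to our site.", "fr"))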
-
Hi Bernadette, I completely agree that the translation should be done by a human. You are correct that it wasn't Google Translate interfering with the crawl, but it was a great argument for getting it removed.

Where in Screaming Frog are you able to slow the crawl? I have dug around the program and can't find the option.
-
Overall, if you are going to translate your website, it really should be translated by a human rather than by an API. There are certain ways things should be translated and written that an API simply cannot handle properly. It's more of a user experience and readability issue than anything else.
It sounds as if the Screaming Frog issue isn't related to the translation at all; it is a crawling issue. You may want to see if you can crawl much more slowly, which is a setting in Screaming Frog.
Related Questions
-
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: The games are not allowed in the USA, but they are allowed in Canada.
Present Situation: Presently, when a user from the USA visits the site they are directed to a restricted-location page with the following message: "RESTRICTED LOCATION. Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / What we want to achieve: The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access, but they should get a restricted-access message.
However, we still want Google to be able to access, crawl, and index our pages. Can you suggest how we do this without being penalised for cloaking, etc.? Would this approach be OK? (Please see below.) We would continue doing what we do now, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we would just black out the page and show them a floating message as if it were a modal window.
Googlebot, meanwhile, would be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it's a restricted, paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks
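A rough, illustrative sketch of the approach described above (not taken from the thread): the full page content is rendered for every visitor, including Googlebot; a flag merely toggles the client-side restricted-location overlay; and paywall-style structured data (isAccessibleForFree) is embedded so Google understands why some visitors see less than the crawler does. The geo-lookup header, route, and CSS class names are assumptions.

import json
from flask import Flask, render_template_string, request

app = Flask(__name__)

RESTRICTED_COUNTRIES = {"US"}  # illustrative; the real list comes from licensing

PAGE_TEMPLATE = """
<!doctype html>
<html>
  <head>
    <script type="application/ld+json">{{ structured_data | safe }}</script>
  </head>
  <body>
    {% if restricted %}
      <div class="restricted-overlay">
        Due to licensing restrictions, we can't currently offer our services in your location.
      </div>
    {% endif %}
    <main class="game-content">Full game content for {{ slug }} goes here.</main>
  </body>
</html>
"""

def country_from_request(req):
    # Hypothetical geo lookup: many CDNs expose a country header
    # (e.g. CF-IPCountry); a GeoIP database lookup would also work.
    return req.headers.get("X-Country-Code", "CA").upper()

def paywall_jsonld(page_url, title):
    # Structured data marking the main content as not freely accessible
    # to everyone, following the schema.org isAccessibleForFree pattern.
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "WebPage",
        "url": page_url,
        "name": title,
        "isAccessibleForFree": False,
        "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": False,
            "cssSelector": ".game-content",
        },
    })

@app.route("/en-ca/games/<slug>")
def game_page(slug):
    # The same HTML (including the full content) is served to everyone;
    # only the overlay flag differs, so there is no crawler-specific version.
    restricted = country_from_request(request) in RESTRICTED_COUNTRIES
    return render_template_string(
        PAGE_TEMPLATE,
        slug=slug,
        restricted=restricted,
        structured_data=paywall_jsonld(request.url, slug),
    )
-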
Problem getting multilingual posts indexed on Google
Last year, in June, I decided to make my site multilingual. The domain is https://www.dailyblogprofits.com/. The main language is English, and I added Portuguese and a few posts in Spanish. What has happened since then? I started losing traffic from Google, and posts in Portuguese are not being indexed. I use the WPML plugin to make the site multilingual, and I had Yoast installed. This week I uninstalled Yoast, and when I search Google for "site:dailyblogprofits.com/pt-br" I now see Google indexing images, but still not the missing posts. I have around 145 posts in Portuguese, but Search Console shows only 57 hreflang tags. Any idea what the problem is? I'm willing to pay an SEO expert to resolve this problem for me.
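As a first diagnostic step, a short sketch like the one below could show how many Portuguese URLs the site actually exposes to crawlers; the sitemap path is an assumption (WordPress/WPML setups usually publish one), not something confirmed in the question.

import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Recursively collect <loc> entries from a sitemap or sitemap index."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    if root.tag.endswith("sitemapindex"):
        urls = []
        for loc in root.findall(".//sm:sitemap/sm:loc", SITEMAP_NS):
            urls.extend(sitemap_urls(loc.text.strip()))
        return urls
    return [loc.text.strip() for loc in root.findall(".//sm:url/sm:loc", SITEMAP_NS)]

if __name__ == "__main__":
    # Sitemap location is a guess; adjust to whatever the site actually exposes.
    all_urls = sitemap_urls("https://www.dailyblogprofits.com/sitemap.xml")
    pt_urls = [u for u in all_urls if "/pt-br/" in u]
    print(f"{len(pt_urls)} of {len(all_urls)} sitemap URLs are under /pt-br/")
-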
International SEO
Okay, so I have read through the following link in respect to international SEO (https://moz.rainyclouds.online/learn/seo/international-seo), and I believe that the way forward is a ccTLD. My thought was to have .com, .co.uk and .eu. Currently my site is .com, but it receives most of its traffic from UK sources. I'm concerned that when I switch over to ccTLDs, the .co.uk in particular, my UK traffic could dry up. Switching from .com to .co.uk and then using the .com to target the US market makes sense, but I would like to know others' opinions on the potential dangers of doing this. Also, are ccTLDs kept on the same hosting, or would they require individual hosting? The link doesn't cover this question.
-
Google does not index the UK version of our site, and serves the US version instead. Do I need to remove the hreflang for US?
Webmaster Tools indicates that only 25% of pages on our UK domain with GBP prices are indexed.
We have another US domain with identical content but USD prices, which is indexed fine. When I search Google for site:mydomain I see that most of my pages seem to appear, but in the rich snippets Google shows USD prices instead of the GBP prices we publish on the page (the USD price is not published on the page, and I tested with a US proxy and the US price is nowhere in the source code). When I click the result in Google to see the cached version of the page, Google shows me the US product page as the cached version of the UK product page. I use the following hreflang code:
<link rel="alternate" hreflang="en-US" href="https://www.domain.com/product" />
<link rel="alternate" hreflang="en-GB" href="https://www.domain.co.uk/product" />
The canonical of the UK page correctly refers to the UK page. Any ideas? Do I need to remove the hreflang for en-US to get the UK domain properly indexed in Google?
International SEO Subfolders / user journey etc
Hi, according to all the resources I can find on Moz and elsewhere regarding international SEO, say in the context of having duplicate versions of a US and UK site, it's best to have subfolders, i.e. domain.com/en-gb/ and domain.com/en-us/. However, when it comes to the user journey and promoting the web address, it seems a bit weird to say "visit us at domain.com/en-us/"! And what happens if someone just enters domain.com from the US or UK? My client wants to use an IP sniffer, but I've read that's bad practice and that we should employ the country/language-code style above instead. I'm confused about both the user journey and the experience in the case of multiple subfolders. Any advice much appreciated. Cheers, Dan
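For reference, the pattern usually recommended instead of IP-based redirects is to give every variant, plus the bare domain.com URL marked x-default, a full set of hreflang alternates, and to surface any country suggestion as a dismissible banner rather than a forced redirect. A tiny sketch of generating those tags, using the subfolder layout mentioned above (the exact URLs are illustrative):

ALTERNATES = {
    "en-us": "https://domain.com/en-us/",
    "en-gb": "https://domain.com/en-gb/",
    "x-default": "https://domain.com/",  # fallback for visitors matching no variant
}

def hreflang_links(alternates):
    """Render the <link> tags to place in the <head> of every variant."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{href}" />'
        for code, href in alternates.items()
    )

if __name__ == "__main__":
    print(hreflang_links(ALTERNATES))
-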
Local SEO in Canada
I am trying to do some local optimization for some clients in Canada, and it got me thinking: are there different best practices, and different sites I should use, when working in Canada?
-
How to rank in Google for a specific country?
Hi, I have a relatively good ranking for a specific keyword on google.com (English queries, hl=en), but searching for the same keyword on google.com.br (Brazilian Portuguese, hl=pt-BR), my rank for that keyword is far worse. The question is: do I need to do something specific to rank on google.com.br (hl=pt-BR)? I'm doing the regular link building: creating some blogs, blogging for 10 days before dropping my links, and creating link wheels the same way. The blogs I create to build links are written in Brazilian Portuguese, and the blog that I'm trying to rank higher is also written in Brazilian Portuguese. Sorry for the English, it's not my native language. Thanks
-
SEO for Subdomains for different languages .com/fr, .com/es
Hi All, I was wondering how best to approach optimisation of a site that exists on a single .com domain but has different subfolders for different languages. The site is a .com and it has subfolders for French, Spanish, Russian and English. The business is situated in France and the vast majority of clients are French and English speakers. I've read that it's possible to geo-target these subfolders using Webmaster Tools; however, I believe this is an inferior method of optimisation compared to having ccTLDs. Has anyone had experience of this who could provide any advice? As they won't be rebuilding the site for another year or so, I wondered if there were any quick wins. My second question is to do with how best to set these campaigns up within SEO Moz: would it be better to track at a subdomain or subfolder level (for the different languages)? If someone could advise, I would greatly appreciate it! Thanks, vantresca