How do I geo-target continents & avoid duplicate content?
-
Hi everyone,
We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
The hreflang link element and the GWT geotargeting option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices.
What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent.
Thanks!
-
Moz most definitely needs a "give a beer" feature! Thanks for the in-depth response. We'll also work on building "local" links as you suggest.
We've since changed the structure of the site to:
USA/Canada: www.site.com
Europe EN: www.site.com/en_gb/
Europe FR: www.site.com/fr_fr/
Canada FR: www.site.com/fr/
That way we can use hreflang and avoid duplicate content. In your experience, will Google serve www.site.com/fr_fr/ instead of www.site.com/fr/ to Belgium and Switzerland? Will the UK and Ireland see www.site.com or www.site.com/en_gb/?
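Something like the following is what we have in mind for the homepage (a sketch only, with assumed URLs; every version would carry the full set, including a self-referencing tag, and deeper pages would need their own equivalents):

```html
<!-- Homepage hreflang sketch for the structure above (assumed URLs).
     The same set appears on each listed version, itself included. -->
<link rel="alternate" hreflang="en-US" href="https://www.site.com/" />
<link rel="alternate" hreflang="en-CA" href="https://www.site.com/" />
<link rel="alternate" hreflang="en-GB" href="https://www.site.com/en_gb/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.site.com/fr_fr/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.site.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.site.com/" />
```

We're also wondering whether /fr_fr/ should carry a language-only hreflang="fr" tag as a catch-all for French speakers in countries we don't explicitly target.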
Thanks a lot for the answer!
-
Hi there,
As Marcus mentioned, at the moment geographical targeting is country-based, not continent-based, so you're correct: hreflang works for languages and/or countries, and the geotargeting option in Google Webmaster Tools (for when you're not using a ccTLD) is only for countries.
So there are really two alternatives: language targeting (although each language differs from country to country) or country targeting (which is the ideal way to connect with each audience, localizing the content as much as possible and leveraging all types of local characteristics).
With language targeting you will avoid content duplication issues (since there will be only one English or one Spanish version); nonetheless, as I mentioned, it can be tricky: the Spanish spoken in Spain is different from that spoken in Mexico and in every other Latin American country. Seasonality and currency differ, and so do people's culture, tastes, and local characteristics. So language-based versions can serve as a "generic" approach to these audiences without really targeting them as specific markets.
On the other hand, with country targeting, if you have two English versions you can point each one to the appropriate country with hreflang and ccTLDs (or, if you use a generic domain, with the geotargeting option in Google Webmaster Tools), and then do local link building focused on each country to build each version's popularity there. This is the recommended approach. If resource constraints prevent you from enabling many countries, start with the most important ones.
Moreover, regarding targeting Europe as a whole: even if you enable a domain of the type www.yourbrand.eu for Europe, it is likely to be treated as a generic domain, as Google specifies here. Inside this domain, what you would really have (as I understand from your description) are language versions targeting Europe in general:
- www.yourbrand.eu/ in English (UK, Ireland, etc.)
- www.yourbrand.eu/fr/ in French (France, Belgium, Switzerland)
- www.yourbrand.eu/es/ in Spanish
- www.yourbrand.eu/de/ in German (Germany, Switzerland, Austria)
The issue comes when you have the same content in English for your American audience on www.yourbrand.com, or in Spanish (for Spanish speakers in the US) on www.yourbrand.com/es/, which could cause content duplication issues with www.yourbrand.eu/ and www.yourbrand.eu/es/.
If this is the scenario, the best you can do is differentiate the content, adding signals that one version targets the US audience and the other targets English speakers in Europe. But again, there's no direct support or straightforward solution for this scenario: it goes beyond what Google supports, and it isn't "natural" or the best alternative from an "international audience targeting" perspective.
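To illustrate, here is a minimal cross-domain hreflang sketch for the hypothetical yourbrand domains (hreflang is allowed to reference URLs on different domains, as long as every version links back to the others):

```html
<!-- Sketch only, hypothetical yourbrand domains. The region-specific tags
     (en-US, es-US) explicitly target the US, while the language-only tags
     (en, fr, es, de) act as catch-alls for those languages everywhere else. -->
<link rel="alternate" hreflang="en-US" href="https://www.yourbrand.com/" />
<link rel="alternate" hreflang="es-US" href="https://www.yourbrand.com/es/" />
<link rel="alternate" hreflang="en" href="https://www.yourbrand.eu/" />
<link rel="alternate" hreflang="fr" href="https://www.yourbrand.eu/fr/" />
<link rel="alternate" hreflang="es" href="https://www.yourbrand.eu/es/" />
<link rel="alternate" hreflang="de" href="https://www.yourbrand.eu/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.yourbrand.com/" />
```

This doesn't replace differentiating the content, but it does tell Google which audience each version is meant for.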
If you have any other information that you think would be relevant to give you additional recommendations please let me know.
I hope this helps!
-
Hey Axial
As far as I am aware there is no option to target regions like Europe; to do this in Webmaster Tools you will need to create a folder for each country you are looking to target within Europe.
Obviously, there are lots of different languages across Europe, so in an ideal world you will want a version geotargeted to each country in the correct language. If you want to be really thorough, you will want a version in English and one in the relevant country's language.
So, for Spain as an example, targeting Spanish and English, the hreflang values would be set as "es-ES" and "en-ES" (language first, then country: Spanish for Spain and English for Spain). Directories could be matched as /es-es/ and /en-es/.
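A minimal sketch of what those annotations might look like on the Spain pages (hypothetical example.com domain):

```html
<!-- Sketch only: hypothetical example.com domain. hreflang values pair an
     ISO 639-1 language code with an ISO 3166-1 country code, language first. -->
<link rel="alternate" hreflang="es-ES" href="https://www.example.com/es-es/" />
<link rel="alternate" hreflang="en-ES" href="https://www.example.com/en-es/" />
```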
Not an answer as such, but as far as I am aware a single folder cannot be geotargeted to Europe in Webmaster Tools, so you are going to have to work with what's available.
Hope that helps
Marcus