Sitelinks Issue - Different Languages
-
Hey folks,
We run different ccTLDs for revolveclothing.com (revolveclothing.es, revolveclothing.com.br, etc.), and they all have their own WMT/Google Search Console properties with their own hreflang tags.
The problem is this.
https://www.google.fr/#q=revolve+clothing
When you look at the sitelinks, you'll see that one of them (sales page) happens to be in Portuguese on the French site. Can anyone investigate and see why?
-
Dirk's answer points to some potential causes.
That said, when I click on your SERP link, I see different sitelinks (just two):
- the first >>> Robes (French for "Dresses")
- the second >>> Вся распродажа (Russian for "All Sale").
As Dirk pointed out, your site has detected my IP (most likely, though it could be user agent), and when I click on the second sitelink I see this url: http://www.revolveclothing.es/r/Brands.jsp?aliasURL=sale/all-sale-items/br/54cc7b&&n=s&s=d&c=All+Sale+Items.
The biggest issue with IP redirection is that it creates problems for both SEO and usability:
- SEO, because Googlebot (and other bots) will mostly be redirected to the USA version due to their IPs, even though Google also crawls sites from datacenters in other countries (but much less);
- Users, because you make it impossible, for instance, for a Spanish user to see the Spanish site whenever they are not in Spain. And that really sucks and pisses off users.
There's a solution (sketched in code after this list):
- make the IP redirection happen only the first time someone clicks a link to your site, and only if that link does not correspond to the country version of where the user or bot is clicking from;
- present links to the other country versions of your site, so that:
  - bots will follow those links and discover those versions (without being redirected again);
  - users are free to go to the version of your site they really need (without being redirected again when coming from those country-selector links).
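Here is a rough sketch of that first-visit-only redirect as Express middleware in TypeScript. Everything in it (the domain map, the country_choice cookie, the ?noredirect=1 country-selector flag, and the GeoIP stub) is an illustrative assumption, not how Revolve actually works:

```typescript
// Hypothetical sketch of "redirect only once, never for bots or selector links".
// The domain map, cookie name, query flag and GeoIP stub are assumptions.
import express, { NextFunction, Request, Response } from "express";

const COUNTRY_DOMAINS: Record<string, string> = {
  US: "https://www.revolveclothing.com",
  ES: "https://www.revolveclothing.es",
  BR: "https://www.revolveclothing.com.br",
};

// Placeholder: in practice this would query a GeoIP database (e.g. MaxMind).
function lookupCountry(ip: string): string | undefined {
  return undefined;
}

function isBot(userAgent = ""): boolean {
  return /googlebot|bingbot|yandexbot|baiduspider/i.test(userAgent);
}

const app = express();

app.use((req: Request, res: Response, next: NextFunction) => {
  const alreadyChosen = req.headers.cookie?.includes("country_choice=");
  const fromCountrySelector = req.query.noredirect === "1"; // country-selector links carry this flag
  const country = lookupCountry(req.ip ?? "");
  const target = country ? COUNTRY_DOMAINS[country] : undefined;
  const alreadyOnTarget = target ? new URL(target).hostname === req.hostname : true;

  // Redirect only humans, only on their first visit, never when they arrive
  // from a country-selector link, and only if they landed on the "wrong" domain.
  if (
    !isBot(req.headers["user-agent"]) &&
    !alreadyChosen &&
    !fromCountrySelector &&
    target &&
    !alreadyOnTarget
  ) {
    res.cookie("country_choice", country!, { maxAge: 30 * 24 * 3600 * 1000 });
    return res.redirect(302, target + req.originalUrl);
  }
  next();
});

app.listen(3000);
```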
That said, it would be better to use a system like the one Amazon uses, which consists of not forcing a redirection based on IP, but detecting the country and showing an on-screen alert, something like: "We see that you are visiting us from [Country X]. Maybe you would prefer visiting [url to user's country site]".
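A minimal client-side sketch of that detect-and-suggest pattern could look like this; the /api/geo endpoint, the domain map, and the banner markup are hypothetical:

```typescript
// Hypothetical detect-and-suggest banner instead of a forced geo-redirect.
// The /api/geo endpoint and the domain map are illustrative assumptions.
const SITE_FOR_COUNTRY: Record<string, { url: string; label: string }> = {
  ES: { url: "https://www.revolveclothing.es", label: "España" },
  BR: { url: "https://www.revolveclothing.com.br", label: "Brasil" },
};

async function maybeSuggestLocalSite(): Promise<void> {
  if (localStorage.getItem("dismissedGeoBanner")) return; // respect an earlier choice

  const res = await fetch("/api/geo"); // assumed endpoint returning { country: "ES" }
  const { country } = (await res.json()) as { country: string };
  const suggestion = SITE_FOR_COUNTRY[country];
  if (!suggestion || new URL(suggestion.url).hostname === location.hostname) return;

  const banner = document.createElement("div");
  banner.textContent =
    `We see that you are visiting us from ${suggestion.label}. ` +
    `You might prefer our local site: ${suggestion.url}`;
  const dismiss = document.createElement("button");
  dismiss.textContent = "Stay here";
  dismiss.onclick = () => {
    localStorage.setItem("dismissedGeoBanner", "1");
    banner.remove();
  };
  banner.appendChild(dismiss);
  document.body.prepend(banner);
}

maybeSuggestLocalSite();
```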
Then, I checked the hreflang implementation, and it seems it was implemented correctly (at least after a very quick review with Flang).
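For reference, a correct reciprocal set for these ccTLDs would look roughly like the output of this helper. The locale-to-domain mapping here is only an assumption; the important part is that every country version outputs the same full set, plus an x-default:

```typescript
// Illustrative only: emit a reciprocal hreflang set for one URL path.
// The locale/domain pairs are assumptions, not Revolve's real mapping.
const ALTERNATES: Array<{ hreflang: string; domain: string }> = [
  { hreflang: "en-US", domain: "https://www.revolveclothing.com" },
  { hreflang: "es-ES", domain: "https://www.revolveclothing.es" },
  { hreflang: "pt-BR", domain: "https://www.revolveclothing.com.br" },
];

function hreflangTags(path: string): string {
  const links = ALTERNATES.map(
    (a) => `<link rel="alternate" hreflang="${a.hreflang}" href="${a.domain}${path}" />`
  );
  // x-default tells Google which version to show users matching no listed locale.
  links.push(
    `<link rel="alternate" hreflang="x-default" href="https://www.revolveclothing.com${path}" />`
  );
  return links.join("\n");
}

// Usage: every country version of /r/Brands.jsp outputs the same set.
console.log(hreflangTags("/r/Brands.jsp"));
```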
I searched for "Revolve clothing" from Spain, in incognito with non-personalized search, and it showed me the Spanish website and Spanish sitelinks correctly;
I tried the same search from Spain but let Google consider my user agent (set up for English in search), and I saw the .com version and English sitelinks (which is fine).
Remember, sitelinks are decided by Google, and we can only demote them.
To conclude, I think the real cause should be looked for not in an international SEO issue as such (though do check the IP redirection), but in a possible, more general indexation problem.
-
If you look at the results on google.fr, I find it more surprising that, apart from the first result, all the other results shown come from the .com version rather than the .fr version. If I search for Revolve clothing on google.pt, I only get the US results and Instagram.
You seem to use a system of IP detection: if you visit the French site from an American IP address, you are redirected to the .com version (at least for the desktop version). Check this screenshot of the French site taken with an American IP address: http://www.webpagetest.org/screen_shot.php?test=150930_BN_1DSQ&run=1&cached=0 => this is clearly the US version. Remember that the main Googlebot crawls from a Californian IP, so it will mainly see the US version. There are bots that visit with other IPs, but Google doesn't guarantee that they visit with the same frequency and depth (https://support.google.com/webmasters/answer/6144055?hl=en). This could be the reason for your problem.
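If you do keep some IP-based logic, one common safeguard (just a suggestion, not something I can see on your site) is to verify Googlebot the way Google documents it, with a reverse DNS lookup plus a forward confirmation, and exempt verified crawlers from any geo-redirect. A rough Node/TypeScript sketch:

```typescript
// Sketch of the documented crawler verification: reverse DNS, then forward
// confirmation. Only the dns calls are standard Node APIs; the rest is assumed.
import { promises as dns } from "dns";

async function isVerifiedGooglebot(ip: string): Promise<boolean> {
  try {
    const hosts = await dns.reverse(ip); // e.g. ["crawl-66-249-66-1.googlebot.com"]
    const host = hosts.find((h) => /\.(googlebot|google)\.com$/.test(h));
    if (!host) return false;
    const { address } = await dns.lookup(host); // forward-confirm the PTR record
    return address === ip;
  } catch {
    return false; // unresolvable IPs are treated as regular visitors
  }
}

// Usage (hypothetical): if (await isVerifiedGooglebot(req.ip)) skip the geo-redirect.
```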
On top of that, your HTML is huge: the example page you mention has 13,038 lines of HTML code and takes ages to load (16 seconds - http://www.webpagetest.org/result/150930_VJ_1KRP/). The page size is a whopping 6,000 KB, and the Google speed score is 39%. You might want to look into that.
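A quick, rough way to sanity-check the raw HTML weight yourself (Node 18+ with built-in fetch; this only measures the HTML document itself, not the full page weight WebPageTest reports):

```typescript
// Rough check of HTML document size and fetch time for a single URL.
async function htmlWeight(url: string): Promise<void> {
  const started = Date.now();
  const res = await fetch(url, { headers: { "User-Agent": "weight-check" } });
  const body = await res.text();
  console.log(
    `${url}: ~${(body.length / 1024).toFixed(0)} KB of HTML, ` +
    `${body.split("\n").length} lines, fetched in ${Date.now() - started} ms`
  );
}

htmlWeight("http://www.revolveclothing.fr/");
```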
Hope this helps,
Dirk
-
Hey Jarred, which one? http://take.ms/xTPyo My Portuguese is terrible these days.