Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
City and state link stuffing in footer
-
A competitor has links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states. All with corresponding link text and titles that lead to pages with thin, duplicate content. They consistently rank high in the SERPS and have for years. What gives--I mean, isn't this something that should get you penalized?
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they are in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies that succeed using these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau mention, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor achieve high rankings, it is aggravating to think it isn't preventing them, either.
Your post doesn't mention whether this business actually has physical local offices or is fully virtual, but what I have seen in cases like these is that big brands tend to get away with a great deal of stuff I would never recommend to a smaller brand. It raises the question: how can we explain this phenomenon?
In the past, I've seen folks asserting that Google is soft on big brands. There could be some truth in this, but we've all seen Google take a massive whack at big brand practices with various updates, so that really makes this an unsatisfying assertion.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the idea that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines, and the question of whether that's excusable given the size of the Internet. They can't catch everything. I can see it in this light, but at the same time, Google hasn't taken a proactive stance on accepting public reports of bad practices. Rather, they release periodic updates that are supposed to algorithmically detect foul play and penalize or filter it. Google is very tied to the ideas of big data and machine intelligence. So far, it's been an interesting journey, but it's what has led to cases exactly like the one you're seeing: something egregiously unhelpful to human users sitting, apparently unpunished, on a website that outranks you, even when you're trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will be the subject of an update, at some point in the future, that will lessen the rewards they are receiving in the face of bad practices. Until then, it's heads down, working hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained, at least in part, by a good backlink profile, an old domain or website, a large amount of relatively original and varied content, or a niche that is more receptive to this type of page (when it's something relatively common in your niche)... and other times it simply makes no sense that tactics like this keep working in Google for years without drawing an automatic or manual penalty.
I've seen sites with keyword stuffing so heavy that a keyword was repeated about 500 times on the homepage, ranking at the top of Google for that keyword, with nothing else on or off the site that could explain that awesome ranking. It's so frustrating knowing that Google penalizes this, yet some of your competitors do it with impunity while you can't, or at least shouldn't...
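If you want to check that kind of stuffing yourself, here is a minimal sketch using only the Python standard library. It counts whole-phrase occurrences of a keyword in a page's visible text (skipping script and style blocks); the HTML and keyword are whatever you feed it, and this is just a rough audit helper, not how Google measures anything.

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from an HTML page, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def keyword_count(html, keyword):
    """Count whole-phrase occurrences of `keyword` in the page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.parts).lower()
    return len(re.findall(r"\b" + re.escape(keyword.lower()) + r"\b", text))
```

Run it against your own homepage and a competitor's: a count in the hundreds for a single phrase is the kind of stuffing described above.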
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen it work very well for competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though; it still carries a heavy risk. If they are discovered, they can get a serious penalty slapped on them, or at the very least get pushed pretty far down the SERPs. It's really just risk vs. reward. I work for a company that has a ton of revenue at stake, so I think of it like this:
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be for me to explain why I got our #1 lead generating channel penalized and blasted into purgatory.
Usually, these sites that use exact-match anchors on local terms look like garbage. So even if they are driving traffic, I often wonder how much of it is actually converting, since the majority of their site looks like a collection of crappy doorway pages. It is still very frustrating to watch them succeed in the SERPs, though. I have the same issue.
You could always "try" to report them to Google directly. I do not know if this really works or if anchor-text spam would fall under one of their official categories to file it under, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
I have no idea if this works or not, though. Also, as a side note, I would run their site through a tool like Majestic SEO or Ahrefs and really dig into their backlink profile. I have seen a couple of instances where spammy sites pulled off some nice links, so their success could be attributed to those as well.
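To make that backlink audit concrete, here is a minimal sketch assuming you have exported the competitor's backlinks to a CSV with an `anchor_text` column (most tools offer an export along these lines; the column name and threshold here are assumptions, adjust them to whatever your export actually contains). It tallies the anchor-text distribution, since a distribution heavily skewed toward one exact-match phrase is a classic sign of manipulated links.

```python
import csv
from collections import Counter

def anchor_distribution(csv_path):
    """Count how often each anchor text appears in a backlink CSV export."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row.get("anchor_text", "").strip().lower()
            if anchor:
                counts[anchor] += 1
    return counts

def flag_exact_match(counts, threshold=0.3):
    """Return anchors that account for at least `threshold` of all links:
    a strong skew toward one commercial phrase looks unnatural."""
    total = sum(counts.values())
    return [(anchor, n) for anchor, n in counts.most_common()
            if total and n / total >= threshold]
```

A natural profile is usually dominated by branded and URL anchors; if one money phrase accounts for a third or more of a site's links, that alone may explain the rankings better than the footer does.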
Hopefully, this helps, I know your pain.
-Ben