Pure spam Manual Action by Google
-
Hello Everyone,
We have a website, http://www.webstarttoday.com. Recently we received a manual action from Google that says: "Pages on this site appear to use aggressive spam techniques such as automatically generated gibberish, cloaking, scraping content from other websites, and/or repeated or egregious violations of Google's Webmaster Guidelines." Google has given http://smoothblog.webstarttoday.com/ as an example. The nature of the business of http://www.webstarttoday.com is creating sub-domains (it is a website builder): anyone can register and create a sub-domain.
My questions are:
- What are the best practices when someone creates a sub-domain on webstarttoday.com?
- How can I get this penalty revoked?
- What should I do with the hundreds of other sub-domains that have already been created by third parties, like http://smoothblog.webstarttoday.com?
- Why don't these types of issues come up with WordPress or Weebly?
Regards,
Ruchi
-
That's great news that you got the penalty revoked.
It can often take a few days for the manual spam actions viewer to show that there is no longer a penalty, so keep an eye on it. I've seen a number of sites lately that got a pure spam penalty revoked and then, a few days or weeks later, got either a thin content penalty or an unnatural links penalty. Hopefully that's not the case for you, though!
-
It could be that the message will only disappear tomorrow.
The message from Google doesn't say that the penalty has been revoked outright, though; it says that it has been "revoked or adjusted where appropriate". It's possible that the penalty is now only applied to the specific subdomain rather than to the site as a whole. Is it still the original message that is shown under Manual actions?
I would update the terms & conditions anyway, so that you can react quickly if you see other actions appearing. Try to scan the subdomains from time to time to make sure they are not violating Google's guidelines.
Regards,
Dirk
-
Thanks Dirk,
You have nicely answered all of my questions. I will keep your points in mind when creating sub-domains. Also, I received this message from Google after filing the reconsideration request:
Dear Webmaster of http://www.webstarttoday.com/
We have processed the reconsideration request from a site owner for http://www.webstarttoday.com/. The site has been reviewed for violations of our quality guidelines. Any manual spam actions applied to the site have been revoked or adjusted where appropriate.
As per that message, the penalty on my website should have been revoked, but it is still showing under "Manual actions".
Thanks,
Ruchi
-
Thanks for your quick response. Much appreciated.
-
^ VERY nice, Dirk!
-
Hi,
I'll try to answer your questions point by point:
1. You could add to your terms & conditions that sites created on your platform need to follow Google's Webmaster Guidelines, and that if they don't, you can delete the subdomain.
2. Revoking the penalty is only possible by cleaning the site and removing the contested content. Whether you can force whoever manages this blog to clean it up depends on your current terms & conditions.
3. Same as above: if your terms & conditions don't stipulate that violating Google's guidelines is forbidden, there is not much you can do at this point.
4. WordPress hosts blogs on wordpress.com (the main site is wordpress.org). Weebly has terms & conditions that forbid spam/SEO sites (wordpress.com probably has this as well, but it's stated very clearly on weebly.com).
Update your terms & conditions if necessary, send warnings to offending blog users, and delete their subdomains if necessary.
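To make the scanning and warning part concrete, here is a rough Python sketch (the subdomain list, word-count threshold, and spam patterns are placeholder assumptions, not real webstarttoday.com data; adapt them to your own platform):

```python
# Rough sketch only: SUBDOMAINS, MIN_WORDS and SPAM_PATTERNS are placeholders.
import re

import requests  # third-party library: pip install requests

SUBDOMAINS = [
    "smoothblog.webstarttoday.com",
    # ...pull the full list from your own account database
]

MIN_WORDS = 100  # pages thinner than this get flagged for review
SPAM_PATTERNS = [r"(?i)(viagra|casino|payday loan)"]  # naive signals, tune as needed


def flag_subdomain(host):
    """Return a list of reasons why this subdomain looks like a guideline violation."""
    reasons = []
    try:
        resp = requests.get(f"http://{host}/", timeout=10)
    except requests.RequestException as exc:
        return [f"unreachable: {exc}"]
    text = re.sub(r"<[^>]+>", " ", resp.text)  # crude tag strip, good enough for a heuristic
    word_count = len(text.split())
    if word_count < MIN_WORDS:
        reasons.append(f"thin content ({word_count} words)")
    for pattern in SPAM_PATTERNS:
        if re.search(pattern, text):
            reasons.append(f"matches spam pattern {pattern!r}")
    return reasons


if __name__ == "__main__":
    for host in SUBDOMAINS:
        problems = flag_subdomain(host)
        if problems:
            # feed flagged hosts into your warning / removal workflow
            print(host, "->", "; ".join(problems))
```

Run something like this on a schedule and feed the flagged subdomains into the warning and deletion process above.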
Hope this helps,
Dirk
-
Hi there
1. Here are a couple of resources: Moz and HotDesign
2. Pure Spam: What Are Google Penalties & What to Do to Recover from Search Engine Watch, and this Q&A thread from Moz
3. I would go through your subdomains, find the ones that are blatant spam or thin on content, and remove them. I would then make sure they are blocked from crawling (keep in mind that robots.txt only applies to the host it is served from, so each subdomain needs its own robots.txt rules).
4. I would say it's because WordPress is the most-used CMS in the world and a lot of reputable websites use it. I would really work on the spam safeguards for your product: looking for IPs that continually create websites, thin content, cloaking, off-topic websites, link farms, etc. (see the sketch below). It's your duty as a CMS to watch how your users use the product. Not only will it keep your product's reputation clean, it will also show that you are taking steps to run a product with integrity.
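To give point 4 a concrete starting point, here is a minimal Python sketch of a signup-time check (the 24-hour window, the per-IP limit, and the in-memory store are placeholder assumptions; a real pipeline would use its own storage and thresholds):

```python
# Minimal sketch, not production code: the threshold, window and in-memory
# store are placeholders for whatever your real signup pipeline uses.
import time
from collections import defaultdict

WINDOW_SECONDS = 24 * 60 * 60   # look at the last 24 hours
MAX_SIGNUPS_PER_IP = 5          # more than this from one IP gets flagged

_signups_by_ip = defaultdict(list)  # ip -> list of signup timestamps


def record_signup(ip):
    """Record a subdomain creation and return True if the IP looks abusive."""
    now = time.time()
    recent = [t for t in _signups_by_ip[ip] if now - t <= WINDOW_SECONDS]
    recent.append(now)
    _signups_by_ip[ip] = recent
    return len(recent) > MAX_SIGNUPS_PER_IP


if __name__ == "__main__":
    # Example: the sixth signup from the same IP inside a day trips the flag
    for _ in range(6):
        flagged = record_signup("203.0.113.7")
    print("flagged:", flagged)  # True -> hold the account for manual review
```

The same kind of heuristic can be extended to content signals (word count, duplicate text, cloaking patterns) before a new subdomain goes live.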
Hope this all helps - good luck!