Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Will a disclaimer affect Crawling?
-
Hello everyone!
German law requires me to show a disclaimer to my German users, so my question is the following:
Will a disclaimer affect crawling? What's the best practice here? Is there anything I should take special care with? And what's the best disclaimer technique: a plain HTML page, or something overlaying the site?
Thank you all!
-
Hi friend, you can display the disclaimer using a JavaScript overlay and this would be absolutely fine. Bots won't have any trouble crawling the site behind the JS overlay because they won't see it. This is a very common practice among sites that display an age-gate verification page, such as porn sites and sites that discuss or sell liquor.
This technique is not considered cloaking, as the intention is not malicious or deceptive, and Google handles these cases normally. Hope it helps, and good luck.
I addressed a similar question here on Moz:
http://moz.rainyclouds.online/community/q/different-user-experience-with-javascript-on-off
Best regards,
Devanur Rafi
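To make the overlay idea concrete, here is a minimal sketch of the approach described above. All names and the disclaimer text are illustrative, not from the original thread: the page content stays in the server-rendered HTML, and the overlay is only added on top by script after load.

```javascript
// Minimal JavaScript disclaimer overlay (illustrative names and text).
// The real page content stays in the server-rendered HTML; only the
// overlay is added client-side, so crawlers still see the content.

function buildDisclaimerHtml(message) {
  // Returns the overlay markup as a string; kept separate from DOM
  // code so the logic can be exercised outside a browser.
  return (
    '<div id="disclaimer-overlay" style="position:fixed;inset:0;' +
    'background:rgba(0,0,0,0.7);z-index:9999;display:flex;' +
    'align-items:center;justify-content:center;">' +
    '<div style="background:#fff;padding:2em;max-width:30em;">' +
    '<p>' + message + '</p>' +
    '<button id="disclaimer-accept">I understand</button>' +
    '</div></div>'
  );
}

// Only touch the DOM when one exists (i.e., in a browser).
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    var wrapper = document.createElement('div');
    wrapper.innerHTML = buildDisclaimerHtml('This site contains artistic nudity.');
    document.body.appendChild(wrapper.firstChild);
    document.getElementById('disclaimer-accept')
      .addEventListener('click', function () {
        document.getElementById('disclaimer-overlay').remove();
      });
  });
}
```

The key design point is that dismissing the overlay only removes an element; nothing is hidden from or served differently to crawlers.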
-
Maybe I will try what you suggested; I'll just wait to see if someone else responds so I can gather more ideas. Thanks, though!
About cookies: yes, that's a Europe-wide thing, but in Germany you also have to display a disclaimer if you run an adult site, sell certain types of products, and so on.

-
Hmm, I honestly don't know in this situation. One thing you might try is a modal that blocks the page with a semi-transparent layer, but check whether it is Googlebot accessing the site and skip the modal in that case.
Honestly, though, I thought this was just the EU cookie requirement, so I'm not an expert in this area.
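A rough sketch of the "skip the modal for bots" idea from this reply. The crawler list and the `showCookieModal` helper are made up for illustration, and a caveat applies: user-agent sniffing is fragile, and deliberately serving bots a different experience can be read as cloaking, so treat this as a sketch rather than a recommendation.

```javascript
// Sketch: only show the blocking modal to visitors that don't look
// like crawlers. The token list is illustrative, not exhaustive.

function isLikelyCrawler(userAgent) {
  return /googlebot|bingbot|slurp|duckduckbot/i.test(userAgent || '');
}

if (typeof navigator !== 'undefined' &&
    !isLikelyCrawler(navigator.userAgent)) {
  // showCookieModal() would be your own function that renders the
  // semi-transparent blocking layer; it is a hypothetical name here.
  // showCookieModal();
}
```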
-
Thanks for the input!
While the site will not be pornographic, it will include artistic nudity, and I want a disclaimer that covers at least a portion of the page.
-
Don't block the site entirely and it won't really matter. A lot of people in the e-commerce world handle it like this demo: http://warehouse.iqit-commerce.com/selector/?theme=warehouse2 - just a small bar at the bottom of the page. If you wanted to get more clever, you could geographically target the user, show the bar based on location, and exclude bots from seeing it. But if it's only for cookies, I would not suggest blocking the whole page the way an adult site does. If it is an adult site that needs a fully blocking disclaimer, I have no experience in that area.
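As a sketch of the geo-targeted bar suggested above: decide whether to show a small disclaimer bar from a country code supplied by your backend or a geo-IP service. The country list, the `window.visitorCountry` global, and the bar text are all assumptions made for illustration.

```javascript
// Sketch: show a small bottom disclaimer bar only to visitors from
// certain countries. The list and the visitorCountry global are
// illustrative assumptions.

var DISCLAIMER_COUNTRIES = ['DE', 'AT', 'CH'];

function shouldShowDisclaimerBar(countryCode) {
  return DISCLAIMER_COUNTRIES.indexOf(String(countryCode).toUpperCase()) !== -1;
}

if (typeof document !== 'undefined' &&
    shouldShowDisclaimerBar(window.visitorCountry)) {
  var bar = document.createElement('div');
  bar.style.cssText =
    'position:fixed;bottom:0;left:0;right:0;' +
    'background:#222;color:#fff;padding:0.5em;text-align:center;';
  bar.textContent =
    'This site contains artistic nudity. By continuing you confirm you are of legal age.';
  document.body.appendChild(bar);
}
```

Because the bar never blocks the page, crawlers that do render it still see the full content either way.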
Related Questions
-
Redirection of 100 domains to main domain affects SEO?
Hi guys, An email software vendor managed by a different area of my company redirected 100 domains used for unsolicited email campaigns to my main domain. These domains are very likely to get blacklisted at some point. My SEO tool now is showing me all those domains as "linking" to my main site as do-follow links. The vendor states that this will not affect my main domain/website in any way. I'm highly concerned. I would appreciate your professional opinion about this. Thanks!!
Intermediate & Advanced SEO | anagentile0
-
What happens to crawled URLs subsequently blocked by robots.txt?
We have a very large store with 278,146 individual product pages. Since these are all various sizes and packaging quantities of fewer than 200 product categories, my feeling is that Google would be better off making sure our category pages are indexed. I would like to block all product pages via robots.txt until we are sure all category pages are indexed, then unblock them. Our product pages rarely change and have no ratings or product reviews, so there is little reason for a search engine to revisit a product page. The sales team is afraid blocking a previously indexed product page will result in it being removed from the Google index and would prefer to submit the categories by hand, 10 per day, via requested crawling. Which is the better practice?
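For reference, a robots.txt for the blocking approach described in this question might look like the sketch below. It assumes product and category URLs share path prefixes (the question does not say how the store's URLs are structured), and note that robots.txt only stops crawling: already-indexed product URLs can linger in results without snippets until they drop out.

```
# Hypothetical robots.txt - assumes /product/ and /category/ prefixes.
User-agent: *
Disallow: /product/
Allow: /category/
```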
Intermediate & Advanced SEO | AspenFasteners1
-
Can Google Crawl & Index my Schema in CSR JavaScript
We currently only have one option for implementing our Schema. It is populated in JSON which is rendered by JavaScript on the CLIENT side. I've heard tons of mixed reviews about whether this will work or not. So, does anyone know for sure if this will or will not work? Also, how can I build a test to see if it does or does not work?
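As a sketch of client-side JSON-LD injection: build the schema object, serialize it into a `script type="application/ld+json"` tag, and then check the rendered HTML with Google's URL Inspection tool or the Rich Results Test to see whether the markup made it into the rendered page. The field values below are placeholders.

```javascript
// Sketch: inject Product schema as client-side JSON-LD.
// Field values are placeholders, not real data.

function buildProductSchema(name, price) {
  return {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: name,
    offers: { '@type': 'Offer', price: price, priceCurrency: 'USD' }
  };
}

if (typeof document !== 'undefined') {
  var tag = document.createElement('script');
  tag.type = 'application/ld+json';
  tag.textContent = JSON.stringify(buildProductSchema('Example widget', '19.99'));
  document.head.appendChild(tag);
}
```

Inspecting the rendered (not source) HTML in those tools is the practical test the question asks about: if the `ld+json` tag appears there, Google's renderer picked it up.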
Intermediate & Advanced SEO | MJTrevens0
-
Google Adsbot crawling order confirmation pages?
Hi, We have had roughly 1000+ requests per 24 hours from Google-adsbot to our confirmation pages. This generates an error as the confirmation page cannot be viewed after closing or by anyone who didn't complete the order. How is google-adsbot finding pages to crawl that are not linked to anywhere on the site, in the sitemap or linked to anywhere else? Is there any harm in a google crawler receiving a higher percentage of errors - even though the pages are not supposed to be requested. Is there anything we can do to prevent the errors for the benefit of our network team and what are the possible risks of any measures we can take? This bot seems to be for evaluating the quality of landing pages used in for Adwords so why is it trying to access confirmation pages when they have not been set for any of our adverts? We included "Disallow: /confirmation" in the robots.txt but it has continued to request these pages, generating a 403 page and an error in the log files so it seems Adsbot doesn't follow robots.txt. Thanks in advance for any help, Sam
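One relevant detail here: Google documents that AdsBot ignores the generic `User-agent: *` group in robots.txt and only obeys rules in a group that names it explicitly. So a sketch of a robots.txt that actually addresses AdsBot might look like this (the `/confirmation` path is taken from the question):

```
# AdsBot ignores "User-agent: *" rules, so name it explicitly:
User-agent: AdsBot-Google
Disallow: /confirmation

User-agent: *
Disallow: /confirmation
```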
Intermediate & Advanced SEO | seoeuroflorist0
-
Will obfuscating HTML have a bad effect on my ranking?
I would like to obfuscate my HTML so that people do not see that I used a Template on my site. Does obfuscating HTML have a bad effect on the ranking in google? Thanks!
Intermediate & Advanced SEO | RWW0
-
Will the use of lightbox affect SEO?
I am looking to condense a features list on my pricing page. It is currently a static list; however, I want the user to click a button and have a full list of standard features pop up in a lightbox. How will this affect my SEO? Can Google read content in a lightbox?
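One crawl-friendly way to build such a lightbox is sketched below: keep the full features list in the server-rendered HTML (so it is present for crawlers) and only toggle its visibility with JavaScript, rather than fetching or generating the content on click. The element IDs are illustrative.

```javascript
// Sketch: a "crawlable lightbox" - the content is already in the
// HTML and hidden; the click handler only toggles visibility.

function toggleHidden(el) {
  // Flips the hidden flag; returns the new state for easy checking.
  el.hidden = !el.hidden;
  return el.hidden;
}

if (typeof document !== 'undefined') {
  // Assumed markup: <div id="features-lightbox" hidden>...full list...</div>
  //                 <button id="show-features">See all features</button>
  var button = document.getElementById('show-features');
  var lightbox = document.getElementById('features-lightbox');
  if (button && lightbox) {
    button.addEventListener('click', function () { toggleHidden(lightbox); });
  }
}
```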
Intermediate & Advanced SEO | ParkerSoftware0
-
Will redirecting poor traffic web pages increase web presence
A number of pages on my site have low traffic metrics. I intend to redirect poor-performing pages to the most appropriate page with high traffic. Example:
www.sampledomomain.co.uk/low-traffic-greyshoes
www.sampledomomain.co.uk/low-traffic-greenshoes
www.sampledomomain.co.uk/low-traffic-redshoes
All of the above will be redirected to the following page:
www.sampledomomain.co.uk/high-traffic-blackshoes
Question: will carrying out .htaccess redirects as in the above example influence the web positioning of both www.sampledomomain.co.uk/high-traffic-blackshoes and www.sampledomomain.co.uk? Regards, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with duplicate content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1,000 travel guides in total. My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly? And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the generic hotels page of the site? Thanks for your time, I know this is quite a long one, Nick
Intermediate & Advanced SEO | | Townpages0