Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
-
Hi all,
So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only do one page at a time?
Thanks!
-
I read your post, Mstoic Hemant, and noticed your comment about Firefox 10. Since I couldn't get the Dust-Me spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did, I received a message that the Dust-Me spider was not compatible with that version of Firefox and it had been disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing) - Do you have any experience using the upgraded version? Does it deliver what it promises?
Thanks!
-
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I run the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
-
Hi Dana, did either of these responses help? What did you end up settling on? We'd love an update! Thanks.
Christy
-
I have an article on that here. A Firefox extension called Dust-Me Selectors can help you identify unused CSS across multiple pages. It tracks all the pages you visit on a website and records the classes and IDs that are never used. You can also give it a sitemap and it will work out which CSS is never used.
-
This sounds like it might just do the trick. You'll need Ruby installed for it to work. If you have a Mac, it's already there. If you're on Windows, you'll need this. It's pretty easy; I installed Ruby on my Windows gaming rig. If you're running a Linux flavor, try this.
Just take the URLs from your site crawl, save them to a .txt file, and compare that list against your CSS file. I've never tried it on a large site, so let me know how it goes for you.
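If you want to roll your own check instead, the same idea is easy to sketch in Python. The example below is only a rough illustration, not a tested tool: it assumes a urls.txt file with one crawled URL per line (exported from Screaming Frog, for example) and a single stylesheet to audit, it only understands simple .class and #id selectors, and it won't see anything that JavaScript adds at runtime.

import re
import sys
from collections import Counter
from urllib.request import urlopen

# Rough sketch: python css_audit.py urls.txt styles.css
# urls.txt = one URL per line from your crawl; styles.css = the stylesheet to audit.

def selectors_from_css(css_text):
    # Strip comments, then pull out simple .class and #id tokens.
    # Hex colours such as #fff will match too and show up as false positives.
    css_text = re.sub(r"/\*.*?\*/", "", css_text, flags=re.S)
    return re.findall(r"[.#][A-Za-z_][\w-]*", css_text)

def tokens_from_html(html_text):
    # Collect every class name and id value that appears in the page markup.
    tokens = set()
    for attr, value in re.findall(r'\b(class|id)\s*=\s*["\']([^"\']+)["\']', html_text):
        prefix = "." if attr == "class" else "#"
        tokens.update(prefix + name for name in value.split())
    return tokens

def main(url_file, css_file):
    with open(css_file, encoding="utf-8") as f:
        selectors = selectors_from_css(f.read())
    # Selectors declared more than once are candidate duplicates.
    duplicates = sorted(s for s, n in Counter(selectors).items() if n > 1)

    used = set()
    with open(url_file, encoding="utf-8") as f:
        for url in (line.strip() for line in f if line.strip()):
            try:
                html = urlopen(url).read().decode("utf-8", errors="ignore")
            except OSError as exc:
                print("skipped %s: %s" % (url, exc), file=sys.stderr)
                continue
            used |= tokens_from_html(html)

    unused = sorted(set(selectors) - used)
    print("Declared more than once:", ", ".join(duplicates) or "none")
    print("Never seen in the crawl:", ", ".join(unused) or "none")

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])

There is no page limit here, so an 8,000+ page crawl is fine (just slow, since it fetches one page at a time), and the output should be treated as a list of candidates to review rather than CSS that is definitely safe to delete.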
Related Questions
-
Is it against Google guidelines to use third-party review sites as well as have reviews on my site marked up with schema?
So, I look after a site for my family business. We have teamed up with the third-party site TrustPilot because we like the way it enables us to send out reviews to our customers directly from our system. It's been going great and some of the reviews have been brilliant. I have used a couple of these reviews on our site and marked them up with: REVIEW CONTENT. We work in the service industry, and one of the problems we have found is getting our customers to actually go online and leave a review. They normally just leave their comments on a job sheet that the workers have signed when they leave. So I have created a page on our site where we post some of the reviews the guys receive too. I have used the following: REVIEW TITLE, REVIEW, Written by: CUSTOMER NAME, Type of Service: House Removal, Date published: DATE PUBLISHED, 10 / 10 stars. I was just wondering: I was told that this could be against Google's guidelines, and as I've seen a bit of a drop in our rankings in the last week or so, I'm a little concerned. Is this getting me penalised? Should I not use my reviews referencing the ones on TrustPilot, and should I not have my own reviews page with rich snippets?
Web Design | | BearPaw881
-
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, Please could you see what you think and offer some definite steps or advice. I contacted the host provider and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL; it's just waiting as part of the hosting package just in case, but I found that the SSL certificate is still showing up during a crawl. It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version. I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using https:// and the https:// page loaded instead of redirecting automatically to the http prefix version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that? As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this. One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https to http rewrite rule; https shouldn't even be a crawlable function of the site at all. The options seem to be a rewrite rule such as RewriteEngine On with RewriteCond %{HTTPS} off (a fuller version of this rule is sketched below), or to disable the SSL completely for now until it becomes a necessity for the website. I would really welcome your thoughts as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
Web Design | | SEOguy1
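A note on the rewrite fragment quoted in the question above: RewriteCond %{HTTPS} off on its own does nothing without a RewriteRule, and "off" is the condition you would match to push visitors onto https rather than away from it. A minimal sketch of one common way to 301 every https request back to http follows; it assumes Apache with mod_rewrite enabled and would live in the site's .htaccess, and it is an illustration of the general pattern rather than something from the original thread.

# Sketch only: send any request that arrives over https back to the http version.
# Assumes Apache with mod_rewrite; note that if no SSL is configured for the
# domain at all, https requests may never reach this file in the first place.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
-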
Problems preventing WordPress attachment pages from being indexed and from being seen as duplicate content.
Hi, According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content... or is it the Yoast sitemap causing it? I see two options in Yoast SEO: (1) Redirect attachment URLs to parent post URL; (2) Media... Meta Robots: noindex, follow. I set it to (1) initially, which didn't resolve the problem. Then I set it to option (2) so that all images won't be indexed but search engines would still associate those images with their relevant posts and pages. I understand what both of these options (1) and (2) mean, but because I chose option (2), will that mean all of the images on the website won't stand a chance of being indexed in search engines and Google Images etc.? As far as duplicate content goes, search engines can get confused, and there are two ways for search engines to reach the correct page content destination. But when, e.g., Google makes the wrong choice, a portion of traffic drops off (is lost, hence errors), which then leaves the searcher frustrated, and this affects the SEO and ranking of the site, which worsens with time. My goal here is: I would like all of the web images to be indexed by Google, and for all of the image attachment pages not to be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap URL which Yoast creates); that sitemap URL has been submitted to the search engines already and I will resubmit once I can resolve the attachment page issues. Please can you advise. Thanks.
Web Design | | SEOguy1
-
How long should an old site redirecting to a new site remain active on a server?
Once I switch a site to a new domain (with links to the corresponding pages), will I have to keep the old site live forever for those links to work, or how long should I wait before I deactivate the old site on our server?
Web Design | | jwanner0 -
301 Redirect all pictures when moving to a new site?
We have 30,000 pictures on our site. Moz will occasionally return 404s on some, but Google seems to ignore those. Should I 301 redirect all those images when we move to a new site layout? Appreciate your views!
Web Design | | Discountvc0 -
How to bounce back after a new URL & new site design?
About a month ago, my company changed domains (from the long-established www.imageworksstudio.com to the new www.imageworkscreative.com) and also did a complete overhaul of our site. We tried to do everything necessary to keep Google happy as we went through this change, but we've suffered a drastic loss of both rankings and traffic. I know that can happen as a result of a redesign AND as a result of a new domain, but I'm wondering how long you would expect it to take before we bounced back and also, what can we do in the meantime to improve?
Web Design | | ScottImageWorks0 -
Would having a second homepage for a site affect my SEO?
Hello guys, One of our clients is planning to have a new landing page for any users hitting the site for the first time. (Returning users will still see the current homepage based on cookies... in other words, the site would technically have two home pages.) According to this client, they are planning to do something like this: https://www.websitename.com/ (for returning visitors) and https://www.websitename.com/newuser (for first-time visitors). Our instinct is that it is not great to have two home pages (that would affect the SEO campaign we are managing for this company), and we are not sure how to handle this. That's why we would appreciate your opinion regarding this topic: from an SEO perspective, do you think this is a good idea? If not, what would you guys do to differentiate first-time visitors vs. returning visitors without affecting SEO? Maybe just a pop-up? Thanks in advance for your help!
Web Design | | Robertnweil10 -
Should the parent directory of the main site-navigation be clickable or not?!?
Highly discussed in our team is the question: should all parent navigation items be clickable, or only the ones that have no child menu appearing on mouse-over? At Starwood Germany, we would like to adjust the main navigation for all our websites in order to improve consistency and user-friendliness. At the moment, most of our websites feature both clickable and non-clickable parent items, depending on whether the items have a corresponding child menu (appearing on mouse-over) or not. See example here: http://www.imperialvienna.com/en Some of our team members believe it might be irritating and/or confusing for the user if some items are clickable while others are not. What do you think? Any thoughts and insights would be truly appreciated!
Web Design | | DFM_GSA0