Duplicate content on URL trailing slash
-
Hello,
Some time ago, we accidentally made changes to our site that modified the way URLs in links are generated. All at once, trailing slashes were added to many URLs (only in links).
Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash).
We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now, all links on the site point to URLs without a trailing slash.
However, Google had time to index these pages. Is implementing 301 redirects required in this case?
-
Yes. You want the indexed URLs to match the canonical tag, so the most effective method is to 301 redirect the trailing-slash versions so that they match the canonical tag, sitemap, robots.txt, etc. You can use a regex pattern like /?$ at the end of the URL; in the case of category URLs, it will allow the trailing slash when needed.
If you use a proper 301, you will not have to deal with the category issue anyway.
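As a rough sketch of the /?$ idea above (the category path and PHP handler are hypothetical, not from the asker's site), a mod_rewrite rule can match a URL with or without the trailing slash, so category pages keep resolving while everything else gets redirected:

# Hypothetical .htaccess sketch: /?$ makes the trailing slash optional,
# so /category/london and /category/london/ both reach the same handler
# without triggering a redirect.
RewriteRule ^category/([^/]+)/?$ category.php?name=$1 [L]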
rel="canonical" href="https://moz.rainyclouds.online/community/q/duplicate-content-on-url-trailing-slash" />
I hope this sheds more light on the issue. Great answer, Eric.
Hope I was of help,
Tom
-
Hi Eric,
I was at step 3 of your three-step plan, looking for confirmation as to whether or not the 301 redirects were required in this situation.
Thanks!
-
Hi yacpro13! Did Eric or Thomas answer your question, and if so, would you mind marking one or both responses as a "Good Answer?"

Otherwise, what questions do you still have?
-
If you have changed the URLs with trailing slashes, then there are a few things you'll want to do:
-
make sure all the internal links on your site are updated to point to the proper version.
-
make sure that the sitemap.xml file(s) are correct, pointing to the proper version.
-
set up 301 permanent redirects so that the URLs with the trailing slash redirect to the original slash-less URLs (a sketch follows this list).
As long as you have corrected the links internally, updated the sitemap file, and set up the 301 redirects, everything should go "back to normal" within a fairly short period of time. You will need to give it time, though, as Google will need to re-crawl all of those URLs and get it all ironed out.
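For step 3, a minimal .htaccess sketch (assuming an Apache server, which fits the PHP setup the asker describes; domain and paths are placeholders) would 301-redirect any non-directory URL ending in a slash back to its slash-less form:

RewriteEngine On
# Strip the accidental trailing slash with a permanent redirect so
# Google consolidates the duplicate URLs; the !-d test leaves real
# directories alone.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]

The answer below walks through fuller Apache and Nginx variants.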
-
-
I have provided the Apache and Nginx configurations you would need, along with a URL for a tool that converts Apache .htaccess rules to Nginx. The instructions are right here:
Remove Trailing Slash
Just like with the WWW example, some prefer to remove the trailing slash. It's a commonly debated question around the Internet, and it comes down to preference.
Remember, though, that your browser and even your server add a trailing slash to a directory by default, and for good reason. If you must strip the trailing slash, this is how you would do it:
<code class="hljs apache">RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_URI} !(.*)$ RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]</code>For Nginx
# Nginx equivalent; "permanent" issues the 301 (an .htaccess-to-nginx
# converter may emit "redirect", which sends a 302 instead).
location / {
    if (!-e $request_filename) {
        rewrite ^/(.*)/$ http://www.domain.com/$1 permanent;
    }
}
The explanation for this rule is the same as for adding a trailing slash, just in reverse. We can also exempt specific directories that we don't want this rule to apply to.
<code class="hljs apache">RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_URI} !directory/(.*)$ RewriteCond %{REQUEST_URI} !(.*)$ RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]</code>For Nginx
# The empty location block exempts directory/ from the rewrite below.
location ~ directory/(.*)$ { }
location / {
    if (!-e $request_filename) {
        rewrite ^/(.*)/$ http://www.domain.com/$1 permanent;
    }
}
Please see the note about mod_dir and the DirectorySlash directive in the previous example; you might need to turn that directive off.
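Turning it off is a one-liner with mod_dir's DirectorySlash directive (a real Apache directive; the IfModule wrapper is just defensive):

# Stop Apache from re-appending the trailing slash to directory URLs.
# Beware: with autoindex enabled, this can expose directory listings.
<IfModule mod_dir.c>
    DirectorySlash Off
</IfModule>

.htaccess converter for Apache to Nginx configuration: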
http://winginx.com/en/htaccess
https://www.maxcdn.com/one/tutorial/remove-trailing-slash/
https://www.crucialhosting.com/knowledgebase/htaccess-apache-rewrites-examples