Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What is the best way to refresh a webpage on a news site, SEO-wise?
-
Hello all,
we have a client with a sports website. In fact it is a very big website and publishes a huge number of news stories per day.
This is mostly the reason why it refreshes some of its news-list pages every 420 seconds.
We currently use a meta refresh. I have read here and elsewhere that meta refreshes should be avoided, but we don't use it to send visitors to another page or to pass any kind of page authority / link juice.
Is a JavaScript refresh better in this case? Is there any other, better way?
What do you think & suggest?
Thank you!
-
Hi Panos,
I don't necessarily disagree with Eric's answer, but I wanted to answer from a different point of view. I'm going to assume you really want or need some refresh mechanism built into the page.
In which case, I'd agree that a JavaScript approach using AJAX is probably a better solution. It means users only need to load the new article headlines, not the whole page, so the strain on your servers should be reduced. Furthermore, I find it a neater solution all around anyway - you could provide a 'new headlines available' notice that people click to refresh the article list. This might be the best of both worlds?
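Something like this, as a rough sketch - the endpoint path and element IDs are placeholders you'd adapt to your own markup and API:

// Poll a (hypothetical) headline-fragment endpoint every 420 seconds
// instead of reloading the whole page.
const POLL_INTERVAL_MS = 420 * 1000;
let lastSeenHtml = null;
let pendingHtml = null;

async function checkForNewHeadlines() {
  try {
    const response = await fetch('/api/headlines'); // placeholder endpoint
    if (!response.ok) return; // skip this cycle on server errors
    const html = await response.text();
    if (lastSeenHtml !== null && html !== lastSeenHtml) {
      pendingHtml = html;
      // Show the 'new headlines available' notice instead of swapping content mid-read.
      document.getElementById('refresh-notice').hidden = false;
    }
    lastSeenHtml = html;
  } catch (err) {
    // Network hiccup: try again on the next poll.
  }
}

document.getElementById('refresh-notice').addEventListener('click', () => {
  document.getElementById('headline-list').innerHTML = pendingHtml;
  document.getElementById('refresh-notice').hidden = true;
});

setInterval(checkForNewHeadlines, POLL_INTERVAL_MS);

Only the headline fragment travels over the wire, and the reader decides when the list updates.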
Either way, meta refresh isn't as flexible, isn't as clean, and will put more strain on your servers.
Good luck!
-Tom
-
I don't understand why you believe you need to add a refresh to the page. Even though you are posting new content, there is no need to refresh the page automatically. Users will refresh it themselves, or they will simply arrive and see the updated content. The search engines will also pick up the new content naturally: they will see that you keep adding fresh news items and will come back to crawl more often.
I would not include any sort of refresh on the page. You might add a message telling users to refresh for the latest content, but that's all I would do. I do not recommend any auto-refresh.
Related Questions
-
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed. https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that generally shows what a crawler should be able to crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this - is this concern valid? Thanks!
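(A hedged way to approximate what Googlebot sees after rendering, even behind a login, is headless Chrome via Puppeteer - the URL and the content check below are placeholders:

// npm install puppeteer
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Authenticate first if the QA environment requires it, then load the page.
  await page.goto('https://qa.example.com/product-page', { waitUntil: 'networkidle0' });
  const renderedHtml = await page.content(); // the DOM after client-side rendering
  console.log(renderedHtml.includes('expected product copy')
    ? 'Product content present after rendering'
    : 'Product content missing after rendering');
  await browser.close();
})();

This doesn't replicate Googlebot's crawl budget or render-queue delays, but it does show whether the Angular app produces indexable HTML at all.)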
Technical SEO | znotes
-
Help Setting Up 301 Redirects from a ColdFusion Site to a WordPress Site.
I have created a new website and need to redirect all of the previous pages to the new one. The old website was built in ColdFusion and the new site is built in WordPress. One of the pages I'm trying to redirect is www.norriseal.com/products.cfm to http://norrisealwellmark.com/products/. This is what I have in my .htaccess file:

<IfModule mod_rewrite.c>
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
Redirect 301 /products.cfm http://norrisealwellmark.com/products/
</IfModule>

The result of this redirect is http://norrisealwellmark.com/products.cfm. How do I prevent the .cfm from appending to the destination URL?
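(A hedged sketch of one common fix - keep the redirect entirely in mod_rewrite instead of mixing mod_alias's Redirect with RewriteRule, since the two modules process the request independently and can combine in surprising ways:

RewriteEngine On
RewriteRule ^products\.cfm$ http://norrisealwellmark.com/products/ [R=301,L]

Here ^products\.cfm$ matches only that exact path, so nothing extra is carried over to the destination URL.)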
Technical SEO | MarketHubb
-
Subdomain as News Section instead of Source in Google News?
Hi, trying to dig into Google News for a large site, mostly containing news. The structure of the site network is subdomain.domain.se, and each subdomain has its own brand with its own news: x.domain.se, y.domain.se, z.domain.se, etc. Each brand/subdomain more or less equates to its own subject field/section. In Google News, every subdomain is configured with its own source URL, but is also set up as a section with the same URL. It seems like they're getting conflicts in Google News; Google can't always figure out which news article belongs to which brand. Example: an article is owned by brand A, but it sometimes happens that the article gets labeled as brand B in the news SERP, though the link takes you correctly to brand A. I am thinking that this configuration in the News Publisher Center may be the problem? Would it be better if we deleted all source URLs except for the domain.se brand and then added all the other subdomains as sections (www.domain.se, x.domain.se, y.domain.se, z.domain.se)? Any smart thoughts on this one? Or anything else that could cause this wrong labeling (all content, including images, is hosted on the same domain, for example)? Regards,
Magnus
Technical SEO | m.m
-
Coming soon SEO
Hi, I was wondering what the best practice is for preserving link juice when redirecting all the pages of your website to a coming-soon page. The coming-soon page will sit at domain.com itself, not in a subfolder. Should I move the entire website to a subfolder and redirect that folder to the coming-soon page? Thanks
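(A hedged .htaccess sketch of one common approach - send every request except the homepage to the root, using a temporary 302 while the coming-soon page is up, and switching to 301 only if the old URLs are gone for good:

RewriteEngine On
RewriteCond %{REQUEST_URI} !^/$
RewriteRule ^.*$ / [R=302,L]

The RewriteCond keeps the homepage itself from redirecting in a loop.)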
Technical SEO | bigrat95
-
What is the best way to deal with an event calendar
I have an event calendar that has multiple items repeating into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming that it's flagging duplicate elements way into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
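(A hedged illustration of one common fix, assuming the main class page should collect the ranking signals - each far-future occurrence can point its canonical at the main class page; the URL below is hypothetical:

<link rel="canonical" href="http://domain.com/classes/intro-class/" />

That consolidates the duplicate titles and descriptions onto one URL while keeping the calendar itself crawlable.)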
Technical SEO | categorycode
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/$1 [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:

A) When linking from a page to my frontpage (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?

B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?

C) When I define the canonical URL, should I define it simply as "http://domain.com/products/", or should I in this case link to the actual file, "http://domain.com/products/index.php"?

Is A) B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | inlinear
-
Way to spider a WordPress site
I have an old WordPress site and I want to move it to a new server and take it off WordPress (too many hacks). I am trying to spider the site so as to get static, non-WordPress pages. I am having trouble doing this. When I spider the site, it changes the URLs. For instance, if the URL is www.domain.com/page/ the URL I get out of the spider is /page/index.html - and those are not the URLs in the search engine indices. There are about 2000 pages on this site, so it is not feasible to set up 301 redirects. I tried using these spidering programs: WinHTTrack Website Copier and PageNest. Does anyone know of another method of turning a WordPress site into a non-WordPress site?
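(A hedged suggestion: wget's mirror mode can produce a static copy - domain.com below is a placeholder:

wget --mirror --page-requisites --adjust-extension --convert-links --no-parent https://www.domain.com/

The copy is still saved as /page/index.html on disk, but when it is served by a normal web server, requests for /page/ resolve to that index file automatically, so the URLs in the search engine indices keep working without 301s.)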
Technical SEO | DanCrean
-
URLs in Greek, Greeklish or English? What is the best way to get great rankings?
Hello all, I am Greek and I have a quite strange question for you. Greek characters are generally recognized as special characters and need UTF-8 encoding. The question is about the URLs of Greek websites. According to the advice of the Google Webmasters blog, we should never put raw Greek characters into the URL of a link. We should always use the encoded version if we decide to have Greek characters, or just use Latin characters in the URL. Having Greek characters un-encoded could cause technical difficulties with some services, e.g. search engines or other URL-processing web pages. To give you an example, let's look at A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1, which is the URL with the encoded Greek characters; it shows up in the browser as B) http://el.wikipedia.org/wiki/Ελβετία. The problem with A is that every time we need to copy the URL and paste it somewhere (in an email, in a social bookmarking site, in a social media site, etc.) the URL appears like A, full of strange characters and % signs. This link may sometimes cause broken-link issues, especially when we try to submit it to social networks and social bookmarks. On the other hand, Googlebot reads that URL, but I am wondering if there is an advantage for the websites that keep the encoded URLs (in comparison to the sites that use Greeklish in the URLs). So the question is: for SEO purposes, is it better to use Greek characters (encoded like http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1) in the URLs, or would it be better to use just Greeklish (for example http://el.wikipedia.org/wiki/Elvetia)? Thank you very much for your help! Regards, Lenia
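(For reference, JavaScript's encodeURIComponent and decodeURIComponent map between exactly these two forms:

console.log(encodeURIComponent('Ελβετία'));
// "%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1"
console.log(decodeURIComponent('%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1'));
// "Ελβετία"

The two URLs identify the same resource; browsers simply decode the percent-encoded form for display.)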
Technical SEO | tevag