Matt Cutts, a software engineer and an eloquent corporate spokesman for Google (s goog), spoke at PubCon earlier this month and later gave a video interview to Web Pro News, in which he said that the speed at which web pages load might become a factor in SEO moving into 2010. Because many within Google consider speed vital to the web, he said, the company is considering making web site speed a factor in calculating page rankings. Those comments have confused and scared many folks wondering how speed might impact their businesses.
To be sure, Cutts’ comments don’t offer any details, and it is not even clear if Google will go down that route. (Matt, can you offer some clarification, please?) Still, some are worried that Google is going to turn PageRank into a country club for the rich and penalize smaller sites because they don’t have high-end hosting facilities. Then there is the uneven distribution of backbone connectivity: many parts of the world are not as well-connected as, say, Asia or the United States. Does that mean this new approach to PageRank could penalize sites hosted in places that don’t have abundant connectivity?
Ken Godskind, chief strategy officer of AlertSite, a web site traffic monitoring service, thinks such a move could have a serious impact on many web sites. “The potential changes at Google mean there will be a REAL business impact for poor web site performance, and conversely, in Google’s words, a bonus for good performance,” he writes on the company blog. “Online organizations will need to look closely at themselves and the other parties that participate in the Web application delivery supply chain to understand and manage this new development in 2010.”
On a personal level, I believe that a faster web is good for everyone. At some point in our web journeys we have all cursed slow-loading sites, a problem that is only going to increase as the web becomes more intricately intertwined, in the process becoming a patchwork quilt of diverse services. Performance hiccups at one service can send out ripples of disruption.
Increasingly popular widgets can slow down even the best web offerings, degrading the overall experience, and the growing interdependency of services often causes disruption. Hours after the news of Michael Jackson’s death spread on the web, many news sites became inaccessible or suffered slowdowns. Those sites didn’t crash but were hampered because they were pulling data from ad servers that weren’t prepared for the onslaught of traffic. On our own sites, we have seen things break down when one of our partners suffers an outage or has performance problems.
As a consumer of information, I would say Google giving preference to faster sites doesn’t seem like a bad idea. As a publisher, the high cost of speeding up my web offerings might hurt in the short term, but ultimately it means a better experience for my readers — and there’s nothing wrong with that.
Photo of Matt Cutts at Pubcon 2009 courtesy of PlanetC1 via Flickr
I agree with your last paragraph. But what really bothers me, Om, is that you, Matt, and other top-shelf experts are so used to technology that you forget what it’s like for millions or billions of others. I have noticed this trend around high-tech giants. They fail to appreciate, or perhaps are dismissive of, the decent folks who will publish good content but have NO clue that SEO, site speed, quality linking, etc., even exist. Thank you for your post.

Ed
Ed,
I do point out the issues that will arise because of this, and I completely agree with you on us not knowing the tech reality of the mainstream. But the caveat here is that this need for speed was, and is, a very personal opinion. I think in the end Google will find a middle ground.
I would like to add to what Ed is saying: the tech media seems to be out of step with ordinary users when it comes to broadband speed. While the average user likes higher speeds, speed is secondary to other factors such as coverage and price.

As an example, consider the many unlocked iPhone owners on T-Mobile who never use the Wi-Fi link; for these users, better coverage and pricing far outweigh the speed benefits of 3G. Most people want good-quality basic services.

Perhaps it would make sense for Google to lower page ranks for the very slowest sites, but it would be a disservice if they ended up elevating sites for extreme speed when the ones further down the list are ‘fast enough.’ There should be an acceptable ‘lag’ within which ratings are equal.
Page load speed has been a factor in Quality Score for a long time. Google focuses on the consumer’s experience first, and it has a big interest in people being happy with their search results. If a page doesn’t load, a less-savvy user attributes the failure to Google, not to the particular site.
You are NOT correct, sir. Very slow performance is a documented factor in AdWords quality ranking, not in SEO ranking.
Slow performance needs to be defined: 1) is it the host serving the page, 2) how the page was written (HTML/CSS/JavaScript, etc.), or 3) both?
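To make that distinction concrete, one rough way to separate the host from the page itself is to compare time-to-first-byte (dominated by the server and network) with total fetch time (which adds page weight). A minimal sketch in Python; the URL is a placeholder, and this ignores the browser-side work of parsing and running scripts:

```python
import time
import urllib.request

URL = "http://example.com/"  # placeholder; point at any page to test

start = time.time()
response = urllib.request.urlopen(URL)   # returns once status + headers arrive
first_byte = time.time()                 # ~time-to-first-byte: host + network
body = response.read()                   # full body: adds transfer / page weight
done = time.time()

print(f"time to first byte: {first_byte - start:.3f}s  (mostly the host)")
print(f"full download:      {done - start:.3f}s  ({len(body)} bytes)")
```

A slow first byte points at the host (case 1); a fast first byte followed by a slow, heavy download points at how the page was built (case 2).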
It is simplistic to think that Google would be so binary in its decision. Speed already impacts results, since if your page takes too long to load, people will bounce back and click another result!
Just passing the images on this page through Smush.it would reduce total data by 87K, though the biggest saving would come from resizing the picture of Matt to the dimensions you actually used (see the sketch after this comment).
Google provides tools, as mentioned at PubCon, which you have most likely reported on in the past, such as Page Speed.
A local business in the Philippines is not going to be harmed by this; in fact, they will probably gain from using local hosting, and I am sure Google also has something up its sleeve (a Google CDN?).
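On the image point above: resizing an image to the dimensions the page actually displays, then re-encoding it, is the bulk of what a tool like Smush.it automates. A minimal sketch using the Pillow library; the file names and target size are made up for illustration:

```python
from PIL import Image  # pip install Pillow

SOURCE = "matt-cutts-original.jpg"   # hypothetical input file
OUTPUT = "matt-cutts-small.jpg"      # hypothetical output file
DISPLAY_SIZE = (300, 200)            # the dimensions the <img> tag actually uses

img = Image.open(SOURCE)
img.thumbnail(DISPLAY_SIZE)                   # shrink in place, keeping aspect ratio
img.save(OUTPUT, optimize=True, quality=85)   # re-encode at a reasonable JPEG quality
```

Serving a pre-sized file, instead of letting the browser scale down a full-size original, is usually the single biggest saving.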
Funny, I just read another blog about real-time data not being included in the Google index. This is becoming a common criticism of Google search.
Google focusing on page speed, even if only in comments for now, will raise awareness of design and coding practices, and of how situating hosting resources (including partners’ resources) near a site’s main geographic audience affects page delivery. It can also sharpen SEO awareness of the importance of Google’s country-specific deployments. That awareness would create generalized pressure to bring down page weight, simplify and better focus delivered content, and better balance load throughout the web. If Google is banking on the cloud, as seems to be the case, that kind of awareness has a lot of future value for Google.
(Where infrastructure is dicey due to a late start, stressful natural conditions, or historically unstable economies and politics, server companies may run server locations outside their home countries in order to offer high site availability to customers. Offshore server location means heavier usage of international pipelines and slower page serving when a site’s main audience is within the home country. But from the user’s perspective in places where infrastructure is unreliable, site availability is often more important than page delivery speed, and certainly more important than pipeline usage. Google raising the bar on speed could create an incentive for stressed countries to shore up infrastructure.)
No question that speed is important.
This is again a case of those who have getting more, and those with less getting even less than that.

In other words, the small-business guy working out of his garage may be a genius at what he does, but he is going to have to be a net genius as well.
As an example, perhaps a lame one, it used to be the case that an artisan of great quality could make glass Christmas ornaments and sell them to his community. Perhaps two such artisans could be supported by a mid-sized community. Now, with only a couple of pages of search real estate for such ornaments, the people with the most money (usually big-box stores purchasing ornaments from China) get all the business.
With a big-box store, naturally it is no big deal to hire really great programmers, folks who have the knowledge necessary to make their web sites load like lightning. On the other hand, the ornament genius doesn’t have the resources to hire an IT professional, and has no chance of being ranked high by Google.
I’m not saying that it is impossible to have both quality and speed in the ornament site business. I am saying that the very best artisans are generally going to be out of luck, as they already are.
Whether we are talking about backlinks, loading speed, or any other criterion of that sort, there isn’t any way for Google to get the best product in front of its audience using the kinds of IT tools available to it.
I know I am not saying anything new. Unfortunately, Google puts Walmart number one for paintings and frames. Filipe Zecoro, probably the best painter in the world, currently can’t even be located on the internet. He depends on a very small clientele and starves in his garret, as Google’s programmers can’t find him.
I would think this could be a feature enabled via Google Labs. For example, the user could opt in via their search preferences to have the connection bandwidth sensed for that particular session, with smaller, faster-loading sites weighted more highly in PageRank when connectivity is low.

I would find something like this useful when I travel internationally, as some sites are not as speedy depending on where I am in the world. (Of course, international travelers are a niche demographic, but I imagine this could also be useful in connectivity-constrained localities.)
Page load times have much more to do with site design: poorly written JavaScript, excessive use of third-party stat collectors, ad servers and other widgets. It doesn’t matter how much you pay for bandwidth if you have a poorly designed site, and while the very best designers don’t come cheap, you don’t need to be rich to pay for OK design.

More power to Google, unless this is just an indirect plug for their new protocol to fix HTTP, in which case I call shenanigans!
Performance has little to do with infrastructure, and everything to do with design and implementation. Just look at the work Steve Souders is doing, and what Google and Yahoo have been promoting.
Most of web performance is frontend, not backend.
The pages that would be “punished” are those with large amounts of ads and widgets. That’s not really a bad thing: the more frivolous content a page carries, the stronger the indication that its content doesn’t stand on its own.
The whole argument that infrastructure is directly related to performance died about 15 years ago. All evidence and data says it’s frontend and implementation.
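In that spirit, a quick way to see where a page’s weight lives is simply to count the external resources its HTML pulls in, since each one can mean another request (and often another DNS lookup). A minimal sketch; the URL is a placeholder, and a real audit tool such as Page Speed or YSlow checks far more than this:

```python
import re
import urllib.request

URL = "http://example.com/"  # placeholder; point at any page to audit

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")

# Rough counts of externally referenced resources in the raw HTML.
scripts = re.findall(r"<script[^>]*\ssrc=", html, re.IGNORECASE)
styles = re.findall(r"<link[^>]*stylesheet", html, re.IGNORECASE)
images = re.findall(r"<img[^>]*\ssrc=", html, re.IGNORECASE)

print(f"external scripts: {len(scripts)}")
print(f"stylesheets:      {len(styles)}")
print(f"images:           {len(images)}")
print(f"HTML size:        {len(html) / 1024:.1f} KB")
```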
Infrastructure is a consideration in many countries. If you can’t count on your electricity, you situate your servers in a country with stable infrastructure, then market your servers’ location as a plus. In the meantime, you’re in a booming economy and most of your clients’ markets are local. Your servers can’t be local because local infrastructure hasn’t caught up with the boom, so you’re serving from 7,000 miles away, and page serves take longer due to the distance. You know this, but you can’t afford to keep your servers “at home,” because your availability would drop below 90 percent, with serious failure risks when the weather is extreme, whereas offshore serving gives you 99+% availability and almost no weather-related risk. For example: a page that normally serves from a US-based server in a third of a second when accessed from any US location takes 3 to 6 seconds when accessed from the client’s home market. The client would rather have the slow serves than have the site or service be inaccessible from time to time. This is real, and it’s infrastructure-based.
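The distance alone accounts for much of that gap. A back-of-the-envelope calculation, assuming light travels through fiber at roughly 200,000 km/s and that a page load involves a few dozen round trips (DNS, TCP handshakes, the HTML itself, then batches of images, scripts and CSS):

```python
# Rough propagation delay for serving from ~7,000 miles away.
distance_km = 7000 * 1.609        # ~7,000 miles in kilometers
fiber_km_per_s = 200_000          # light in fiber: roughly two-thirds of c

rtt_s = 2 * distance_km / fiber_km_per_s
print(f"round trip: {rtt_s * 1000:.0f} ms")          # ~113 ms

round_trips = 25                  # assumed: DNS + handshakes + resource batches
print(f"~{round_trips} round trips: {round_trips * rtt_s:.1f} s")  # ~2.8 s
```

That is pure speed-of-light latency, before the server or the page design contributes anything, and it already approaches the 3-to-6-second range described above.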
Shouldn’t Google rank pages by whether the information is most relevant to users, and not impose its ideas about site design? It could easily insert a “warning: slow page” icon if it wanted to let users know before they clicked on a link, but dinging pages for being slow is a step away from impartiality in a valued common resource.
Or just maybe it has to do with a little thing called an agenda. Chrome OS, anyone? What is the biggest underlying complaint about Google Apps compared with Microsoft Office? Speed and reliability, and under the banner of speed you can also penalize downtime.
Having said all of that, I don’t believe they will make it a big differentiator. Too much risk involved in ending up on the wrong side of Bing, since it’s too easy to manipulate via SEO, up to the point that I could place a really slow-loading ad on articles that speak well of my competitor’s product. Hey, there’s an idea. Or maybe I should just get more coffee.
Search is about relevancy, not necessarily usability. Sure, a site that loads faster is better for the user, but relevance should carry the day. How would this affect their PDF indexing?
I have been following this closely since this information became public last week.
While this is an indication that Web site performance and good user experiences are going to be more meaningful in how Web sites are ranked moving forward, it does not mean Google is going to turn the world upside down overnight.
Matt Cutts even suggested, in the first part of the interview about Caffeine, that Google was holding off on updating its search engine until after the holidays so as not to cause undue turmoil prior to the holiday shopping season.
Page load times will be one of many factors Google weighs and considers as part of their search rankings.
This does, however, point to the likelihood that responsiveness will become an institutionalized part of how sites are judged over the long term.
Ken Godskind
http://blog.alertsite.com
Two thoughts: First, if, as seems to be the consensus above, many slowdowns come from in-page ads, isn’t Google essentially de-ranking sites that don’t run its ads? (Don’t be monopolistically evil!) Second, the trouble with ranking sites based on speed is that, given Google’s expected silence about its algorithm, a site’s owners will never know that speed is their misstep. Without obvious feedback, don’t expect typical managers to respond to a problem they don’t know exists.
Hallelujah!
It seems like you could implement a speed penalty based on the traffic a site receives: the more traffic, the faster your site had better be, or the more of a penalty you see (a rough sketch of the idea follows this comment). This would allow smaller sites to be developed without having to worry so much about speed, but would encourage investment in speed once a site gets big enough to support such improvements. A byproduct of this type of penalty would probably be sites having to monetize somehow once they had a bit of traffic.
I do like the idea of a faster web, but am not keen on a metric that may penalize the little guy.
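To make the idea concrete, here is a minimal, purely hypothetical sketch of a traffic-weighted speed penalty. Google has said nothing about how, or whether, speed will enter rankings, and every threshold and weight below is invented:

```python
def speed_penalty(load_time_s: float, monthly_visits: int) -> float:
    """Hypothetical penalty that grows with traffic. All numbers invented."""
    if monthly_visits < 10_000:
        budget = 8.0       # small site: generous load-time budget (seconds)
    elif monthly_visits < 1_000_000:
        budget = 4.0       # mid-size site
    else:
        budget = 2.0       # high-traffic site: strictest budget

    overage = max(0.0, load_time_s - budget)
    return min(1.0, overage / budget)  # 0.0 = no penalty, capped at 1.0

# A 6-second page: no penalty for a small site, heavy for a big one.
print(speed_penalty(6.0, 5_000))        # 0.0
print(speed_penalty(6.0, 5_000_000))    # 1.0
```

Under a scheme like this, the garage operation gets room to grow while the big players are held to a stricter standard.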
You’re forgetting Google is launching SPDY, a way to make websites faster. This is probably a way to get people to use SPDY.
What first came to mind when I read this story is that the speed of loading a page is heavily influenced by the source of the ads the page contains. See TechCrunch for an example.
So, is this a way to get websites to dump slow ad platforms in favor of faster (Google) ads?
Just saw the comments that already touched on this point.
The issue of search ranking and the apparent conflict with Google pushing its own products will get stickier as we go. Everybody knows they are the 800-pound gorilla: Google changes something in page rank and it has ramifications throughout the internet. Should they be able to use this market power to capture share in other product areas?
We have direct peering to the Google backbone at the Zurich SwissIX (Internet Exchange), with very fast access and speed. This could help Google access your server content faster and minimize latency.
Luca Simonetti
http://www.enginenetworks.net
If Google starts checking speed, then they should also check power consumption…

That would mean pages loaded with Flash and white background colors should be ranked lower (to reduce carbon emissions).
http://michkhoo.blogspot.com/2009/11/control-room-temperature-with-computer.html
My blog is on Blogger.com, and Google Analytics tells me that 51 percent of web sites are faster. So perhaps Google should focus on the speed of its own properties first…
This is true; people using Google services risk their sites becoming slower as well.
Is it just Google who is worrying about page speed? I don’t think so. Who doesn’t like speed?
We have always thought about our customers first: how will they feel when they can surf around a site as quickly as looking into their local folders? I think this will be a great lesson for those developers who are just messy about their work, files and code.
Yes – website speed should influence page rank. I hate slow websites, especially those with tons of useless images and scripts.