Posted by Tom Anthony
During a discussion with Google’s John Mueller at SMX Munich in March, he shared an interesting bit of information about how Google evaluates site speed nowadays. It got a fair bit of interest when I mentioned it at SearchLove San Diego the following week, so I followed up with John to clarify my understanding.
The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regard to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.
Previously, it was not clear how Google evaluated site speed; it was generally believed to be measured by Googlebot during its visits — a belief reinforced by the presence of speed charts in Search Console. However, the advent of JavaScript-enabled crawling made it less clear what Google was doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was).
In this post, I want to quickly recap the pertinent details of this news and try to understand what it may mean for users.
Google Search Console
Firstly, we should clarify our understanding of what the "time spent downloading a page" metric in Google Search Console is telling us. Most of us will recognize graphs like this one:
Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):
John clarified what this graph is showing:
It’s technically not "downloading the page" but rather "receiving data in response to requesting a URL" - it’s not based on rendering the page, it includes all requests made.
And that it is:
this is the average over all requests for that day
Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.
For that reason, John points out that:
Focusing blindly on that number doesn’t make sense.
With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).
Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.
Googlebot & the Web Rendering Service
Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like "Fetch as Google" in Search Console, and is increasingly what Googlebot is using when it crawls pages.
However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that the WRS is not responsible for evaluating site speed.
At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.
Chrome User Experience Report
Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”
Essentially, certain Chrome users allow their browser to report load-time metrics back to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public dataset.
In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.
However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though he notes that this is in part because the dataset does not cover all domains).
At SMX, John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report.
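If you want to pull the same CrUX-backed field data programmatically, the PageSpeed Insights API exposes it alongside the usual lab results. Here is a minimal sketch, assuming a runtime with a global fetch(); the endpoint version and the loadingExperience response field are my reading of the API documentation, so verify them against the current reference before relying on this:

```typescript
// Sketch: fetch the real-user (CrUX-backed) metrics that PageSpeed Insights reports for a URL.
// Assumes a runtime with a global fetch(); the endpoint version and the loadingExperience
// field should be checked against the current API reference.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v4/runPagespeed';

async function fetchLoadingExperience(pageUrl: string, apiKey: string): Promise<void> {
  const params = new URLSearchParams({ url: pageUrl, key: apiKey });
  const response = await fetch(`${PSI_ENDPOINT}?${params.toString()}`);
  if (!response.ok) {
    throw new Error(`PageSpeed Insights request failed with status ${response.status}`);
  }
  const body = (await response.json()) as { loadingExperience?: unknown };
  // loadingExperience holds the aggregated Chrome user data; lab data lives elsewhere in the response.
  console.log(JSON.stringify(body.loadingExperience, null, 2));
}

// Example usage ('YOUR_API_KEY' is a placeholder):
fetchLoadingExperience('https://www.example.com/', 'YOUR_API_KEY').catch(console.error);
```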
The public dataset of performance data for the top million origins is also available in a public BigQuery project, if you’re into that sort of thing!
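If you would like to poke at that dataset yourself, here is a minimal sketch using the official Node.js BigQuery client. It assumes you already have Google Cloud credentials configured; the monthly table name, the first_contentful_paint histogram columns, and the example origin follow the dataset's published schema and are placeholders to adapt:

```typescript
// Sketch: sum the first contentful paint (FCP) histogram for a single origin in the
// public CrUX dataset. Table and column names follow the documented chrome-ux-report
// schema; the month suffix and origin below are placeholders.
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function fcpHistogram(origin: string): Promise<void> {
  const query = `
    SELECT bin.start AS bin_start, SUM(bin.density) AS density
    FROM \`chrome-ux-report.all.201801\`,
      UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = @origin
    GROUP BY bin_start
    ORDER BY bin_start`;

  const [rows] = await bigquery.query({ query, params: { origin } });
  // Each row is one histogram bucket: the share of page loads whose FCP fell in that bin.
  for (const row of rows) {
    console.log(`${row.bin_start} ms: ${(row.density * 100).toFixed(2)}% of loads`);
  }
}

fcpHistogram('https://www.example.com').catch(console.error);
```

Running the same query across several monthly tables is a quick way to see whether your real-user load times are trending in the right direction.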
We can’t be sure what other factors Google is using, but we now know they are certainly using this data. As I mentioned above, I also imagine they have data on more sites than are included in the public dataset, but this is not confirmed.
Pay attention to users
Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, but which Google can still detect and use as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements your users would get from deploying HTTP/2.
The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.
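To make that concrete, here is a bare-bones, cache-first service worker sketch: Googlebot will never execute it, but opted-in Chrome users will feel the faster repeat loads, and that is what ends up in the CrUX data. The asset paths are made up, and a real setup would need a more careful caching strategy:

```typescript
// sw.ts: a minimal cache-first service worker sketch (asset paths are placeholders).
// Assumes compilation with the "webworker" lib (and without the DOM lib) so the
// service worker globals are typed correctly.
/// <reference lib="webworker" />
declare const self: ServiceWorkerGlobalScope;

const CACHE_NAME = 'static-v1';
const PRECACHE_URLS = ['/', '/css/main.css', '/js/app.js'];

// Pre-cache a handful of static assets when the worker is installed.
self.addEventListener('install', (event: ExtendableEvent) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

// Serve matching requests from the cache, falling back to the network.
self.addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```

The page would register it with navigator.serviceWorker.register('/sw.js'). From Googlebot's point of view nothing changes, but real users get faster repeat visits.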
Essentially, this means that there’s no longer a reason to worry about page speed for Googlebot; you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, but that is a separate task.
If you are unsure where to look for site speed advice, then you should look at:
- How fast is fast enough? Next-gen performance optimization - the 2018 edition by Bastian Grimm
- Site Speed for Digital Marketers by Mat Clayton
That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!