TTFB Shouldn’t Matter, Test This Instead For Pagespeed
Time To First Byte (TTFB) is a metric used to measure how quickly a web server responds to a visitor's request. It is used as an industry metric when discussing website speed or pagespeed, and it is a core metric reported by popular site speed measurement tools (Pingdom, PageSpeed Insights, etc.). It sounds like it should be a good measurement of how quickly a page loads for potential visitors, and something webmasters and publishers should be paying close attention to, but it's really not.
In fact, if you're concerned about SEO, visitor experience, and ad revenue, it really isn't nearly as important as measuring a handful of other things.
What you really want to understand is how quickly the visitors are getting the content, how quickly they can interact with the content, and how quickly ads and other important scripts load. These are the things that typically affect SEO, UX, and ad revenue the most (relating to speed).
Below, I'll show you what metrics you should be looking at instead of the arbitrary page speed tool scores, and how you can track them. I'll also tell you why these metrics are more impactful for things like SEO, compared to TTFB, which by Google's own account isn't currently a factor in search ranking at all.
Google's top search expert comments on TTFB
AFAIK we currently don’t use TTFB for anything in search/ranking. It can be a good proxy for user-facing speed, but like other metrics, don’t blindly focus on it.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) November 30, 2017
What does testing TTFB measure?
TTFB measures the duration from when the user or client makes an HTTP request to when the first byte of the response is received by the client's browser. From the end user's perspective, TTFB is almost useless.
TTFB is actually only accurately measured at the server itself, so that network latency isn't factored into the calculation. This is why a single website can be tested by common website speed testing tools at different locations around the globe and get totally different results. Measuring TTFB from an internet connection using a tool means you're also measuring your network latency at the same time, which obscures the thing TTFB is actually measuring (and we haven't even talked about how CDNs affect this).
Beyond the fact that most website owners or webmasters are probably not getting true TTFB scores when they measure it themselves, TTFB largely isn't measuring what they think (or hope) it measures: how fast their website is.
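You can see this split in the browser's own Navigation Timing data. The sketch below uses the standard `PerformanceNavigationTiming` field names; the helper name and the mock-friendly shape are my own, and the numbers a real page produces will vary with every connection:

```javascript
// Given a PerformanceNavigationTiming-like entry, compute two views of TTFB:
// one that includes network latency (what most speed-test tools report) and
// one that isolates the server's response time.
function ttfbBreakdown(nav) {
  return {
    // From the start of navigation to the first response byte:
    // includes DNS lookup, TCP/TLS setup, and the network round trip.
    withLatency: nav.responseStart - nav.startTime,
    // From the moment the request was sent to the first response byte:
    // much closer to what the server itself would measure.
    serverOnly: nav.responseStart - nav.requestStart,
  };
}

// In a real page you would feed it the live navigation entry:
//   const [nav] = performance.getEntriesByType('navigation');
//   console.log(ttfbBreakdown(nav));
```

The gap between the two numbers is exactly the latency that a remote testing tool silently folds into its "TTFB" score.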
The problem with measuring TTFB
When TTFB is reported, what is being measured is not the time at which the first byte of usable page content arrives. It's actually the time of the first response to the HTTP request.
What actually matters for visitors, SEO, and all of your ads and scripts is how quickly the data loads from a practical standpoint. This is a true measurement of practical website speed.
For example, if your TTFB is fast but your Google Analytics script and page content are still slow to load, the visitor may bounce because of the page's speed and you'll never know it, because the analytics script never fired and recorded the event. This is one of the things Google is really big on measuring and recording (they said as much at Pubtelligence).
So, the problem with TTFB is that it has no effect on SEO, revenue, or the visitor’s experience and may ultimately trick you into believing you have a page or site that is fast — or fast enough — when it actually isn’t.
What actually matters to search engines and visitors?
I've mentioned ad nauseam above how webmasters should actually be trying to determine how quickly visitors are accessing content, engaging with content, and loading important data. This is ultimately what makes users happy, and also what search engines like Google are most interested in.
Here’s a good case study.
This site actually made its scores worse in a tool like Pingdom by adding common pagespeed features like lazy-loading.
The improvements led to a decrease in the ACTUAL time it took for the page to become interactive to the visitor (access the content, scroll, see ads, etc.). We will get more into this below.
This actually led to improvements in session duration and decreases in bounce rate, and the site went on to increase the total number of organic keywords it ranked for after the changes were made.
This is a great example of a website that saw its pagespeed score through a tool like Pingdom (or Google's own PageSpeed Insights) get worse, yet saw the actual speed of the page for the visitor improve. This led to better visitor experience metrics overall and improvements in organic search positioning.
Pagespeed scores are relative to every search query
In fact, Google has said in the past that if they record that a visitor is willing to wait a long period of time for content to load, it will not downgrade a "slow" site. This gives insight into how Google is measuring and thinking about pagespeed. It is most concerned about how quickly visitors are accessing and able to interact with the content. Ultimately, it is a relative stat, and Google specifically has touched on how this is weighted differently for every query.
Additionally, website owners should also be concerned about additional data and how quickly it loads. Things like ads and important scripts need to load quickly as well. Fast-loading ads obviously affect revenue, but analytics scripts and other important code can play an important role in a visitor's experience and in how a webmaster records those events.
Practically measuring how fast a website really is…
One of the best ways for publishers to measure and think about delivering content and data quickly is by accounting for DOM (Document Object Model) loading.
What is DOM?
“Document Object Model (DOM) is a World Wide Web Consortium standard platform and language-neutral interface that allows programs and scripts to dynamically access and update the content, structure, and style of a document.”
Why is measuring DOM important for webmasters?
DOM is a standard object model and programming interface for HTML. It defines:
- The HTML elements as objects
- The properties of all HTML elements
- The methods to access all HTML elements
- The events for all HTML elements
Basically, once the DOM is loaded, the user can see the page the way it is supposed to be displayed, and all of the JS and HTML elements are fully functioning for the session.
How should DOM be measured?
There are two DOM timing metrics that matter for all of the things we talked about above (SEO, visitor experience, ad revenue):
- DOM Complete
- DOM Interactive
DOM Complete means the page and all of its subresources are ready and loaded. This is the point where all processing is complete and all of the images, etc. have finished loading. This isn’t as important as DOM Interactive, but it is still worth tracking.
DOM Interactive is the time until the browser has finished parsing all of the HTML and DOM construction is complete. The user can now see and engage with content prior to some of the images, CSS, and other subresources being loaded.
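Both milestones are exposed by the same Navigation Timing entry the browser already records for every page load. A minimal sketch (the field names follow the standard `PerformanceNavigationTiming` interface; the helper itself is illustrative):

```javascript
// Given a PerformanceNavigationTiming-like entry, pull out the two DOM
// milestones discussed above, in milliseconds since navigation start.
function domTimings(nav) {
  return {
    // HTML parsed and DOM constructed: the user can engage with content.
    domInteractive: nav.domInteractive - nav.startTime,
    // All subresources (images, CSS, etc.) finished: page fully loaded.
    domComplete: nav.domComplete - nav.startTime,
  };
}

// In a real page:
//   const [nav] = performance.getEntriesByType('navigation');
//   console.log(domTimings(nav));
```

DOM Interactive will always come in at or before DOM Complete, and the distance between the two is roughly how long visitors wait for the non-essential parts of the page.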
Why and how to measure DOM instead of TTFB
Measuring and testing those two DOM metrics will provide much better insight into the time it takes for pages to display content and become interactive for visitors, as well as the complete time it takes for the page to be fully functional.
This is important because it is not always important for the entire page to load right away. This sounds crazy, but the truth is that visitors and search engines are really only looking for one thing to load quickly: the content they came for. All the other parts of the page that might be slowing down the website, causing higher bounce rates, and dragging down UX metrics can load later, as long as the most important parts of the page load first.
We looked at this in a mini case study a while back. I've previously helped a lot of sites implement things like lazy-loading to ensure that content displays quickly, even though metrics like TTFB and tool-reported load times might actually look worse. This means pagespeed tools and even Google Search Console may show a site as taking longer to load, but in reality, users are getting content faster as a whole.
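Lazy-loading itself is straightforward. One common pattern, sketched below, swaps a placeholder `data-src` attribute for the real `src` only when an image scrolls into view; the attribute convention and helper name are my own choices, not a specific library's API:

```javascript
// Pure helper: given IntersectionObserver entries, return the elements that
// have scrolled into view and should have their real image swapped in.
function selectImagesToLoad(entries) {
  return entries.filter((e) => e.isIntersecting).map((e) => e.target);
}

// Browser wiring (sketch): observe every image carrying a data-src attribute.
// Guarded so the pure helper above can also be exercised outside a browser.
if (typeof IntersectionObserver !== 'undefined' && typeof document !== 'undefined') {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const img of selectImagesToLoad(entries)) {
      img.src = img.dataset.src; // swap in the real image
      obs.unobserve(img);        // each image only needs loading once
    }
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
}
```

Because below-the-fold images are no longer fetched up front, the content visitors actually see renders sooner, even though a tool that waits for every subresource may report a longer total load.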
These changes often result in lower bounce rates and more pageviews per visit: classic UX metrics for website owners. While Google and other search engines are more sophisticated than just measuring these metrics, they are accounting for how pagespeed affects user behavior, and in this case the effect is positive. Remember Google's comments earlier: a slower page won't be downgraded if users are willing to wait for the content. The point is, Google is not scoring pagespeed; it is looking at behavior.
The goal of the webmaster should be to leverage all the information they can to improve the website's usability for the actual visitor. This is what impacts search results the most, not a TTFB score or a tool score.
Testing and measuring DOM
All Ezoic users can access this right now by logging into their dashboards and then going to Reporting > User Experience and then selecting DOM Complete or DOM Interactive for any given date range, page, etc.
If you're not using Ezoic, you can sign up and try out the reporting (it's free), or you can use this article written by our friend at Google, Ilya Grigorik, which outlines how to run a request for your DOM (and more).
Probably worth noting that this is something Google’s webmaster team has spent a lot of time explaining and emphasizing to publishers 😉
Summarizing the information on TTFB vs. DOM stats
So there it is. We certainly aren't the only ones talking about how irrelevant TTFB is. Even though Moz did a study on TTFB and its relationship to SEO a while back, they have since admitted that any correlation likely has nothing to do with TTFB itself. Additionally, Cloudflare has talked about how TTFB can get worse while actual pagespeed gets better in many cases.
DOM is what the Google webmaster team seems to be spending a lot of time talking about these days, and DOM Complete and DOM Interactive provide a better look at how quickly visitors can access content and when critical page resources are fully loaded. This better accounts for optimizing loading around visitor engagement and SEO, instead of arbitrary loading times.
Questions, concerns, thoughts? Leave them below.