We look at how website speed affects your SEO and how it directly impacts your Google search ranking.
Contents
- Speed and why Google wants it
- What are Core Web Vitals
- Mobile-responsiveness
- Safe Browsing
- HTTPS Security
- Intrusive Interstitial Guidelines
- Largest Contentful Paint (LCP)
- First Input Delay (FID)
- Cumulative Layout Shift (CLS)
- How to measure your own site
- The controversial bit
- Conclusion
You’ve no doubt heard about the importance of Search Engine Optimisation (SEO) when it comes to creating a thriving website for your business. The world of SEO stretches far and wide, and there are countless strategies you can employ to make your website as discoverable as possible. One of the key attributes your website now needs is speed.
Speed has always been an important ranking factor for Google, but it has long been treated as something of a rumour in the SEO world. Now Google themselves have come out and said that website speed directly affects ranking.
In 2020 Google announced ‘Core Web Vitals’, a set of speed metrics that, together with the following existing signals, make up the overall ‘page experience’:
- Mobile-responsiveness
- Safe Browsing
- HTTPS website security
- Intrusive interstitial guidelines
I know, I know, you’re probably thinking “what does any of that even mean?” Fortunately we can explain that for you now…
Mobile-responsiveness ultimately means that the website is fully accessible and can be easily navigated on a mobile phone. Websites are full of JavaScript, CSS and many other things, some of which are intrinsic to the overall design. The mobile-responsiveness test ensures that any content that appears on a phone is easily readable, easily tappable (using your thumb) and that nothing is blocked.
With over 56% of web traffic now coming from mobile devices, ensuring your website is mobile-friendly has never been more important.
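For the more hands-on reader, here’s a rough sketch (TypeScript, run in the browser) of the kind of checks this involves. The 48px tap-target size reflects Google’s commonly cited recommendation; the element selection below is our own simplification, not Google’s actual test.

```typescript
// Rough sketch of two mobile-friendliness checks, run in the browser.
// The 48px tap-target size follows Google's general guidance; the element
// selection below is our own simplification, not Google's actual test.

// 1. A responsive page should declare a viewport meta tag.
const viewport = document.querySelector('meta[name="viewport"]');
if (!viewport) {
  console.warn('No viewport meta tag found – the page may not scale on phones.');
}

// 2. Interactive elements should be comfortably tappable with a thumb.
const MIN_TAP_TARGET_PX = 48; // Google's commonly cited recommendation
document.querySelectorAll<HTMLElement>('a, button').forEach((el) => {
  const { width, height } = el.getBoundingClientRect();
  if (width > 0 && height > 0 && (width < MIN_TAP_TARGET_PX || height < MIN_TAP_TARGET_PX)) {
    console.warn(`Small tap target (${Math.round(width)}x${Math.round(height)}px):`, el);
  }
});
```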
Google’s security team created a service called ‘Safe Browsing’, built to identify unsafe websites across the internet and to warn users before they run into potential harm. The system systematically scans websites for malicious code and known exploits that could harm visitors who land on those sites.
We check everything that goes into one of our websites, so you can feel confident that nothing bad will happen to your site if it’s hosted with us.
As websites become more and more complex in their functionality and design, security is paramount, and Google wants all websites to use encryption – i.e. the little padlock you see in the address bar (you’ll see it on this site right now). Unlike traditional HTTP (Hypertext Transfer Protocol), HTTPS (Hypertext Transfer Protocol Secure) adds an additional layer of security by delivering data over an encrypted connection.
If you have a website and you don’t have an SSL (Secure Sockets Layer) certificate and its padlock, Google is far less likely to recommend you in SERPs (Search Engine Results Pages), so now is the time to get one!
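If you’re curious what the redirect side of this looks like, here’s a minimal sketch using Node’s built-in http module (the fallback host name is just a placeholder); in practice your web server or host usually handles this for you:

```typescript
// Minimal sketch: redirect all plain-HTTP traffic to HTTPS using Node's
// built-in http module. Real setups usually do this in the web server
// (nginx, Apache) or at the host/CDN, but the principle is the same.
import http from 'node:http';

http.createServer((req, res) => {
  const host = req.headers.host ?? 'example.com'; // placeholder fallback host
  // 301 tells browsers and search engines the move is permanent.
  res.writeHead(301, { Location: `https://${host}${req.url ?? '/'}` });
  res.end();
}).listen(80);
```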
Intrusive interstitials are quite an extensive subject, but essentially this guideline is about user experience: helping users find the content they’re looking for, or rather, in this case, ensuring you are not blocking it with intrusive popups or annoying, nagging messages.
Google’s ultimate aim is to provide users with answers to their queries, and if you have a popup that blocks that content on first load, Google will penalise you for it.
OK, Back to Speed
Now that we’ve covered the other parts of the page experience signal, we can better understand how speed plays an important part in the whole experience.
If you’re not that techie, we apologise for the next bit and we wouldn’t think any less of you if you skipped it! For those who are a bit more on the geeky side, let’s dig in!
When Google measures the speed of your website, it uses a whole host of methods (including the above), but three of the most important are:
Largest Contentful Paint, or LCP, measures how long it takes for the largest content element to appear on the screen. This is not to be confused with how long the rest of the page takes to load – it’s about how long the largest, most important element of that page takes to load and, ultimately, display.
As a very crude and basic example, if you had a web page with a heading and an image and nothing else, that image would be considered the LCP element, as it would be the largest piece of content to load.
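For the more technically minded, here’s a minimal sketch of how LCP can be observed in the browser using the standard PerformanceObserver API – the same underlying data Google’s tooling reads:

```typescript
// Minimal sketch: observe Largest Contentful Paint in the browser.
const lcpObserver = new PerformanceObserver((entryList) => {
  const entries = entryList.getEntries();
  // The last entry is the current LCP candidate; it can change as larger
  // elements render, and is finalised once the user interacts with the page.
  const lastEntry = entries[entries.length - 1];
  console.log('LCP candidate:', lastEntry.startTime, 'ms');
});
lcpObserver.observe({ type: 'largest-contentful-paint', buffered: true });
```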
First Input Delay is the time it takes for the web browser to respond to the user’s first interaction – a click, a tap, a key press and so on (continuous actions such as scrolling don’t count towards FID). Ultimately, the faster this happens, the snappier and more responsive the website will feel. You have no doubt visited sites where you weren’t able to actually do anything until everything had loaded – that’s a long FID, which Google now penalises and which is certainly not a great user experience. By some estimates, slow websites cost retailers nearly £60 billion in lost sales each year.
As we’ve mentioned, websites have become far more complex in both their design and functionality, and a lot of these cool features come at a cost in the form of JavaScript, CSS and lots of high-resolution imagery. JavaScript is often the main cause of a long FID, as some core functionality, such as menus, can only be triggered or displayed once the script has run. This is where a well-coded, well-optimised website really stands out.
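Again for the geeks, a minimal sketch of how FID can be captured with the same PerformanceObserver API:

```typescript
// Minimal sketch: measure First Input Delay – the gap between the user's
// first interaction and the moment the browser could start handling it.
const fidObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    const e = entry as PerformanceEventTiming;
    const delay = e.processingStart - e.startTime; // the First Input Delay
    console.log('FID:', delay, 'ms');
  }
});
fidObserver.observe({ type: 'first-input', buffered: true });
```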
The previous two measurements, LCP and FID, have always been fairly well intertwined with overall website speed, whereas CLS is a newer metric. Cumulative Layout Shift effectively measures how stable the page layout is while it loads. To give an example: if you have a web page with a few sections, a few images and your main CTA (Call to Action) button somewhere in the middle, and a user tries to click that button before the rest of the content has loaded, the button will probably shift around, leading to frustration as they try to click it.
Another classic example of this, and one that some websites have used intentionally (tut tut), is website ads. You will no doubt have been on a website where you’re trying to download a file or press a button, only for something else to load just as you click, so you hit an ad by mistake. This ‘shift’ is what Google is trying to phase out, as they believe it harms the overall user experience.
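And the same idea for CLS – a minimal sketch that adds up layout shifts as they happen (the LayoutShift entry type isn’t in TypeScript’s standard DOM typings yet, so we declare the two fields we need ourselves):

```typescript
// Minimal sketch: accumulate Cumulative Layout Shift in the browser.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;           // how large the shift was
  hadRecentInput: boolean; // shifts right after user input don't count
}

let clsScore = 0;
const clsObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) {
      clsScore += entry.value;
      console.log('Cumulative layout shift so far:', clsScore);
    }
  }
});
clsObserver.observe({ type: 'layout-shift', buffered: true });
```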
There are numerous tools that will measure the speed of your website and score it based on some of the criteria above, as well as some metrics of their own. A few of the best sites to use are:
- Google PageSpeed Insights – Google’s own site that gives insights into how they measure your website. It uses Lighthouse as the tool to suggest where improvements can be made.
- GTmetrix – Audits your site to suggest where improvements can be made.
- Pingdom – Similar to GTmetrix.
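If you’d rather measure these metrics from inside your own pages, Google also publishes an open-source ‘web-vitals’ npm package that wraps up the browser observers shown above. A minimal sketch, assuming v3 of the package (newer versions replace onFID with onINP):

```typescript
// Minimal sketch using Google's open-source 'web-vitals' package (v3 API).
import { onCLS, onFID, onLCP } from 'web-vitals';

onLCP((metric) => console.log('LCP:', metric.value, 'ms'));
onFID((metric) => console.log('FID:', metric.value, 'ms'));
onCLS((metric) => console.log('CLS:', metric.value));
```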
One important thing to remember is that when Google tests your website, it artificially throttles (slows down) the connection to simulate a kind of worst-case scenario. Obviously, the faster your internet connection and computer or mobile device, the quicker a website will load – but Google needs to understand how it performs for people who don’t have such a fast connection or device.
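You can reproduce this simulated throttling yourself by running Lighthouse programmatically. A hedged sketch, assuming the ‘lighthouse’ and ‘chrome-launcher’ npm packages and an ESM setup; option names can vary between Lighthouse versions:

```typescript
// Hedged sketch: run a Lighthouse performance audit from Node with its
// default simulated throttling. Assumes the 'lighthouse' and
// 'chrome-launcher' npm packages; option names may vary between versions.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
const result = await lighthouse('https://example.com', {
  port: chrome.port,
  onlyCategories: ['performance'],
  throttlingMethod: 'simulate', // Lighthouse's default: simulated slow 4G
});
console.log('Performance score:', result?.lhr.categories.performance.score);
await chrome.kill();
```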
The slightly controversial bit: is it really fair to purposely limit speed when the large majority of people do have a reasonably fast connection, or a mobile device capable of handling most modern sites with ease? Is it fair to penalise sites (in ranking) that load in a flash on desktop over broadband, but perhaps struggle a little on a 3G connection out in the middle of nowhere?
Ultimately, search engines’ algorithms are extremely complex in how they rank a website, but are they really less likely to show a hugely popular website, or one with fantastic content that answers users’ queries, purely because it falls slightly foul of Core Web Vitals? I guess we’ll start finding out as Core Web Vitals rolls out…
We’ve covered how and why Google wants everyone’s sites to be super fast and super snappy, and how that is directly linked to how a site will rank in SERPs. Most websites can be made faster through some simple fixes such as caching (server and page), delivering some content via a CDN (Content Delivery Network), optimising images before they’re uploaded, and only running scripts where absolutely necessary.
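As a flavour of how simple some of those fixes can be, here’s a minimal sketch of long-lived cache headers for static assets, using Node’s built-in http module; in practice you’d usually set these in your web server or CDN configuration instead:

```typescript
// Minimal sketch: long-lived cache headers for static assets using Node's
// built-in http module. Real setups usually configure this in the web
// server (nginx, Apache) or CDN rather than in application code.
import http from 'node:http';

http.createServer((req, res) => {
  if (req.url?.match(/\.(css|js|png|jpe?g|webp|svg)$/)) {
    // Static assets: let browsers and CDNs cache them for a year.
    res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
  } else {
    // HTML pages: revalidate on every request so content stays fresh.
    res.setHeader('Cache-Control', 'no-cache');
  }
  res.end('...'); // actual asset serving elided in this sketch
}).listen(8080);
```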
However, it’s really important not to get too sucked into chasing that perfect score. Very few sites ever achieve one, and some of the really large websites out there perform really poorly, yet they appear at the top of the search results time and time again.
We are at the mercy of Google and how their algorithm works, so all we can do is make our sites as ‘Google-friendly’ as we possibly can – chasing a perfect vanity score is a futile exercise.
If your website isn’t performing as well as you’d like, get in touch and we can help.