The perfect 100% score on both desktop and mobile is not feasible! Now, I know you may disagree with this statement and say that it is technically possible to achieve because you’ve done it. Well, you’d be correct, as we’ve achieved it too, and ideally everyone should be aiming for it, but please hear me out on this.
We’ve spent a lot of time enabling caching and setting up our server configuration correctly, using picture tags with srcset to deliver optimised images for different browser sizes, and minifying all of our HTML, CSS, and JS, as well as applying all of the other optimisations that Google has thrown at us via its PageSpeed Insights tool. Finally we hit the fabled 100 out of 100 rating on both desktop and mobile, and we were ready to push all of our changes from our dev environment to our live site. Great! However, this is where we suddenly found out that we’d been deceived by Google.
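To illustrate the responsive image technique mentioned above, a picture tag with srcset might look something like this. The file names, sizes, and breakpoints here are purely illustrative placeholders, not our actual markup:

```html
<!-- Hypothetical file names and breakpoints, for illustration only -->
<picture>
  <!-- Serve a smaller image to narrow viewports -->
  <source media="(max-width: 600px)"
          srcset="hero-600.webp 600w"
          type="image/webp">
  <!-- Mid-size viewports get a larger variant -->
  <source media="(max-width: 1200px)"
          srcset="hero-1200.webp 1200w"
          type="image/webp">
  <!-- Fallback for browsers without <picture> or WebP support -->
  <img src="hero-1920.jpg" alt="Page hero" loading="lazy">
</picture>
```

The browser picks the first matching source, so mobile visitors never download the full-size desktop image.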
All of our optimisations are in place, but we are now seeing only 99 out of 100 on the live site! Why has this happened, Google?
Oh, I see… It appears that Google Tag Manager, Google Analytics, and HotJar are all dragging our score down. At this point I’d like to note that none of these run on our dev site, which is why we were able to achieve a perfect score there.
Leverage Browser Caching
OK, so how do we leverage browser caching for these resources? We need to set expiry dates in the HTTP headers, but these rules are already in place on our domains, and every file we control is compliant. These particular files are hosted on other domains, though, and it appears that their owners haven’t set any expiration dates or maximum ages.
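For reference, the kind of expiry rules we mean look something like the following. This is a hedged sketch of an Apache `.htaccess` fragment using mod_expires; the exact lifetimes are illustrative assumptions, and your own server configuration (nginx, IIS, etc.) will differ:

```apache
# Illustrative cache lifetimes; requires mod_expires to be enabled
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change, so cache them for a long time
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  # CSS and JS change more often; use a shorter lifetime
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

Rules like these cause the server to send Expires and Cache-Control headers, which is exactly what PageSpeed Insights checks for, but they can only be applied to files served from your own domain.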
We don’t have any control over these files, so the only feasible option available to us would be to download each file and serve it from our own server. This isn’t really an option, though, because any updates made within your Google account would no longer be applied. The files would be static and would no longer respond to changes in those services, so it isn’t a good solution at all. From what we can tell from various Google searches, there isn’t really a way around this. The majority of people seem to be trying to trick the PageSpeed Insights test into ignoring the files, rather than properly optimising their delivery.
As a result, we’ve had no choice but to leave these files alone. This means we don’t get to enjoy our perfect score, which is annoying, but short of removing user tracking (which marketers hate to consider) we are left having to accept that, sadly, this is a problem we cannot fix.
This does, however, raise some warning signs that we’d like to share with you. Google Tag Manager is now in use on many of our sites. At the point of going live with a project, we’ve gotten the site as close to a perfect PageSpeed score as possible. However, after seeing this issue we’ve gone back through a few of those sites and found that their PageSpeed scores are now lower, and we can attribute this mostly to Google Tag Manager.
Just because Google Tag Manager makes it easier to add more tracking scripts to your site, along with lots of tools and reports, it doesn’t mean that you should! We understand that some of these will be essential, but we also want you to understand that they have an impact on your site’s overall speed. If a very poorly optimised tool is added and your site is running slowly, there isn’t much we can do about it. The best course of action is to review the tool and find a better-optimised alternative.
We always strive to get as close as possible to 100 out of 100 on PageSpeed, but when Google’s own products prevent us from achieving this, we can’t help but feel a little irked. Especially when it isn’t just browser caching that they haven’t optimised: a whole host of other areas start to suffer when Google Tag Manager and Google Analytics are used, and a few are affected by HotJar too, another tool we use.
As you can see from the optimisation reports, we now have a whole host of areas suffering because these external scripts are included on our site. These external tools do provide us with great benefits, though, so we will be keeping them in place.
As a developer, though, it annoys me how poorly optimised they are. An even clearer indication comes from looking at our site’s waterfall. A waterfall shows the order in which resources are loaded and how quickly each one completes.
The first half of our waterfall is fairly vertical, which shows we’ve optimised this part very well. With waterfalls, the more vertical they are, the better the site performs, because everything is loading in parallel rather than holding up the page.
The bottom half of the waterfall, however, is another story altogether. This is where Google, HotJar, and Typekit all kick in, and boy do they have an impact. They nearly double the total load time of the page while we wait for these resources to sort themselves out. We just wish some of these tools would improve their integrations, so they wouldn’t impact so negatively on all of our hard work.
Ultimately, yes, a perfect 100% PageSpeed score on both desktop and mobile is technically possible. But we need these various tools to provide us with useful data, and the benefit is much greater than the cost, so we will never truly reach that fabled perfect score on our live site. Is the perfect PageSpeed score realistically achievable on a live site? Definitely not. In fact, we’d go as far as saying it’s impossible!