Site speed matters more than ever; it has been shown repeatedly that the faster a page loads, the better the user experience, which in turn leads to higher conversion rates and more revenue. Speed is also a search engine ranking factor.
So, we’ve optimized all our site code, but the site is still slow? What could be the reason? Often, a page loads slowly despite your performance optimization because of the third-party scripts you use for your website’s functionality or revenue streams.
Third-party code like this is used widely and, in most cases, it adds overhead that degrades performance. Essentially, it is code that has to be downloaded from a different server, parsed, and then executed – and that can take time, which can translate into lost transactions.
How Can We Identify Slow-Loading Third-Party JavaScript?
To reduce the impact of slow-loading third-party scripts, we first need to identify them. In recent years, a range of tools have been developed for this purpose, but the ones that stand out, and that most of today’s technical marketers recommend, are PageSpeed Insights, Lighthouse and Chrome DevTools.
Using PageSpeed Insights
One of the most user-friendly is PageSpeed Insights, which is based on the Lighthouse API. After running the test on your desired page (we recommend testing at least 3 or 4 different URLs to capture as many page templates as possible), look under the Diagnostics section for the audit labelled “Reduce the impact of third-party code” (or check under “Passed audits” if it’s not there).
It’s often useful to investigate it further using Chrome DevTools – which is handily built straight into the Chrome browser.
Using Chrome DevTools
Chrome DevTools is incredibly powerful, holds a wealth of information about any site you want to audit, and can give you real insight into what the browser is actually doing and how much time it spends doing it.
To open Chrome DevTools, right-click anywhere on your page and choose “Inspect”, or press Command + Option + I on Mac, or Control + Shift + I on Windows. We recommend doing any testing in an incognito window so that Chrome extensions don’t skew your page loading times.
In the top right-hand corner, click the Settings icon and tick “Use large request rows” (if it isn’t already ticked); each network request will then show a little more information.
The Performance Report in Chrome DevTools
Once we have a list of JS files we suspect are responsible for slower page load times, it is important to measure the performance impact of removing them, and to check whether anything noticeably breaks when they are gone. Once again, Chrome DevTools comes to our rescue with a feature called network request blocking.
Using our site as an example, in the Network panel you can right-click the desired file and take advantage of the network request blocking feature – Block request URL or Block request domain. Anything already blocked is highlighted in red and listed in the Network request blocking drawer below.
As we mentioned before, blocking a specific script can shave seconds off the total loading time. To measure the impact of removing any script, we need a benchmark: request the desired page 3 times – in incognito, using 3 separate tabs – and take the average of the Load time shown in the bar at the bottom of the Network panel.
Repeat the process with the identified script blocked. The difference between this new averaged Load time and our benchmark gives a rough idea of how many seconds we can save by reducing the number of slow-loading JavaScript files.
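The arithmetic here is simple, but a small helper keeps it honest. A minimal sketch – the Load times below are made-up figures for illustration, not measurements from any real page:

```javascript
// Average several Load-time samples (in seconds) and report the saving
// versus a benchmark average. All numbers here are hypothetical.
function average(samples) {
  return samples.reduce((sum, t) => sum + t, 0) / samples.length;
}

// Three Load times read from the bottom of the Network panel, before blocking
const benchmark = average([4.2, 4.5, 4.1]); // ≈ 4.27 s

// Three Load times with the suspect script blocked
const blocked = average([2.9, 3.1, 3.0]); // 3.0 s

const savedSeconds = benchmark - blocked;
console.log(`Estimated saving: ${savedSeconds.toFixed(2)} s`);
// prints "Estimated saving: 1.27 s"
```

Averaging several runs matters because cache state, network jitter and ad auctions make any single Load time unreliable.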
Improving Loading Times for 3rd Party Scripts
Using the async and defer Attributes
By default, when the browser encounters a script tag it stops parsing the HTML, downloads the script, and executes it before carrying on – so every blocking script delays the page. The async and defer attributes can change this behaviour in your favour:
- With async, the browser downloads the script in parallel while it continues to parse the HTML document. When the download finishes, parsing is paused and the script is executed immediately.
- With defer, the browser also downloads the script in parallel while it continues to parse the HTML document, but the script doesn’t run until parsing is complete (deferred scripts also run in document order).
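For example, the two attributes might be applied like this – the file names and domains are placeholders, not references to any real site:

```html
<!-- Render-critical script: no attribute, so it blocks parsing and runs immediately -->
<script src="/js/jquery.min.js"></script>

<!-- Independent third-party widget: async downloads in parallel and runs
     as soon as it arrives, possibly out of order -->
<script async src="https://widgets.example.com/chat.js"></script>

<!-- Non-critical script: defer downloads in parallel but only runs after
     the HTML has been fully parsed, in document order -->
<script defer src="/js/analytics-wrapper.js"></script>
```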
A caveat for these attributes is that we shouldn’t use them on scripts that are critical to rendering the page and need to execute as early as possible. A good example of where NOT to use them is the jQuery library, which is often used for the layout of a page and is depended on by other scripts.
Also, it is important to thoroughly test any changes to how the browser handles your JS in a staging environment, to be confident that the page still loads properly and there are no side effects.
Using Preconnect and DNS-Prefetch Resource Hints
Going back to browser behaviour: a page is rendered by requesting assets hosted on a number of different domains, and each new domain means a new connection. A new connection takes time – the browser needs to look up the domain name, establish the connection to the new server and negotiate the secure (SSL/TLS) handshake. On average this means around 300–400 ms, and when you have a series of these requests, the time adds up.
This is where resource hints come in: they let the browser perform the DNS lookup (and, with preconnect, the full connection setup) for the domains hosting third-party scripts ahead of time. When the request is finally made, time is saved because that work has already been carried out.
For example, for critical assets, such as images on a CDN, you can see performance improvements by using a preconnect link tag in the <head> of your site:
<link rel="preconnect" href="https://cdn.example.com">
For less critical third-party assets that aren’t required for the page to render, such as analytics, you can use dns-prefetch to get the browser to do the initial DNS setup:
<link rel="dns-prefetch" href="https://www.googletagmanager.com">
Again, it’s important to test to ensure these hints are actually improving performance on your site – having too many resource hints in place can have the opposite effect. Andy Davies has a nice article on testing the impact of preconnect using WebPageTest, which is a really useful way to see whether it’s worth trying on your site before you get the dev team involved.
Improving Loading Times by Using Tag Managers Wisely
Tag managers are great tools in any business owner’s arsenal because they allow you or your team to add tags to the site without having to rely heavily on a developer. A tag is a snippet of code that can be used to integrate a third-party tool, set cookies, or collect data on your site (for example, Google Analytics tags, Facebook tags, Google Ads tags, etc.).
However, these tags can cost you page loading performance if they are not managed properly, and they can become really messy over time. This is where a tag manager comes into play.
The most popular one is Google Tag Manager (GTM) which is “an asynchronous tag, meaning that when it executes, it does not block other elements from rendering on the page. It also causes the other tags that are deployed via Google Tag Manager to be deployed asynchronously, meaning that a slow loading tag won’t block other tracking tags”. (source: marketingplatform.google.com)
Tag managers may improve page load performance by reducing the number of calls to external resources – as long as the number of tags is kept to a minimum. In GTM, a Data Layer is used to collect values in a single place, where they can be read by third parties, for example to trigger conversion-tracking tags.
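As an illustration, a Data Layer push might look like the sketch below. The event name and values are hypothetical, not taken from any real container; in the browser, GTM’s own snippet creates window.dataLayer, so here we initialise the array ourselves to keep the sketch self-contained:

```javascript
// In a real page this would be: window.dataLayer = window.dataLayer || [];
var dataLayer = [];

// A single push makes these values available to every tag in the container,
// e.g. as a trigger and a value for conversion tracking.
dataLayer.push({
  event: 'purchase',          // hypothetical event name used as a GTM trigger
  transactionTotal: 49.99,    // hypothetical order value
  transactionCurrency: 'GBP'
});
```

Because every tag reads from this one place, you avoid each third party scraping the page for the same data in its own way.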
As good practice, it’s well worth reviewing your tag manager regularly to check which tags, triggers and variables are being loaded, and whether they are still necessary to business goals. Sometimes, simply consolidating and choreographing a few third-party tags can reduce the loading time of your page significantly.
Using GTM to Prioritize Tag Loading
When it comes to persuading a client to invest in an overall performance improvement process, a small thing like replacing a YouTube player or delaying the load of a chat or feedback widget can be enough to show the true potential of reducing the loading time of their pages.
A while ago, we had a retained client who didn’t believe they needed to consider improving their site speed at that time. So, we immediately evaluated which “quick wins” could be implemented to show them the impact of a faster-loading page on their conversion rate.
One of the items which we identified as a quick win was a live chat widget which was used across the site to improve the engagement of their customers. Luckily, this widget was implemented using GTM through a custom HTML tag.
This meant that we didn’t have to rely on a developer to make modifications and had the freedom to try any optimization (GTM even offers the ability to pause a tag).
By just optimizing the tag – firing it only on the specific pages where it was actually needed (supported by Hotjar heatmaps) and delaying the load of the chat – we managed to save 1.5 seconds, which considerably improved the user experience.
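A delay like this can be implemented as a GTM Custom HTML tag along the lines of the sketch below. The widget URL and the 3-second delay are assumptions for illustration, not the client’s actual setup:

```html
<!-- Hypothetical GTM Custom HTML tag: inject the chat widget a few
     seconds after the page's load event instead of during it -->
<script>
  window.addEventListener('load', function () {
    setTimeout(function () {
      var s = document.createElement('script');
      s.src = 'https://chat.example.com/widget.js'; // placeholder URL
      s.async = true;
      document.body.appendChild(s);
    }, 3000); // 3 s delay: tune against your own metrics
  });
</script>
```

Another common variant is to load the widget on first user interaction (scroll or click) rather than on a timer.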
Along with this quick win, we discovered other tags that could be prioritized to fire in a specific order, which improved TTI (Time to Interactive).
After this exercise, the client became much more responsive to the potential of an overall performance improvement process.
Summarising the Important Lessons
Like many things performance related, even small incremental improvements quickly add up to make a larger difference. In summary, when we talk about third-party scripts having an impact on site-speed we can consider the following plan:
- Identify & catalogue any scripts/tags that are flagged as slow with the above-mentioned tools;
- Consider using resource hints or the async/defer attributes where possible to reduce third-party loading times;
- Thoroughly test any modifications of JS handling on a staging environment before they are pushed live;
- Use Google Tag Manager to consolidate and choreograph the tags that are loaded for your visitors;
- Regularly review the GTM container for any out-of-date tags, triggers or variables.