Third parties

The Third Parties tab can be found in the sidebar navigation. It will show you long JavaScript tasks per hostname and even per filename, helping website owners make informed decisions.

To track the impact per hostname, RUMvision uses the Long Animation Frames (LoAF) API. This means that LoAF tracking needs to be enabled for outcomes to be visualized in this dashboard.
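For reference, this is roughly what observing LoAF data in the browser looks like. The observer pattern below follows the standard API; the `hostnameOf` helper and its inline-script fallback label are our own illustrative assumptions, not RUMvision's actual implementation:

```javascript
// Minimal sketch of observing Long Animation Frames (LoAF) in the browser.
// Feature-detect first: LoAF is currently only available in Chromium-based browsers.
function observeLongAnimationFrames(onEntry) {
  if (typeof PerformanceObserver === 'undefined' ||
      !PerformanceObserver.supportedEntryTypes.includes('long-animation-frame')) {
    return null; // LoAF not supported in this browser
  }
  const observer = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      onEntry(entry); // entry.scripts attributes work to individual scripts
    }
  });
  observer.observe({ type: 'long-animation-frame', buffered: true });
  return observer;
}

// Illustrative helper: attribute a LoAF script entry to a hostname, the way
// this dashboard groups data. Inline scripts have no external sourceURL,
// so we fall back to a label.
function hostnameOf(script) {
  try {
    return new URL(script.sourceURL).hostname;
  } catch {
    return '(inline)';
  }
}
```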

The web, debugging and certainly LoAF data can be quite complex and maybe even intimidating, for technical as well as non-technical stakeholders. RUMvision aims to simplify things with this Third Party dashboard.

UX impact scores

In this dashboard, we introduced two scores. These immediately allow site owners to see where the main JS performance and responsiveness issues are coming from. An impression of this dashboard can be found in the screenshot below:

1st and 3rd party score

This will answer the following questions right away:

  • do we mainly have a 1st party challenge and should we brainstorm with our developers?
    Some platforms come with their own challenges, for example because of hydration, or many event listeners for add-to-cart buttons on long product listing pages.
  • or do we mainly have a 3rd party challenge, and should we (re)evaluate marketing tags and efforts?
    Site owners could use this information to decide if they want to stop using a third party, start a conversation with a third party or go with a (more lightweight) alternative.
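The 1st versus 3rd party split itself is conceptually simple: compare each script's hostname to the page's own hostname. A hedged sketch (the function name and the simplified subdomain handling are our own assumptions, not RUMvision's actual logic):

```javascript
// Sketch: classify a script URL as 1st or 3rd party relative to the page's
// hostname. Subdomain handling is deliberately simplified: anything sharing
// the page's hostname as a suffix counts as 1st party here.
function classifyParty(scriptUrl, pageHostname) {
  let scriptHost;
  try {
    scriptHost = new URL(scriptUrl).hostname;
  } catch {
    return '1st party'; // inline or relative scripts belong to the site itself
  }
  return scriptHost === pageHostname || scriptHost.endsWith('.' + pageHostname)
    ? '1st party'
    : '3rd party';
}
```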

Most occurrences

On desktop, you will see a quick overview on the right. This overview displays the hostnames that occurred most often. Do note that more occurrences do not equal a more severe or critical impact.

An explanation of what the occurrences panel is showing us:

  • the percentage on the right indicates the relative number of occurrences compared to all long tasks;
  • the coloured icon indicates whether this typically came with a critical (red), moderate (orange) or low (green) impact on user interactions (and thus INP).
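As a hedged sketch of how such a panel could be derived from raw long-task counts (the sorting and percentage follow the description above; the 5% and 15% impact cut-offs are purely illustrative assumptions, not RUMvision's actual thresholds):

```javascript
// Sketch: turn per-hostname long-task counts into an occurrences panel.
// criticalShareByHostname holds the fraction of a hostname's occurrences
// that came with a critical interaction impact.
// The 5% / 15% impact thresholds are illustrative assumptions.
function occurrencesPanel(countsByHostname, criticalShareByHostname) {
  const total = Object.values(countsByHostname).reduce((a, b) => a + b, 0);
  return Object.entries(countsByHostname)
    .sort(([, a], [, b]) => b - a) // most frequent first
    .map(([hostname, count]) => ({
      hostname,
      share: Math.round((count / total) * 1000) / 10, // % of all long tasks
      impact: criticalShareByHostname[hostname] > 0.15 ? 'red'
            : criticalShareByHostname[hostname] > 0.05 ? 'orange'
            : 'green',
    }));
}
```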

Drawing conclusions

The 1st party score is clearly lower than the 3rd party score, but they're not far apart. So, this website actually has performance challenges across both 1st and 3rd party JavaScript.

Looking at the most frequent occurrences, they might want to experiment with Sentry (1) by disabling it for a while and observing the impact on the 3rd party score as well as their INP score. (5) might be easier to fix: there are alternative plug & play solutions, so it's worth experimenting with an equivalent one. RUMvision (6) is seen quite often too, but isn't causing any INP issues, based on its 0% involvement.

The site's own inline JavaScript (2) seems to be a challenge as well, and a frequent one too. And JS served from the same domain (4) is a critical issue by itself, but luckily it was only involved in 5% of all cases. Nevertheless, this combination is a sign that this merchant should research and work on their 1st party JS behaviour as much as on 3rd party JS impact.

But there's more information to be found in our Third Party dashboard:

JS execution time

Below these scores, you will find a more detailed overview of hostnames and their impact. We can see an example in the following screenshot:

Hostnames are categorized into 3 buckets: low (green), moderate (orange) and critical (red) impact. The opaque bar behind each hostname indicates the relative number of involvements of that hostname across all INP events.

To determine the impact, we look at both the individual JS execution time of a hostname and how often it was involved in INP issues:

  • a >200ms individual JS execution time
    if the JS execution time of a single hostname is already exceeding the INP threshold of 200ms, then it clearly is a huge issue. JS execution within the Google Tag Manager container as well as the website's own (blurred out) JS are examples here.
  • a <200ms individual JS execution time
    In other cases, the individual JS execution time is lower than INP's 200ms threshold. Sentry is an example here.
    However, as the execution time is 96ms, it leaves little room for other batched tasks or scripts, making it more likely to cause INP issues.
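The execution-time side of that decision could be sketched as follows. Only the 200ms INP threshold comes from Core Web Vitals; the 100ms "little room left" cut-off is our own illustrative assumption (and, as described further below, the actual categorization also weighs the INP distribution):

```javascript
// Sketch: bucket a hostname by its individual JS execution time relative to
// the 200ms INP threshold. The 100ms "little room left" cut-off is an
// illustrative assumption, not RUMvision's documented value.
const INP_THRESHOLD_MS = 200;

function executionTimeBucket(executionTimeMs) {
  if (executionTimeMs >= INP_THRESHOLD_MS) {
    return 'critical'; // exceeds the INP threshold on its own
  }
  if (executionTimeMs >= 100) {
    return 'moderate'; // leaves little room for other batched tasks
  }
  return 'low';
}
```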

Involvement distribution

You can hover over individual hostnames to see the INP distribution, as visualized in the screenshot below:

This is answering the following question:

  • of all detected INP events where Sentry was involved, how often did this lead to a good, moderate or poor INP?

This INP distribution tells us that 70.68% of these events lead to a good INP. That falls short of the 75th percentile that Google uses for Core Web Vitals.

As a matter of fact, the 75th percentile ends up in the moderate bucket in the example above. As a result, RUMvision automatically categorizes Sentry as "moderate impact" despite its own JS execution time being lower than 200ms. Although you could look up this information for every hostname, we did the work for you here, so site owners don't need to do such (time-consuming) analysis.
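That categorization boils down to checking which bucket the 75th percentile of a hostname's INP events falls into. A hedged sketch under that assumption (the function name is ours; the good/moderate/poor buckets follow the standard Core Web Vitals boundaries):

```javascript
// Sketch: given the share of good / moderate / poor INP events for a hostname,
// determine which bucket the 75th percentile falls into. The buckets follow
// the standard Core Web Vitals boundaries (good <= 200ms, poor > 500ms).
function impactAtP75(distribution) {
  // distribution example: { good: 0.7068, moderate: 0.25, poor: 0.0432 } (shares sum to 1)
  if (distribution.good >= 0.75) return 'low';            // p75 is a good INP
  if (distribution.good + distribution.moderate >= 0.75) return 'moderate';
  return 'critical';                                       // p75 lands in poor
}
```

With 70.68% good INP events, fewer than 75% of events are good, so the 75th percentile lands in the moderate bucket, matching the Sentry example above.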

But if you do want to analyze even further, you can use the "view detailed" dropdown to jump to the technical tab with pre-configured filters.

Details per hostname

But RUMvision has more, as LoAF provides more information as well: it enables site owners to get to the exact source location where long tasks started to happen, and we collect that too.
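In LoAF terms, each long-animation-frame entry carries a `scripts` array whose items expose, among others, `sourceURL` and `duration` (per the LoAF specification). Aggregating those per filepath could be sketched as follows; the aggregation function itself is our own assumption, not RUMvision's implementation:

```javascript
// Sketch: aggregate total long-task time per filepath from LoAF entries.
// Each long-animation-frame entry has a `scripts` array; every script item
// exposes `sourceURL` and `duration` (per the LoAF specification).
function totalTimePerFilepath(loafEntries) {
  const totals = {};
  for (const entry of loafEntries) {
    for (const script of entry.scripts || []) {
      let path;
      try {
        const url = new URL(script.sourceURL);
        path = url.hostname + url.pathname; // e.g. "cdn.example.com/app.js"
      } catch {
        path = '(inline)'; // inline scripts carry no external sourceURL
      }
      totals[path] = (totals[path] || 0) + script.duration;
    }
  }
  return totals;
}
```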

Clicking on a hostname leads you to a similar overview, but then with all the filepaths belonging to the hostname you just clicked on. This looks like the following:

JS time per filepath

In this case, we can see that it's not the uc.js file that is causing issues here, but that their logconsent file is resulting in long tasks.

This file is involved once a user interacts with the cookie notice. This means that first-time visitors have a higher chance of running into INP issues. That is unfortunate, as this (early) engagement during the first page hit might dictate how users feel about the website. A negative first impression may result in users quickly bouncing to a competitor's site, potentially leading to long-term loss.

Benchmark / alternatives

Luckily, RUMvision doesn't stop there. With all the information we collected via LoAF, we are able to show site owners a benchmark, containing other third party solutions within the same category. This is illustrated below:

In simpler terms, based on the data we've gathered, this site ended up selecting the least favorable option among the available cookie solutions.

Third party categories

We ended up collaborating with Google to publish an article covering LoAF and INP of third parties and categories.

Overall, we divided the data and offer benchmarks in the following categories:

  • A/B testing
  • Advertising
  • Affiliate marketing
  • Analytics
  • Auditing
  • CDNs
  • Chat
  • Consent Provider
  • Content & Publishing
  • Creator-led growth
  • Customer engagement
  • Customer Success
  • Developer Utilities
  • E-mail and SMS marketing
  • Fraud detection
  • Hosting
  • Lead generation
  • Marketing
  • Monitoring
  • Page builder
  • Payment
  • Personalization & recommendation
  • Push notifications
  • Shipping
  • Site search
  • Social
  • Tag Management
  • User behaviour
  • User reviews
  • Video
  • Visitor resource
  • Website apps

From here, site owners can:

  • decide to stop using a third party;
  • search and test alternative solutions;
  • screenshot this page and reach out to the third party supplier.

When reaching out, be aware that not all third parties are ready to acknowledge the impact of their scripts at this point. The difference between lab data and field data isn't known to everyone yet, let alone the meaning of INP.

How to proceed from here

RUMvision removes a lot of investigation and debugging work. By looking at the Third Party dashboard, site owners and other stakeholders (CTOs, SEO specialists, developers, marketers) know right away which 3rd parties, or 1st parties, they should focus on in discussions. From there, they can start drawing conclusions.

The next step from here is a developer, marketer or consultant taking over. They can either change code, change third parties, or continue debugging with the insights that RUMvision provided. Consultants can then come up with a tailored recommendation.