In this experiment, we looked at the improvements in Core Web Vitals and other pagespeed metrics after turning off client-side A/B testing via Google Optimize. This is not a problem specific to Google Optimize: other client-side A/B testing tools that offer a similar anti-flicker snippet have the same impact.
How A/B testing can impact Core Web Vitals
Mobile Data Analysis
When we analyze pagespeed data, we often look at the mobile experience first. Mobile devices are often slower than desktops, so they experience a slower version of your website. A client-side A/B test also demands more of the device your user arrives on.
With this background in mind, you are ready for the data analysis, focused on homepage performance across various types of page views:
Unique pageviews
A unique pageview is the first uncached page hit in a new visit. A 16.8% improvement was observed in FCP for unique page hits on the homepage. However, accounting for the TTFB regression that occurred at the same time, the net FCP gain is actually around 36%.
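To see why a TTFB regression makes the headline FCP number understate the real gain, it helps to separate the server portion (TTFB) from the render portion (FCP minus TTFB). The sketch below uses hypothetical millisecond values, not the raw data from this study, purely to illustrate the calculation:

```python
def improvement(before: float, after: float) -> float:
    """Percentage improvement from `before` to `after` (positive = faster)."""
    return (before - after) / before * 100

# Hypothetical values in milliseconds, for illustration only.
ttfb_before, fcp_before = 200, 1500
ttfb_after, fcp_after = 250, 1248   # TTFB regressed, yet FCP still improved

observed = improvement(fcp_before, fcp_after)        # raw FCP gain: 16.8%
render_only = improvement(fcp_before - ttfb_before,  # gain in the render
                          fcp_after - ttfb_after)    # phase alone: ~23.2%

print(f"observed FCP improvement: {observed:.1f}%")
print(f"TTFB-adjusted improvement: {render_only:.1f}%")
```

With these made-up numbers, the render phase improved by roughly 23%, notably more than the 16.8% the raw FCP figure suggests; the same adjustment applied to the study's data yields the ~36% net gain mentioned above.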
Returning pageviews
A returning pageview is the first page hit in a returning visit. On top of a minor TTFB regression of 50ms, the FCP improvement in returning visits reached 31.8%.
Successive pageviews
A successive pageview is a subsequent page hit in an ongoing visit. With just a 1ms TTFB fluctuation, the FCP improvement in successive visits was 33.8%, making this a reliable figure.
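The three pageview types above can be derived from basic analytics fields. A minimal sketch, assuming each hit carries a new-visitor flag and a 1-based index within its visit (the field names are hypothetical, and the caching aspect of unique pageviews is ignored for simplicity):

```python
def classify_pageview(new_visitor: bool, hit_index: int) -> str:
    """Map a page hit to the pageview types used in this analysis.

    hit_index is 1-based: the first hit of a visit has index 1.
    """
    if hit_index > 1:
        return "successive"   # subsequent hit in an ongoing visit
    return "unique" if new_visitor else "returning"

print(classify_pageview(True, 1))   # first hit of a new visit
print(classify_pageview(False, 1))  # first hit of a returning visit
print(classify_pageview(False, 3))  # later hit in the same visit
```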
Results on Core Web Vitals and pagespeed metrics
As anticipated, there has been a noticeable improvement in both the FCP and the LCP metrics. This is crucial as it directly impacts the user experience. A slow FCP results in a prolonged period where the user is faced with a blank screen, potentially leading to frustration and abandonment of the site in favor of a more responsive alternative.
Impact on First Contentful Paint
A substantial improvement of over 30% was observed in all groups and types of page views for the homepage.
Impact on Largest Contentful Paint
LCP visibly benefits from the achieved FCP improvements, though there isn't an exact one-to-one correlation, and I would never expect one: the further we get from a performance bottleneck (from TTFB to FCP, and from FCP to LCP), the more work the browser has to do in between, which can blur the results.
Nevertheless, LCP was still showing a significant improvement.
Impact on other templates
The above data is based on mobile traffic on the homepage. We also saw changes across other templates:
- brand, category and listing pages saw the same (+30%) FCP improvements;
- pages such as search, checkout and cart achieved a 10% improvement.
We didn't study the reason for the limited improvements, as those pages typically show a more fluctuating TTFB because of the server-side work happening on them.
Conclusion
The analysis demonstrates the following:
- turning off Google Optimize (or other client-side A/B testing tools) can significantly improve the FCP metric;
- if disabling A/B testing is not an option, considering a different strategy would be a wise decision.
A/B testing's impact on these key performance indicators should be considered when optimising your website for user experience.
If your company is using client-side A/B testing, be sure to consider pagespeed and Core Web Vitals as well, to find a healthy balance between the two and prevent a regression in user experience and conversion.