Synthetic Monitoring vs. Real User Monitoring
To determine how you score on the Core Web Vitals, you can use synthetic and real user monitoring methods. While both are useful, one gives a better picture of how your users experience your site speed.
If you start measuring your website's performance, you will quickly end up at Google's Lighthouse. Lighthouse uses lab data, which is synthetic data. But to find out how your visitors actually experience your site, you will need a little more.
Not everyone has the budget to buy the latest phone, and we certainly have no influence on their internet connection. So not every visitor has the same experience, even though a single Lighthouse score suggests they do.
If you want to understand how you're scoring on Core Web Vitals and get a score that reflects the actual user experience, then you might want to take a look at our Core Web Vitals checker or our Core Web Vitals history tool.
Synthetic Monitoring
Synthetic testing involves re-creating user scenarios and monitoring how the website performs and how quickly it responds. It can be applied at scale by testing a website's performance from many different locations.
It also offers more than one way to run a test, such as choosing which user device to emulate or which browser to run the test in. To accomplish this, you can use tools such as WebPageTest.
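You can also script synthetic tests yourself. Below is a minimal sketch of running Lighthouse programmatically with the lighthouse and chrome-launcher npm packages; the URL is a placeholder and the emulated device and network come from Lighthouse's defaults.

```ts
// Minimal sketch: a scripted synthetic (lab) test with Lighthouse.
// Assumes the lighthouse and chrome-launcher packages are installed.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

const result = await lighthouse('https://example.com', {
  port: chrome.port,              // connect Lighthouse to the launched Chrome
  output: 'json',
  onlyCategories: ['performance'],
});

if (result) {
  const { lhr } = result;         // the Lighthouse Result object
  console.log('Performance score:', (lhr.categories.performance.score ?? 0) * 100);
  console.log('Lab LCP (ms):', lhr.audits['largest-contentful-paint'].numericValue);
}

await chrome.kill();
```

Because everything here (device, network throttling, location) is fixed by the test setup, the output describes one simulated visit, not your audience.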
Real User Monitoring
Real user monitoring is a performance monitoring process that gathers specific information about how a user experiences a website. It involves collecting data on a wide range of metrics, such as the Core Web Vitals and the overall site speed user experience.
The experience of every individual user is collected: someone visiting with poor connectivity on a low-end device will have a worse experience than a user with the latest smartphone on 5G. With real user monitoring, you collect both and get insight into how your real users experience your website.
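To make this concrete, here is a minimal collection sketch using Google's open-source web-vitals library. The /analytics endpoint is a placeholder for whatever backend you (or a RUM vendor) use to aggregate the data.

```ts
// Minimal RUM sketch: measure Core Web Vitals in the browser and beacon them out.
// Assumes the web-vitals package; '/analytics' is a placeholder endpoint.
import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric) {
  const body = JSON.stringify({
    name: metric.name,     // 'CLS', 'INP' or 'LCP'
    value: metric.value,   // the value measured for this specific user
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });

  // sendBeacon survives page unloads; fall back to fetch with keepalive.
  if (!navigator.sendBeacon('/analytics', body)) {
    fetch('/analytics', { body, method: 'POST', keepalive: true });
  }
}

onCLS(sendToAnalytics);
onINP(sendToAnalytics);
onLCP(sendToAnalytics);
```

Every visitor reports their own numbers, so the aggregated data reflects the full range of devices and connections your audience actually uses.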
Why lab data isn't always representative
Lab data represents how your website would perform on one type of device under specific network conditions. A red Lighthouse score therefore does not guarantee that your real users have a bad experience on your website.
Let's take an example. Looking at the scores, you might think the odds are low that this website will pass the Core Web Vitals assessment, but keep in mind that we're looking at lab data.
But with real user data for the same website, the picture is completely different.
As you can see, almost all the scores are green, which means the user experience is good. Looking at the difference in Largest Contentful Paint, the lab data shows 16.7 seconds, while in reality it is only between 1 and 2 seconds for real users.
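You can pull this kind of field data for your own site from the Chrome UX Report (CrUX) API, which aggregates real Chrome users over the previous 28 days. A rough sketch, assuming you have a CrUX API key (the key and origin below are placeholders):

```ts
// Sketch: fetch the p75 Largest Contentful Paint of real Chrome users
// from the CrUX API and compare it with your lab measurement.
const CRUX_API_KEY = 'YOUR_API_KEY'; // placeholder

async function fetchFieldLcp(origin: string): Promise<number | undefined> {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin, formFactor: 'PHONE' }),
    },
  );
  const data = await res.json();
  // p75 LCP in milliseconds across real users over the last 28 days.
  return data?.record?.metrics?.largest_contentful_paint?.percentiles?.p75;
}

fetchFieldLcp('https://example.com').then((p75) =>
  console.log(`Field LCP (p75): ${p75} ms`),
);
```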
Conclusion
If you are already looking at lab data, that is a good start. But remember that these scores are not directly representative of your real users' experience. To measure that, though, you need a real user monitoring tool such as RUMvision. Tools like PageSpeed Insights also let you see real-user data, but with a 28-day lag and little clarity about which factors drive the scores.
Lighthouse's performance score also does not determine your SEO; it is the experience of your real users that is taken into account as a ranking factor.