
PageSpeed Insights vs Real User Data


What PageSpeed Insights Actually Shows You

PageSpeed Insights is one of the most widely used performance testing tools, and one of the most misunderstood. When you enter a URL, the eye goes straight to the 0-100 performance score. That score comes from a Lighthouse lab test, and it is not what Google uses for search rankings. The section that matters for SEO is the field data panel at the top of the report, which shows your real Core Web Vitals from actual Chrome users.

Understanding what each section of a PageSpeed Insights report means — and which one to act on — prevents you from chasing the wrong number.

The Two Sections

A PageSpeed Insights report has two distinct data sources, and the tool displays them together in a way that can be confusing.

Field data (top of the report) comes from the Chrome User Experience Report (CrUX). It shows your LCP, INP, CLS, and TTFB as measured by real visitors over the past 28 days. Each metric has a bar showing the distribution of good, needs-improvement, and poor experiences. This is field data — it reflects reality, and it is what Google uses to assess your page experience for search.

Lab data (the performance score and diagnostics) comes from Lighthouse, which loads your page once in a simulated environment — typically a mid-range mobile device on a throttled connection. The 0-100 score, the filmstrip, the waterfall, and all the diagnostic recommendations come from this single synthetic test.

When these two sections agree, the picture is clear. When they disagree — and they frequently do — field data is the one that determines your search rankings.
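The split between the two sections is visible in the PageSpeed Insights API as well: the v5 response carries field data under `loadingExperience` and the Lighthouse run under `lighthouseResult`. A minimal sketch of separating the two, assuming the documented response shape (simplified here; low-traffic URLs may lack a field record):

```javascript
// Sketch: pulling apart the two data sources in a PageSpeed Insights
// API v5 response. `loadingExperience` holds CrUX field data (28-day
// window of real users); `lighthouseResult` holds the single synthetic
// lab run that produces the 0-100 score.
function splitReport(psiResponse) {
  const field = psiResponse.loadingExperience; // real-user CrUX data
  const lab = psiResponse.lighthouseResult;    // one simulated load

  return {
    // Field verdict: what Google uses for ranking (FAST / AVERAGE / SLOW).
    fieldVerdict: field ? field.overall_category : 'NO_FIELD_DATA',
    // Lab score: the number most people screenshot (stored as 0-1).
    labScore: lab ? Math.round(lab.categories.performance.score * 100) : null,
  };
}
```

Reading both values side by side makes it obvious when the score and the field verdict disagree, which is the case the rest of this article is about.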

Why the Score and Reality Diverge

It is common for a site to score 90+ in the lab section while showing poor Core Web Vitals in the field data, or to score in the 50s while field data shows everything is green. Several factors drive this divergence:

Visitor diversity. Lighthouse tests from one simulated device and location. Your real visitors span a range of devices, connection speeds, and geographic locations. A site that performs well for the simulated mid-range phone may struggle on the lower-end devices that a significant portion of your audience actually uses.

Interaction patterns. Lighthouse loads the page and stops. It does not click buttons, open menus, or scroll through content. INP — which measures interaction responsiveness — can only be captured from real users performing real interactions. A page with heavy JavaScript event handlers may look fine in Lighthouse but have poor INP in the field.

Third-party behavior. Ad networks, analytics scripts, and chat widgets often load differently (or not at all) in lab environments. In the real world, these scripts may inject content that causes layout shift, block the main thread, or add significant download weight.

Caching differences. Lighthouse always tests a cold load — no cached resources. But many of your real visitors may arrive with warm caches from previous visits or from other pages on your site. Field data can be better than lab data in this respect.
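The interaction-patterns point is worth making concrete. Per web.dev's description, INP reports the worst interaction latency on the page, skipping one highest outlier for every 50 interactions observed. This simplified sketch is an illustration of that derivation, not the browser's actual implementation:

```javascript
// Simplified INP derivation: report the worst interaction latency,
// but ignore one highest outlier for every 50 interactions observed
// (per web.dev's description of the metric). Illustrative only.
function estimateInp(latenciesMs) {
  if (latenciesMs.length === 0) return null;            // no interactions, no INP
  const sorted = [...latenciesMs].sort((a, b) => b - a); // worst first
  const outliersToSkip = Math.floor(latenciesMs.length / 50);
  return sorted[Math.min(outliersToSkip, sorted.length - 1)];
}
```

Because the metric is defined over real interactions, a lab tool that never clicks anything simply has no input to this calculation, which is why INP only exists as field data.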

How to Read the Report Effectively

Start with the field data section. If it says “This URL has sufficient field data” and shows green bars for all three Core Web Vitals, your site is passing — regardless of what the lab score says. If field data shows one or more metrics in yellow or red, that is a real problem affecting real visitors and your search rankings.
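The green/yellow/red verdicts in the field panel follow Google's published Core Web Vitals thresholds, applied at the 75th percentile of visits. A minimal sketch of that classification:

```javascript
// Google's published Core Web Vitals thresholds. A metric passes when
// the 75th percentile of real visits falls in the "good" range:
// LCP <= 2.5 s, INP <= 200 ms, CLS <= 0.1 (good);
// LCP > 4 s, INP > 500 ms, CLS > 0.25 (poor).
const THRESHOLDS = {
  lcpMs: { good: 2500, poor: 4000 },
  inpMs: { good: 200, poor: 500 },
  cls:   { good: 0.1, poor: 0.25 },
};

function classify(metric, p75Value) {
  const t = THRESHOLDS[metric];
  if (p75Value <= t.good) return 'good';
  if (p75Value <= t.poor) return 'needs-improvement';
  return 'poor';
}
```

So a page with a 75th-percentile LCP of 2.4 s passes LCP even if a fifth of its visitors see slower loads; the thresholds are about the typical experience, not the worst one.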

If the report says “This URL does not have sufficient field data,” it falls back to origin-level data (your entire domain). If there is no origin-level data either, your site does not have enough Chrome traffic to generate a CrUX record. In that case, lab data is all you have — but Google also lacks field data, so Core Web Vitals are not actively factoring into your rankings for that URL. Read more about how Google measures your site speed to understand the data collection process.
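The URL-to-origin fallback is easy to reproduce against the CrUX API itself. In this sketch, `queryRecord` is a stand-in for a call to the CrUX API's `records:queryRecord` endpoint (the real endpoint needs an API key and returns a 404 when no record exists, which the stand-in represents by returning null):

```javascript
// Fallback logic mirroring what PageSpeed Insights does: try the
// page-level CrUX record first, then the origin-level record, then
// concede that only lab data is available. `queryRecord` is a
// hypothetical injected fetcher, not a real library function.
function getFieldData(url, queryRecord) {
  const pageRecord = queryRecord({ url });             // page-level record
  if (pageRecord) return { level: 'url', record: pageRecord };

  const origin = new URL(url).origin;
  const originRecord = queryRecord({ origin });        // whole-domain fallback
  if (originRecord) return { level: 'origin', record: originRecord };

  return { level: 'none', record: null };              // lab data is all you have
}
```

Knowing which level the data came from matters when interpreting it: origin-level numbers blend every page on the domain, so they can hide a slow template behind a fast homepage.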

Use the lab diagnostics — the “Opportunities” and “Diagnostics” sections — as a debugging tool. They identify specific resources and patterns that slow down your page. But prioritize based on what the field data says is actually failing, not on which lab diagnostic has the biggest estimated savings.
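That prioritization can be made mechanical: start from the field metrics that are failing, then look only at the lab audits that bear on them. The audit names below are illustrative examples and vary across Lighthouse versions — treat the map as a sketch of the triage order, not a canonical list:

```javascript
// Illustrative triage map from failing field metric to relevant lab
// diagnostics. Audit ids are examples only; they differ between
// Lighthouse versions.
const AUDITS_BY_FIELD_METRIC = {
  LCP: ['render-blocking-resources', 'largest-contentful-paint-element'],
  INP: ['long-tasks', 'third-party-summary'],
  CLS: ['layout-shifts', 'non-composited-animations'],
};

function auditsToReview(failingFieldMetrics) {
  // Surface lab diagnostics only for metrics the field data says are failing.
  return failingFieldMetrics.flatMap((m) => AUDITS_BY_FIELD_METRIC[m] ?? []);
}
```

The point of the design is the direction of flow: field data decides *what* to fix, lab diagnostics suggest *how*.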

Common Traps

Optimizing for the score. Chasing a perfect 100 in Lighthouse can lead you to remove features, defer critical resources, or make trade-offs that hurt user experience in ways the lab test does not capture. Aim for good field data, not a perfect lab score.

Testing only one page. PageSpeed Insights tests one URL at a time, but Google evaluates your entire site. Your homepage may score well while product pages or blog posts perform poorly. Check field data in Search Console to see which page groups have issues.

Ignoring the field data panel. Many site owners screenshot the performance score and ignore everything above it. The field data panel is smaller and less visually prominent, but it is the only section that directly reflects your search ranking signals.

