How Google Measures Your Site Speed

Google does not run a speed test on your site and assign a score. Instead, it collects performance data from real Chrome users who visit your pages and uses that field data to evaluate your Core Web Vitals. This system — called the Chrome User Experience Report (CrUX) — is the foundation of how site speed factors into search rankings. Understanding how it works helps you focus on the right metrics and avoid optimizing for the wrong target.

The Chrome User Experience Report

CrUX collects anonymized performance data from Chrome users who have opted in to sharing usage statistics. When a Chrome user loads one of your pages, the browser measures Core Web Vitals — LCP, INP, and CLS — along with supplemental metrics like TTFB. This data is sent back to Google, aggregated, and made available through several tools.

Not every page visit is captured. CrUX only includes data from Chrome desktop and Android browsers (not Safari, Firefox, or iOS Chrome) where the user has opted in. Sites with very low traffic may not have enough data points to generate a CrUX record at all — Google requires a minimum sample size before it reports field data for a URL or origin.

CrUX data is collected over a rolling 28-day window. This means your Core Web Vitals assessment reflects the past four weeks of real visitor experiences, not a single test or a single day. It also means that after you make an improvement, it can take up to 28 days for the change to be fully reflected in your field data.
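The lag introduced by the rolling window can be illustrated with a small simulation. This is a sketch using a simple trailing mean and made-up daily values; CrUX's actual aggregation is more involved, but the shape of the delay is the same:

```python
# Sketch: why a 28-day rolling window delays the visible effect of a fix.
# Daily p75 values below are illustrative, not real CrUX data.

WINDOW = 28

# 28 days at a slow 4000 ms p75 LCP, then a fix brings daily p75 to 2000 ms.
daily_p75 = [4000] * 28 + [2000] * 28

def rolling_assessment(values, window=WINDOW):
    """Naive rolling mean over the trailing `window` days."""
    out = []
    for day in range(window - 1, len(values)):
        span = values[day - window + 1 : day + 1]
        out.append(sum(span) / window)
    return out

assessed = rolling_assessment(daily_p75)

# The day the fix ships, the reported value barely moves; it only settles at
# the new level once every day in the window post-dates the fix.
print(assessed[0])    # 4000.0 (all slow days in the window)
print(assessed[1])    # first day after the fix: only slightly lower
print(assessed[-1])   # 2000.0 (window fully past the fix)
```

Each day after the fix, one slow day drops out of the window and one fast day enters, so the reported number improves gradually rather than all at once.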

The 75th Percentile

Google evaluates each Core Web Vital at the 75th percentile (p75) of the collected data. This is a deliberate choice that balances representativeness with fairness.

Here is what the 75th percentile means in practice: if 100 visitors load your page and you sort their LCP times from fastest to slowest, the 75th value is your reported LCP. This means 75% of your visitors had an LCP at or better than this number, and 25% had a worse experience.
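The sorting-and-counting description above can be sketched in a few lines. Percentile conventions differ slightly between tools; this nearest-rank version matches the intuition in the text, and the sample values are hypothetical LCP times:

```python
# Sketch: the 75th percentile (p75) as described above.
import math

def p75(samples):
    """Sort ascending and take the value 75% of the way through,
    so 75% of samples are at or better than the returned value."""
    ordered = sorted(samples)
    index = math.ceil(0.75 * len(ordered)) - 1
    return ordered[index]

# 100 visitors with hypothetical LCP times (arbitrary units):
visits = list(range(1, 101))
print(p75(visits))  # 75 -- the 75th-fastest experience
```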

Why not the median (50th percentile)? The median would ignore the experience of nearly half your visitors. Why not the 95th percentile? That would let a handful of extreme outliers — a visitor on satellite internet, a badly configured corporate proxy — define your score regardless of how well the site performs for everyone else. The 75th percentile captures the experience of the broad majority while remaining sensitive to patterns of poor performance.

This has a practical implication: you do not need every visitor to have a fast experience. You need most visitors to have a fast experience. A small percentage of users on exceptionally slow connections will not tank your score, but a significant segment of visitors on common mobile devices and networks will.
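Each p75 value is then compared against fixed thresholds to produce the Good / Needs Improvement / Poor status. The thresholds below are Google's published Core Web Vitals thresholds at the time of writing; verify them against current documentation before relying on them:

```python
# Sketch: classifying a p75 value against the published Core Web Vitals
# thresholds. Units: LCP in ms, INP in ms, CLS is a unitless score.
THRESHOLDS = {
    "LCP": (2500, 4000),
    "INP": (200, 500),
    "CLS": (0.1, 0.25),
}

def classify(metric, p75_value):
    good, poor = THRESHOLDS[metric]
    if p75_value <= good:
        return "Good"
    if p75_value <= poor:
        return "Needs Improvement"
    return "Poor"

print(classify("LCP", 2300))  # Good
print(classify("INP", 350))   # Needs Improvement
print(classify("CLS", 0.3))   # Poor
```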

URL-Level vs Origin-Level Data

CrUX provides data at two levels. URL-level data covers a specific page. Origin-level data covers your entire domain. Google uses URL-level data when available and falls back to origin-level data when a specific page does not have enough traffic to generate its own CrUX record.

This matters for WordPress sites where traffic is unevenly distributed. Your homepage might have robust URL-level data while your individual blog posts rely on origin-level data. If your homepage is fast but your blog template is slow (or vice versa), the origin-level data blends both realities together. Pages with their own URL-level data are evaluated independently.
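The fallback behavior described above amounts to a simple lookup: use the page's own record if one exists, otherwise the origin's. A minimal sketch, with illustrative data shapes rather than the actual CrUX schema:

```python
# Sketch: prefer URL-level field data, fall back to origin-level data
# when a page lacks its own record. Data shapes here are hypothetical.

url_records = {
    "https://example.com/": {"LCP_p75_ms": 1800},  # homepage: enough traffic
    # individual blog posts have no URL-level record of their own
}
origin_record = {"LCP_p75_ms": 3200}  # blended across the whole domain

def field_data_for(url):
    return url_records.get(url, origin_record)

print(field_data_for("https://example.com/"))             # URL-level record
print(field_data_for("https://example.com/blog/post-1"))  # origin fallback
```

Note how the slow origin-level number applies to every low-traffic page, even if some of those pages are individually fast.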

Where to See Your Data

Google Search Console provides the Core Web Vitals report, grouping your pages by status (Good, Needs Improvement, Poor) based on field data. This is the most authoritative view because it directly reflects what Google uses for search.

PageSpeed Insights shows both field data (from CrUX) and lab data (from Lighthouse) for a given URL. The field data section at the top is what matters for SEO — the lab score below is a diagnostic tool.

CrUX Dashboard and BigQuery provide raw access to the CrUX dataset for deeper analysis. These are useful for tracking trends over time or comparing performance across page groups, but most site owners get what they need from Search Console and PageSpeed Insights.
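CrUX data is also queryable programmatically. The sketch below builds a request for the public CrUX API's queryRecord endpoint and reads a p75 out of a response; the endpoint URL and field names reflect the API as I understand it, so check Google's current CrUX API documentation before using this:

```python
# Sketch: building a CrUX API queryRecord request and extracting LCP p75
# from a response. No network call is made here; the sample response is a
# trimmed-down, hypothetical payload.
import json
import urllib.request

API = "https://chromeuserexperiencereport.googleapis.com/v1/records:queryRecord"

def build_request(origin, api_key, form_factor="PHONE"):
    body = json.dumps({
        "origin": origin,
        "formFactor": form_factor,
        "metrics": ["largest_contentful_paint"],
    }).encode()
    return urllib.request.Request(
        f"{API}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def lcp_p75(response_json):
    # Percentiles are nested under record -> metrics -> <metric> -> percentiles.
    metric = response_json["record"]["metrics"]["largest_contentful_paint"]
    return metric["percentiles"]["p75"]

sample = {"record": {"metrics": {
    "largest_contentful_paint": {"percentiles": {"p75": 2100}}}}}
print(lcp_p75(sample))  # 2100
```

Sending the request with `urllib.request.urlopen(build_request(...))` would return the live record for the origin, subject to the minimum-traffic requirement described earlier.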

What This Means for Optimization

Because Google uses real user data, optimizing purely for lab tests can miss the mark. A page that scores well in Lighthouse but loads slowly for visitors on mobile networks in your target market will still show poor Core Web Vitals in Search Console.

Effective optimization starts with field data to identify which metrics are failing and which pages are affected, then uses lab tools to diagnose the specific causes. The 28-day rolling window means patience is required — improvements are real, but the data takes time to catch up.

Further Reading

Understand the fundamental difference between real user measurements and synthetic tests, and why they often disagree.
PageSpeed Insights contains both field and lab data — learn which section to trust for SEO decisions.

Need help with this?

Mochyon specializes in WordPress Core Web Vitals optimization. We diagnose, fix, and verify — with a named human accountable for the result.

Get help from Mochyon