Why Your Desktop Score Is Misleading
Most Core Web Vitals failures happen on mobile devices. The gap between mobile and desktop performance is significant — a site that scores 95 on Lighthouse desktop might score 55 on mobile. This is not a quirk of the testing tool. Mobile devices have slower processors, less memory, and often weaker network connections. And when Google evaluates your site for mobile search ranking, it uses mobile field data from the Chrome User Experience Report (CrUX), not desktop data.
Understanding this gap is essential because many site owners only check desktop performance, assume everything is fine, and miss the experience that most of their visitors actually have.
Why Mobile Is Slower
CPU differences are dramatic. A mid-range Android phone — the kind most people actually use — has roughly one-third to one-fifth the processing power of a modern laptop. JavaScript that executes in 200 milliseconds on a desktop might take 800 milliseconds or longer on a phone. This directly impacts INP and LCP scores.
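The effect of a slower CPU can be sketched with a simple scaling model — the same idea behind Lighthouse's CPU throttle. The function name and numbers below are illustrative, not from any benchmark suite:

```javascript
// Rough sketch: estimate how a desktop JavaScript timing translates to a
// slower mobile CPU by multiplying it by a slowdown factor. A 4x multiplier
// approximates a mid-tier phone relative to a developer laptop.
function estimateMobileMs(desktopMs, cpuSlowdown = 4) {
  return desktopMs * cpuSlowdown;
}

console.log(estimateMobileMs(200));    // 200 ms on desktop -> 800 ms at 4x slowdown
console.log(estimateMobileMs(200, 5)); // -> 1000 ms on a lower-end device
```

This is exactly why a script that feels instant during development can push a phone's interaction latency past the 200 ms "good" INP threshold.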
Network conditions vary widely. While desktop users are typically on stable broadband, mobile users may be on 4G, spotty Wi-Fi, or connections that fluctuate as they move. Higher latency and lower bandwidth mean every HTTP request and every kilobyte of transferred data costs more time.
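A back-of-the-envelope model makes the latency-plus-bandwidth cost concrete. This is an illustrative sketch, not a real network simulator, and all parameter names are mine:

```javascript
// Total transfer time ~= round trips on the critical path x round-trip time,
// plus payload size over bandwidth. (bytes * 8) gives bits, and kbps is
// bits-per-millisecond, so the division yields milliseconds directly.
function transferTimeMs({ roundTrips, rttMs, bytes, kbps }) {
  const latencyMs = roundTrips * rttMs;  // each critical-path round trip pays full RTT
  const downloadMs = (bytes * 8) / kbps; // payload cost at the given bandwidth
  return latencyMs + downloadMs;
}

// 500 KB over 4 critical-path round trips:
const broadband = transferTimeMs({ roundTrips: 4, rttMs: 20, bytes: 500_000, kbps: 50_000 });
const slow4G = transferTimeMs({ roundTrips: 4, rttMs: 150, bytes: 500_000, kbps: 1_600 });
console.log(broadband); // 160 ms
console.log(slow4G);    // 3100 ms
```

Note that on the slow connection the latency term alone (600 ms) exceeds the entire broadband total — which is why reducing round trips often matters more on mobile than shaving bytes.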
Screen size changes what counts. The LCP element on mobile is often different from the one on desktop — a 400px-wide hero image that dominates a phone viewport may share the desktop fold with a larger, heavier element that becomes the LCP candidate instead. This means mobile and desktop can fail LCP for entirely different reasons.
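The way LCP candidates resolve can be sketched as follows: the browser reports a series of largest-contentful-paint entries as progressively larger elements render, and the last one reported is the page's LCP element. The helper function and sample entries below are mine; in a real page the entries come from a PerformanceObserver:

```javascript
// The final entry in the candidate series is the LCP element.
function finalLcpCandidate(entries) {
  return entries.length > 0 ? entries[entries.length - 1] : null;
}

// Browser usage (run in the DevTools console; not available in Node):
//   new PerformanceObserver((list) => {
//     console.log('LCP element:', finalLcpCandidate(list.getEntries()).element);
//   }).observe({ type: 'largest-contentful-paint', buffered: true });

// Simulated entries: mobile settles on the hero image, while desktop reports
// a later, larger candidate that is above the fold at wider viewports.
const mobileEntries = [{ element: 'img.hero', size: 400 * 300 }];
const desktopEntries = [
  { element: 'img.hero', size: 400 * 300 },
  { element: 'section.feature-video', size: 1200 * 600 },
];
console.log(finalLcpCandidate(mobileEntries).element);  // 'img.hero'
console.log(finalLcpCandidate(desktopEntries).element); // 'section.feature-video'
```

Running the observer snippet in both a mobile-emulated and a desktop viewport is a quick way to confirm whether your two form factors are even measuring the same element.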
What the Data Shows
Chrome User Experience Report (field data) consistently shows mobile scores trailing desktop by a wide margin across the web. For LCP specifically, the percentage of origins meeting the “good” threshold on mobile is roughly 15-20 percentage points lower than on desktop. For INP, the gap is even more pronounced because interaction handling is CPU-bound: the slower the processor, the longer every JavaScript task takes.
Lighthouse simulates a mid-tier mobile device on a throttled connection by default — this is intentional. It approximates the experience of the median mobile user, not the best-case scenario. If your Lighthouse mobile score looks bad, that is likely what real visitors are experiencing.
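Those throttling defaults can be made explicit in a custom Lighthouse config. The values below match Lighthouse's documented mobile defaults at the time of writing (simulated slow-4G network, 4x CPU slowdown), but check your Lighthouse version's constants before relying on them:

```javascript
// Lighthouse config spelling out the (approximate) default mobile throttling.
// In a config file you would export this object (module.exports = config) and
// run: lighthouse https://example.com --config-path=./mobile-config.js
const config = {
  extends: 'lighthouse:default',
  settings: {
    formFactor: 'mobile',
    throttlingMethod: 'simulate', // default: simulated, not applied, throttling
    throttling: {
      rttMs: 150,                 // simulated round-trip time (slow 4G)
      throughputKbps: 1638.4,     // ~1.6 Mbps download
      cpuSlowdownMultiplier: 4,   // mid-tier phone vs. the host machine
    },
  },
};
```

Writing the defaults out this way also makes it easy to tighten them — for example, raising `cpuSlowdownMultiplier` to test against lower-end devices.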
Testing Mobile Properly
Always test on real devices when possible. Chrome DevTools’ mobile emulation throttles the CPU and network but cannot perfectly replicate the behavior of an actual phone’s browser, memory management, or thermal throttling. If you only have access to lab tools, use Lighthouse’s default mobile preset and treat it as the primary score — not the desktop one.
Check your Core Web Vitals in Google Search Console, which reports mobile and desktop separately. If you see a gap, prioritize fixing the mobile issues first — that is what affects your search visibility.
Further Reading
- Performance Budgets 101 (web.dev) — Setting resource limits is especially important for mobile where every byte costs more.
- Lighthouse Performance Scoring (Chrome Developers) — How Lighthouse calculates scores and why mobile and desktop differ.
