Blog · 2026-04-19

What Lighthouse misses (and what to check instead)

Lighthouse is great. It is also a synthetic, single-shot lab measurement that often disagrees with what your real visitors experience. Here are the gaps and what to do about them.

I love Lighthouse. I run it most days. But it has a specific blind spot that costs people real ranking and real users, and almost nobody talks about it.

Lighthouse is a lab tool. It runs once, on a simulated mid-range Android phone, over simulated slow 4G, in a fresh Chrome instance with no extensions and a cold cache. That run gives you a number. Your users do not match that profile.

Lab data vs field data

Google ranks pages on field data, not lab data. Field data means real Chrome users on real devices, aggregated by Google over 28 days, and exposed via the Chrome User Experience Report (CrUX). When you read "this page does not pass Core Web Vitals" in Search Console, that is field data.
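If you want that same field data programmatically, CrUX is exposed via a public API (POST to chromeuxreport.googleapis.com/v1/records:queryRecord with your URL or origin). A minimal sketch of pulling a metric's 75th percentile out of a response, assuming the documented response shape; the helper name is mine, not part of the API:

```javascript
// Sketch: extract a metric's p75 from a CrUX API response.
// Response shape follows the public CrUX API docs
// (record.metrics.<metric>.percentiles.p75); helper name is
// illustrative.
function fieldP75(cruxResponse, metric) {
  const m = cruxResponse.record?.metrics?.[metric];
  // CrUX serialises some percentile values as strings, so coerce
  return m ? Number(m.percentiles.p75) : null;
}
```

Google evaluates each Core Web Vital at the 75th percentile of page loads, which is why p75 is the number to pull out, not the mean.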

It is entirely possible (and common) to score 100 on Lighthouse and fail Core Web Vitals in production. The opposite is also true. I have shipped sites that scored 72 in the lab and passed every metric in the field, because the slow part was a sync that never runs for real visitors.

Three things Lighthouse cannot see

1. INP, basically

Lighthouse synthesises a Total Blocking Time score and calls it a day. INP (Interaction to Next Paint) measures real user interactions (clicks, taps, key presses) across the whole page lifetime. The difference matters: a button that takes 300 ms to respond on the third click of a session is invisible to Lighthouse and very visible to your user. Watch INP, not TBT.
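For intuition, INP's aggregation can be sketched as a pure function. Per the web.dev definition, it is roughly the worst interaction of the session, ignoring one high outlier for every 50 interactions; the function name and numbers below are illustrative:

```javascript
// Rough sketch of INP aggregation: the slowest interaction of
// the session, skipping one high outlier per 50 interactions.
// Not a spec-exact implementation.
function approximateINP(interactionDurationsMs) {
  if (interactionDurationsMs.length === 0) return null;
  const sorted = [...interactionDurationsMs].sort((a, b) => b - a);
  // One outlier is discarded for every 50 interactions
  const skip = Math.floor(interactionDurationsMs.length / 50);
  return sorted[Math.min(skip, sorted.length - 1)];
}
```

The key property: one slow interaction anywhere in the session sets the score, which is exactly what a single synthetic page load can never observe.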

2. Resource bloat from features only some users hit

Lighthouse loads your homepage and stops; it never clicks around. If 8% of your bundle is a checkout flow that only 3% of visitors ever reach, Lighthouse still charges the homepage for every byte, even though the cost is only felt where that code actually runs. Conversely, if your homepage loads light but the dashboard behind login is a 4 MB SPA, Lighthouse never sees it.

3. Anything that depends on a real network

TTFB is the obvious one. Lighthouse simulates a single round trip to your origin. In reality you have a CDN, a cold cache, a warm cache, three edge regions, and one user in Vietnam whose ISP routes through Singapore. Field TTFB is sometimes 5x lab TTFB. Lighthouse will not tell you.
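CrUX reports field TTFB as a density histogram alongside the p75. As a rough illustration of what that p75 means (CrUX hands you the percentile directly; this is just a sketch of how it relates to the bins, with linear interpolation inside the crossing bin):

```javascript
// Sketch: estimate the 75th percentile from a CrUX-style
// histogram of { start, end, density } bins (densities sum to 1).
// Illustrative only; CrUX returns p75 precomputed.
function p75FromHistogram(bins) {
  let cumulative = 0;
  for (const { start, end, density } of bins) {
    if (cumulative + density >= 0.75) {
      // Interpolate linearly within the bin that crosses 75%
      const fraction = (0.75 - cumulative) / density;
      return start + fraction * (end - start);
    }
    cumulative += density;
  }
  return null; // p75 fell in an open-ended final bin
}
```

Run it on your own origin's CrUX bins and compare against your lab TTFB; the gap between the two is the part Lighthouse cannot see.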

What I check, in order

  1. Search Console > Core Web Vitals report — the field data Google actually uses to rank you; look here first
  2. PageSpeed Insights — the same field data plus a fresh Lighthouse run, side by side, which is the right framing
  3. Real-user monitoring on the actual product (we built vibestat partly because the existing options are heavy or expensive)
  4. Then, finally, Lighthouse — for catching new regressions in CI before they ship

But Lighthouse is still useful

Two things it does brilliantly. One: SEO and accessibility audits, which are largely deterministic checks that do not need real users (alt text either exists or it doesn't). Two: catching big regressions in pull requests, where you want a single number that compares against last week. For both, the lab nature is a feature, not a bug.

Just do not let a 100 score lull you into thinking your site is fast. The score and your users live in different worlds.

TL;DR

  • Lighthouse is lab; Google ranks on field data
  • Watch INP, not TBT
  • Use Search Console + PageSpeed for what matters
  • Use Lighthouse for SEO/a11y checks and CI regressions
  • Use real-user monitoring for everything else

Try it on your site

Run a free audit, get a plain-English AI summary in 10 seconds.

Audit my site →