Lighthouse vs PageSpeed Insights vs WebPageTest: Which to Trust



Your Lighthouse score is 95. You send the screenshot to the client. They send it to their Google rep. Their Google rep says "but your Core Web Vitals are failing in Search Console." Now nobody is happy and you are explaining the difference between lab data and field data to someone who just wanted a fast website.

This happens constantly, because three different tools measure web performance in three different ways, and most developers use them interchangeably without understanding why the numbers differ. Let's fix that.

The Short Version (For When You Are Busy)

- Lighthouse is a lab tool. Great for finding what to fix; its score is not what Google ranks you on.
- PageSpeed Insights runs Lighthouse in the cloud and also shows CrUX field data. The field data section is what Google uses for ranking.
- WebPageTest is the deep-dive tool for root causes the other two abstract away.
- When the numbers disagree, trust field data for SEO and lab data for debugging.

Lighthouse: The Lab Tool Everyone Misreads

Lighthouse runs a controlled page load under simulated conditions. Mobile mode applies 4x CPU throttling and slow-4G network throttling to simulate a mid-range phone. Desktop mode applies no CPU throttling and only a light broadband network simulation.

The score it gives you (0-100) is a weighted average of five metric scores: First Contentful Paint, Speed Index, Largest Contentful Paint, Total Blocking Time, and Cumulative Layout Shift. That score is not what Google uses for ranking. Google uses the three Core Web Vitals (LCP, INP, CLS) from real user data. Lighthouse cannot measure INP at all, because INP requires real user interactions that a lab run does not have; it reports Total Blocking Time as a proxy instead.
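To make the blend concrete, here is a minimal sketch of how the five metric scores combine. The weights are from the Lighthouse v10 scoring model; each input is a metric's own 0-100 score, which Lighthouse derives from log-normal curves not shown here.

```typescript
// Sketch of Lighthouse's performance score: a weighted average of
// five per-metric scores (each already 0-100). Weights per Lighthouse v10.
type MetricScores = { fcp: number; si: number; lcp: number; tbt: number; cls: number };

const WEIGHTS: MetricScores = { fcp: 0.10, si: 0.10, lcp: 0.25, tbt: 0.30, cls: 0.25 };

function performanceScore(scores: MetricScores): number {
  const total = (Object.keys(WEIGHTS) as (keyof MetricScores)[])
    .reduce((sum, k) => sum + scores[k] * WEIGHTS[k], 0);
  return Math.round(total);
}

// A busy main thread (low TBT score) drags the total down hard:
performanceScore({ fcp: 90, si: 85, lcp: 80, tbt: 40, cls: 100 }); // → 75
```

Note that TBT alone carries 30% of the score, which is why a blocked main thread hurts more than a slightly slow first paint.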

So a site with a 95 Lighthouse score can still fail Core Web Vitals in Google Search Console. This is not a contradiction. They are measuring different things under different conditions.

When to Use Lighthouse

- During development, to catch regressions before you ship.
- To get a prioritized list of fixes from the Opportunities and Diagnostics sections.
- In CI (via Lighthouse CI), where the environment is controlled.

When NOT to Trust Lighthouse

- As the number Google ranks you on. It is not; field data is.
- As a single run on your laptop. Scores vary between runs.
- As a comparison across machines or tools; the environments differ.

Lighthouse Score Variance Is Real

Run the same page in Lighthouse three times in a row. Get three different scores. This is normal. Lab measurements on your local machine are affected by what else is running, network fluctuations, and CPU thermal throttling. Scores can vary 5-15 points between runs on the same page with no changes.

If you need reproducible results, use Lighthouse CI or WebPageTest, which run in controlled server environments. Your local Lighthouse is useful for direction, not for exact numbers.
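One practical way to tame local variance, sketched below: keep several runs and report the median rather than trusting any single score. (Lighthouse CI aggregates multiple runs in a similar spirit via its `numberOfRuns` setting.)

```typescript
// Report the median of several Lighthouse runs instead of a single score.
function median(scores: number[]): number {
  const sorted = [...scores].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Three local runs of the same unchanged page, minutes apart:
median([88, 95, 91]); // → 91
```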

PageSpeed Insights: The One That Actually Matters for SEO

PageSpeed Insights (pagespeed.web.dev) is a web tool that runs Lighthouse in the cloud and, crucially, pulls real CrUX field data for your URL. That second part is the important distinction.

The Two Sections You Need to Understand

When you run PSI, you see two sections. The "Discover what your real users are experiencing" section at the top shows CrUX field data. This is the section Google uses for ranking. It shows LCP, INP, CLS, and TTFB values from real Chrome users over the past 28 days.

If this section shows "Data not available," your URL does not have enough traffic in CrUX. Google falls back to your domain-level data or skips CWV as a ranking signal for that URL.

The "Diagnose performance issues" section below shows Lighthouse lab results. This is useful for debugging but is not the ranking signal.
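If you want the field data without loading the PSI page, the same CrUX dataset can be queried directly. A sketch, assuming you have a CrUX API key (the endpoint, request body, and response shape follow the public CrUX API docs; `YOUR_API_KEY` and the sample payload are placeholders):

```typescript
// Fetch p75 LCP for a URL from the Chrome UX Report (CrUX) API.
async function fetchFieldLcp(url: string, apiKey: string): Promise<number> {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url, formFactor: "PHONE" }),
    },
  );
  return extractP75Lcp(await res.json());
}

// Pull the 75th-percentile LCP (in ms) out of a CrUX response.
function extractP75Lcp(data: any): number {
  return Number(data.record.metrics.largest_contentful_paint.percentiles.p75);
}

extractP75Lcp({
  record: { metrics: { largest_contentful_paint: { percentiles: { p75: 2100 } } } },
}); // → 2100
```

If the API returns a 404 for your URL, that is the programmatic equivalent of PSI's "Data not available": not enough CrUX traffic.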

The Score That Matters for Ranking

The "Core Web Vitals Assessment" box in PSI field data shows "Passed" or "Failed" for your site. That assessment is what Google uses. A 95 Lighthouse score with a "Failed" Core Web Vitals Assessment means you are failing for ranking purposes, period.

Important: PSI field data only appears if your URL has enough Chrome users. New pages, low-traffic pages, or private sites may show "No field data available." In that case, Google either uses your domain-level data or treats CWV as not applicable for that URL.
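The pass/fail logic behind that assessment can be sketched from Google's published "good" thresholds: the p75 value of every metric must land in the good range.

```typescript
// Core Web Vitals "good" thresholds: LCP <= 2.5s, INP <= 200ms, CLS <= 0.1.
// All three p75 values must be good for the assessment to pass.
interface FieldP75 { lcpMs: number; inpMs: number; cls: number }

function passesCoreWebVitals({ lcpMs, inpMs, cls }: FieldP75): boolean {
  return lcpMs <= 2500 && inpMs <= 200 && cls <= 0.1;
}

// A page can score 95 in Lighthouse and still fail here on INP alone:
passesCoreWebVitals({ lcpMs: 2100, inpMs: 350, cls: 0.05 }); // → false
```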

WebPageTest: The Professional's Tool

WebPageTest (webpagetest.org) is where you go when PSI tells you something is wrong but you cannot figure out why. It runs real browsers (Chrome, Firefox, Edge) from real locations (Dulles, London, Tokyo, etc.) on real network connections and gives you a level of detail that Lighthouse cannot match.

What WebPageTest Does That Others Cannot

Connection view (waterfall): You see every single resource your page loads, in order, with DNS time, connection time, SSL handshake time, and download time shown as color-coded bars. You can instantly spot which resource is taking the longest and exactly why.

Filmstrip view: Frame-by-frame screenshots of your page loading at 100ms intervals. You can see exactly when your hero image appears, when text becomes readable, and when the page looks complete. This is invaluable for understanding what users actually see during load.

Real device testing: WebPageTest has a lab of actual Android phones and iPhones you can test on. Not emulation. Real devices running real browsers on real connections. This gives you data much closer to what your actual mobile users experience than any emulation can.

Multi-step scripting: You can test sequences of actions, not just page loads. Click this button, fill this form, then measure. Great for testing checkout flows, login sequences, or interaction-heavy apps.

Chrome trace / CPU profiling: WebPageTest can download a full Chrome performance trace that you can open in DevTools locally. This is the deep-dive tool that performance engineers use to debug the exact JavaScript functions causing slowdowns.
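The multi-step scripting mentioned above uses WebPageTest's own tab-delimited script language. A sketch of measuring only a login submit (the URL and element IDs are placeholders; `logData 0` suppresses measurement of the setup steps):

```
logData	0
navigate	https://example.com/login
setValue	id=email	user@example.com
setValue	id=password	secret
logData	1
submitForm	id=login-form
```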

Reading WebPageTest Waterfall Results

The waterfall chart uses color coding:

- Teal: DNS lookup
- Orange: initial TCP connection
- Purple: SSL/TLS negotiation
- Green: time to first byte (your server thinking)
- Blue: content download

If you see a long green bar, your server is slow. Long purple bars mean TLS negotiation is the bottleneck. Long blue bars mean your files are large or the connection is slow. Each color tells you where to fix.

Other Tools Worth Knowing

| Tool | Best For | Pricing | Verdict |
| --- | --- | --- | --- |
| DebugBear | Continuous monitoring, regression alerts | From $35/mo | Excellent for teams that ship often |
| SpeedCurve | Visual regression, RUM + synthetic | From $20/mo | Best filmstrip and visual comparison |
| Calibre | CI/CD integration, performance budgets | From $99/mo | Great but overkill for small teams |
| GTmetrix | Quick one-off tests | Free tier available | OK for quick checks, not for monitoring |
| Sentry Performance | If you already use Sentry for error tracking | Bundled with Sentry plans | Good INP profiling if already on Sentry |

The Practical Workflow: Which Tool to Use When

Here is the honest workflow that experienced performance engineers use:

Step 1: Check PSI field data for your URL. This tells you if real users are actually failing CWV. If green, you are done worrying. If red, keep going.

Step 2: Run Lighthouse on the failing page to get a list of specific issues. Look at the Opportunities and Diagnostics sections, not the score number. These tell you what to fix.

Step 3: Use WebPageTest for deeper investigation when Lighthouse opportunities are unclear. The waterfall view often reveals root causes that Lighthouse abstracts away.

Step 4: For ongoing monitoring after fixes, set up DebugBear or SpeedCurve (or free Lighthouse CI in GitHub Actions). These alert you when scores regress after deploys.
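For the free option in Step 4, a minimal Lighthouse CI config looks roughly like this (the URL and threshold are examples; the file is a standard `lighthouserc.json` read by `lhci autorun`):

```json
{
  "ci": {
    "collect": {
      "url": ["https://example.com/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.9 }]
      }
    }
  }
}
```

Running three collections and asserting on the aggregate smooths out the run-to-run variance discussed earlier.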

Common Mistakes Developers Make with These Tools

- Screenshotting a single local Lighthouse run and presenting it as "the" score. Scores vary run to run.
- Optimizing the Lighthouse score when CrUX field data is what actually affects ranking.
- Comparing scores across tools, locations, or machines and expecting them to match.
- Reading only the lab section of PSI and skipping the field data section at the top.

FAQ

Why does my PSI score differ from my WebPageTest score?

They test from different locations, different network speeds, and score things differently. PSI runs Lighthouse from a Google data center. WebPageTest runs from the location you select. Both are lab tools; the scores will always differ. Trust the PSI field data section for ranking relevance.

Should I optimize for Lighthouse score or for CrUX field data?

CrUX field data. Always. The Lighthouse score is a diagnostic tool. Field data is what actually affects your rankings. That said, improving Lighthouse metrics usually improves field data too, just with a lag: CrUX is a 28-day rolling average, so fixes take up to a month to show fully.

My site has no field data in PSI. Is that a problem?

Not a ranking penalty in itself. Google just does not have enough Chrome user data to evaluate your URL. This happens for new pages, staging sites, or low-traffic URLs. Google either falls back to domain-level data or simply does not apply CWV as a ranking factor for that specific URL.

Still Getting Red Scores?

Run a free audit and get a punch-list of exactly what to fix. No account needed.

Run Free Audit →


Want an Expert to Handle It?

Real engineers, 48-hour turnaround, money back if scores don't improve.

View Expert Fix →