Why Your Core Web Vitals Pass on Desktop But Fail on Mobile

Your Lighthouse score is 91. You tell your boss. Everyone is happy. Then Google Search Console shows mobile users are failing Core Web Vitals. And you feel personally attacked.

[Image: split screen showing a mobile phone with failing red performance metrics next to a desktop monitor with green passing metrics]

So here is the thing. Desktop passing and mobile failing is not a bug. It is not bad luck. It is a completely predictable outcome of how mobile devices work versus how lab tools test them. And once you understand why it happens, the fixes are actually pretty clear.

According to the 2026 HTTP Archive data covering over 10 million origins, only 47% of mobile origins pass all three Core Web Vitals compared to 58% on desktop. That 11-point gap has been there for years and is getting worse, not better. The starkest difference is in INP: 76% of sites pass on mobile versus 97% on desktop. The reason for that specific gap is something we need to talk about.

Why Lab Scores Lie to You (Sort Of)

Lighthouse runs a simulated page load on your machine. When it runs in "mobile" mode it applies two things: CPU throttling and network throttling.

CPU throttling at 4x means Lighthouse slows your CPU down to simulate a low-end Android phone. That sounds reasonable. And it is, mostly. But here is what the simulation still gets wrong.

Your development machine dedicates a fast CPU core to the test. A real budget Android phone in someone's pocket is sharing its CPU with a dozen open Chrome tabs, background app syncs, adaptive brightness, and whatever else is running. Median Total Blocking Time on real mobile devices hits around 2,100ms in 2026 HTTP Archive data. On desktop? About 95ms. That is a 22:1 ratio. Lighthouse's 4x throttle helps you catch the rough shape of the problem, but it cannot replicate 22x real-world conditions.

The result: you see a 91 Lighthouse mobile score in the lab, but your real users on real Android phones in real conditions are having a very different experience. And Google knows this because they have CrUX data from Chrome's actual field data collection. That data tells a different story than your lab score.

How Google Actually Uses Mobile and Desktop CWV Data Separately

This is the part that surprises people. Google does not average your mobile and desktop scores together for ranking purposes.

The Chrome User Experience Report (CrUX) collects real performance data from Chrome users who have opted in to sharing usage statistics. It splits this data by device type. Your mobile CWV data affects your ranking in mobile search results. Your desktop CWV data affects desktop search rankings.

So if your desktop is passing but your mobile is failing, you have a mobile ranking problem. You will keep that problem even if your Lighthouse lab score looks great, because Lighthouse is not what Google uses for ranking. Google uses CrUX field data. Those are two different things and people mix them up constantly.

Go to Google Search Console, click Core Web Vitals, and look at the mobile and desktop tabs separately. The mobile tab is what matters for most sites since Google switched to mobile-first indexing. If mobile is red and desktop is green, your ranking hit is on mobile searches.

The 5 Actual Causes of Mobile-Specific CWV Failures

1. INP Is Way Harder on Mobile

Interaction to Next Paint measures how fast your page responds to user input. On a fast desktop it is easy to stay under 200ms. On a slow Android phone running a dozen apps in the background? Much harder.

Any JavaScript task longer than 50ms is a long task that can visibly delay user input. Your desktop might finish those tasks in 60ms each. On a phone with a 9x slower single-core CPU, those same tasks take 540ms. Your buttons feel stuck. Your menus lag. Users bounce.

This is why the INP mobile pass rate is 76% versus 97% desktop. Same code, very different hardware, very different experience.
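
If you want to see those long tasks on a real page, here is a minimal sketch using the Long Tasks API (Chromium browsers) that you can paste into the console before interacting with the page:

// Log every main-thread task longer than 50ms, the threshold
// where a task starts delaying user input (Long Tasks API, Chromium only)
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`Long task: ${Math.round(entry.duration)}ms at ${Math.round(entry.startTime)}ms`);
  }
}).observe({ type: 'longtask', buffered: true });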

2. Images Without Proper srcset

If you are serving a 1200px wide hero image to a 390px wide phone screen, you are sending about 9 times more pixels than needed. More data, slower LCP. Full stop.

The fix is srcset. It tells the browser to pick the right-sized image for the screen. Here is what it looks like:

<img
  src="hero-800.webp"
  srcset="hero-400.webp 400w,
          hero-800.webp 800w,
          hero-1200.webp 1200w"
  sizes="(max-width:600px) 100vw, 800px"
  width="800" height="450"
  fetchpriority="high"
  alt="Your hero description"
>

That sizes attribute is the key one. It tells the browser: on screens under 600px wide, this image fills the full viewport. On bigger screens, it is 800px. The browser uses that to pick the right file from the srcset list. Most developers skip the sizes attribute and wonder why browsers still download huge images on phones.

3. Touch Events and INP

Mobile fires touch events in addition to mouse events. That sounds trivial but it adds real complexity: touch events fire before the synthesized click event and need their own processing. If you have heavy click handlers, they get hit by that touch-processing overhead too.

Common mobile INP killers:

- Long JavaScript tasks that hold the main thread while a tap waits in the queue
- Event handlers that read layout (offsetHeight, getBoundingClientRect) and force synchronous reflow
- Large DOM updates triggered directly inside input handlers
- Third-party scripts competing for the same main thread

Use Chrome DevTools with your phone connected via USB for real device profiling. Go to chrome://inspect, connect your phone, enable USB debugging, and you can profile your actual page on your actual device. The results are usually humbling.
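
Profiling shows where the time goes; to catch slow interactions as they happen, a small sketch using the Event Timing API (Chromium) works too. The 200ms cutoff below matches the INP "good" threshold:

// Report input events whose total duration exceeds the 200ms "good" INP threshold
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.duration > 200) {
      console.log(`Slow interaction: ${entry.name} took ${Math.round(entry.duration)}ms`);
    }
  }
}).observe({ type: 'event', durationThreshold: 104, buffered: true });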

4. Viewport Meta Tag Issues

If you are missing this in your head:

<meta name="viewport" content="width=device-width, initial-scale=1">

You will have CLS problems on mobile. The browser will render at a desktop width and then resize, causing layout shifts. This is a basic fix but you would be surprised how often it is missing or broken.

Also avoid user-scalable=no. It hurts accessibility and Google does not love it.

5. Third-Party Scripts Are Heavier on Mobile

That chat widget? On desktop it adds 80ms to your INP. On mobile it adds 400ms, because the phone has roughly a quarter of the CPU to spare. Same script, very different impact.

Run a quick test. Open your site on a phone, open DevTools via remote debug, and go to the Performance tab. Record a page load and look at the main thread view. You will see every third-party script showing up as a long red task. Each one of those is hurting your INP and potentially your LCP too.
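
A common mitigation is to load widgets after startup instead of during it. Here is a minimal sketch, with a hypothetical widget URL standing in for whatever you actually embed:

// Load a third-party widget after startup work is done, instead of during it
function loadWidget() {
  const s = document.createElement('script');
  s.src = 'https://widget.example.com/embed.js'; // hypothetical widget URL
  s.async = true;
  document.head.appendChild(s);
}

if ('requestIdleCallback' in window) {
  requestIdleCallback(loadWidget, { timeout: 5000 }); // run by 5s even if never idle
} else {
  setTimeout(loadWidget, 3000); // rough fallback for browsers without it (e.g. Safari)
}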

Step-by-Step: How to Diagnose Mobile-Specific CWV Issues

Step 1: Check Your Real Field Data First

Go to PageSpeed Insights and run your URL. Look at the "Discover what your real users are experiencing" section at the top, not the lab diagnostics below it. The field section shows actual CrUX numbers from real mobile users.

If field data is red and lab data is green, your real users have a problem and your tests are not catching it.
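
If you want to pull that same report programmatically, the public PageSpeed Insights API (v5) returns both sections. A sketch, which works without an API key for light usage:

// Query PSI for mobile results; field data (CrUX) lives under loadingExperience
const target = 'https://yoursite.com/'; // replace with your page
const api = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed'
  + `?url=${encodeURIComponent(target)}&strategy=mobile`;

const data = await (await fetch(api)).json();
console.log(data.loadingExperience?.metrics); // real-user CrUX metrics
console.log(data.lighthouseResult?.categories?.performance?.score); // lab score, 0 to 1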

Step 2: Use the CrUX API to Get Per-Device Data

The Chrome UX Report API is free. Here is a quick call to get mobile vs desktop data for any URL:

POST https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=YOUR_API_KEY

Body:
{
  "url": "https://yoursite.com/",
  "formFactor": "PHONE",
  "metrics": ["largest_contentful_paint", "interaction_to_next_paint", "cumulative_layout_shift"]
}

Change formFactor to DESKTOP to compare. The API returns p75 values, which is exactly what Google uses for ranking. You do need an API key, but it is free to create in the Google Cloud console and the default quota is generous.
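
The same query from JavaScript, as a quick fetch sketch:

// Query the CrUX API for mobile p75 values
const API_KEY = 'YOUR_API_KEY'; // free key from the Google Cloud console
const res = await fetch(
  'https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=' + API_KEY,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      url: 'https://yoursite.com/',
      formFactor: 'PHONE',
      metrics: ['largest_contentful_paint', 'interaction_to_next_paint', 'cumulative_layout_shift'],
    }),
  }
);
const data = await res.json();

// p75 is the value Google uses for ranking
for (const [name, metric] of Object.entries(data.record.metrics)) {
  console.log(name, metric.percentiles.p75);
}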

Step 3: Test on a Real Device, Not an Emulator

Emulation in DevTools is useful but not perfect. If you have access to an Android phone (any decent mid-range model from the past 3 years works fine), do real testing there.

The best free way to test on real devices without owning them is WebPageTest, which maintains real device labs and lets you run tests on a low-end phone like a Moto G from a real location.

Step 4: Profile Your Interactions on Mobile

Connect your Android phone via USB. Go to chrome://inspect/#devices in desktop Chrome. Click Inspect on your page and open the Performance tab. In the capture settings (the gear icon), set CPU throttling to 4x slowdown to push the simulation further toward low-end hardware. Record yourself clicking through the page normally, then look for red-flagged long tasks on the main thread.

The Mobile-Specific Fix Checklist

- Add srcset and sizes to every large image, especially the LCP hero
- Set explicit width and height on images to prevent layout shifts
- Include the viewport meta tag and drop user-scalable=no
- Break long JavaScript tasks up with scheduler.yield() or a setTimeout fallback
- Defer third-party scripts until the browser is idle or the user interacts
- Test on a real mid-range Android phone, not just the DevTools emulator
- Judge pass/fail by CrUX field data, not the Lighthouse lab score

Fixing Mobile INP Specifically

Since INP is the metric with the biggest mobile vs desktop gap, let us look at that one specifically.

The main thread on a slow phone is basically a single-lane road. Everything has to take turns. JavaScript tasks, style calculations, rendering, user input processing. If your JavaScript is hogging that road with long tasks, user input sits in the queue waiting.

The fix is to break long tasks into smaller chunks with yields. Here is the pattern:

// Old way: one long task that blocks input
function processLargeList(items) {
  items.forEach(item => {
    doExpensiveWork(item); // 10ms each, 100 items = 1000ms blocking
  });
}

// New way: yield between chunks so queued input can be handled
async function processLargeList(items) {
  for (let i = 0; i < items.length; i++) {
    doExpensiveWork(items[i]);

    // Yield every 5 items to let the browser process pending input
    if ((i + 1) % 5 === 0) {
      if ('scheduler' in window && 'yield' in scheduler) {
        await scheduler.yield(); // Chrome 129+
      } else {
        await new Promise(r => setTimeout(r, 0)); // fallback for other browsers
      }
    }
  }
}

This pattern lets the browser process user input between each chunk. Your INP drops because the browser is no longer stuck waiting for your loop to finish before it can register a click.
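
One design note on the two yield mechanisms: scheduler.yield() puts your continuation at the front of the task queue, so the loop resumes right after pending input is handled, while the setTimeout fallback sends it to the back of the queue behind any other queued tasks.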

Mobile vs Desktop: The Summary Table

Issue                 Desktop Impact   Mobile Impact   Priority Fix
Long JS tasks         Minor            Severe          scheduler.yield()
Large images          Moderate         Severe          srcset + sizes
Third-party scripts   Moderate         Very severe     Defer / Partytown
Font loading          Moderate         Moderate        font-display: swap
Missing srcset        None             Severe          Responsive images

FAQ

Why does my PSI score differ every time I run it?

The Lighthouse lab data in PSI uses your current machine conditions and can vary 5-10 points between runs. The field data section (from CrUX) is stable because it averages 28 days of real user data. Focus on the field data, not the lab score.

My mobile Lighthouse score is 85. Should I worry?

Maybe. A Lighthouse lab score of 85 does not mean your field data is passing. Check the "Core Web Vitals Assessment" section in PSI field data for the real answer. Green there means you are actually passing. Red there means real users are having a bad time, regardless of your lab score.

How often does Google update mobile CWV data for ranking?

CrUX data is collected continuously from Chrome users. Google Search Console shows it on a 28-day rolling basis. Ranking changes from CWV improvements typically show up in Search Console within 4-6 weeks.

Is it possible to pass mobile CWV without passing desktop?

Yes, it happens. Sites that optimize specifically for mobile-first (small images, minimal JS, simple layouts) sometimes pass mobile but have desktop CLS issues from wide layouts. But it is uncommon. Usually desktop is easier to pass.


Still Getting Red Scores?

Run your site through VitalsFixer. Free audit in 30 seconds, no account needed.

Analyze My Site Free →

Want an Expert to Handle It?

Real engineers, 48-hour turnaround, money back if scores don't improve.

View Expert Fix →