PageSpeed Insights vs Lighthouse: What's the Difference?
Both PageSpeed Insights and Lighthouse come from Google. Both give you a 0-100 performance score. Yet they return different numbers for the same page. Here's why, and which one to trust.
The two tools
Lighthouse
Open-source auditing tool. Runs locally in Chrome DevTools, CLI, or CI. Lab data only — controlled, reproducible synthetic tests on your machine. You choose device emulation, throttling mode, and network conditions. No Google account needed.
```shell
npx lighthouse https://example.com --output=json
```
Currently on version 13, which removed the PWA category.
PageSpeed Insights
Google's web tool at pagespeed.web.dev. Runs Lighthouse on Google's servers from data centers in Oregon, South Carolina, the Netherlands, or Taiwan. Lab + field data — overlays real-user metrics from CrUX (28-day P75 rolling window). Paste a URL and go.
Also runs Lighthouse 13. Provides an API with 25,000 free requests/day.
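The API lives at the v5 `runPagespeed` endpoint. A minimal Python sketch that builds a request URL (the `strategy` and optional `key` query parameters are the API's own; passing a free key from the Google Cloud console is what unlocks the documented quota):

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile", api_key: str = "") -> str:
    """Build a PageSpeed Insights API v5 request URL."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urllib.parse.urlencode(params)}"

# Fetching requires network access; the lab performance score sits at
# lighthouseResult.categories.performance.score (0-1 scale):
# with urllib.request.urlopen(psi_request_url("https://example.com")) as resp:
#     report = json.load(resp)
#     print(round(report["lighthouseResult"]["categories"]["performance"]["score"] * 100))
```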
Key differences
| | Lighthouse (CLI/DevTools) | PageSpeed Insights | Edge |
|---|---|---|---|
| Data type | Lab only | Lab + field (CrUX) | PSI |
| Where it runs | Your machine | Google servers | Lighthouse |
| INP | Not measured (TBT proxy) | Real field data from CrUX | PSI |
| Throttling | Simulated or applied (your choice) | Simulated only | Lighthouse |
| Caching | Cold or warm (configurable) | Always cold cache | Lighthouse |
| Network | Your connection (throttled) | Google's network | — |
| Score consistency | High (controlled) | Varies (shared infra) | Lighthouse |
| API access | Local only | 25k free requests/day | PSI |
| Site-wide | One URL at a time | One URL at a time | — |
Why your scores differ
A 10-15 point gap between PSI and local Lighthouse is normal. Score variance of ±5 points happens even across identical PSI runs. Here's what causes larger gaps:
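A simple way to compare runs despite that noise is to score the same URL several times and take the median, which ignores one-off spikes. A sketch (the score values are illustrative):

```python
from statistics import median

def stable_score(scores: list) -> float:
    """Median of several runs of the same tool; far less sensitive to a
    single network or server-load spike than the mean."""
    return median(scores)

# Five PSI runs of the same URL, a few minutes apart (illustrative values):
print(stable_score([71, 78, 74, 76, 73]))  # 74
```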
Server location
PSI routes your test to whichever Google data center is closest to your IP — not closest to your server. As research on PSI server locations shows, if your site lacks a CDN, a test routed from Taiwan to a US-only origin adds hundreds of milliseconds to TTFB, inflating LCP.
Local Lighthouse connects directly from your machine, which might be on the same network as your server or using a nearby CDN edge.
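You can feel the routing effect directly by timing TTFB from different vantage points. A rough sketch: it measures elapsed time until the first response bytes arrive through `urllib`, which is close enough to TTFB to expose geographic differences (`example.com` stands in for your origin):

```python
import time
import urllib.request

def rough_ttfb_ms(url: str) -> float:
    """Approximate TTFB: elapsed time until the first response byte arrives."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read(1)  # headers parsed and first body byte received
        return (time.perf_counter() - start) * 1000

# Run this from your laptop, then from a VM in another region, and compare:
# print(f"{rough_ttfb_ms('https://example.com'):.0f} ms")
```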
Throttling method
PSI uses simulated throttling exclusively: it loads the page at full speed, then mathematically adjusts the metrics. Lighthouse CLI lets you choose via the `--throttling-method` flag, with `simulate` (the default), `devtools` (applied throttling over a real connection), or `provided` (no throttling, for an externally shaped network). Applied throttling produces lower scores but more realistic results.
Hardware differences
Google's servers have different CPU/RAM profiles than your laptop. Simulated throttling tries to normalize this, but CPU-heavy pages (lots of JavaScript parsing) can score differently depending on the underlying hardware.
Caching state
PSI always runs a cold-cache test. Local Lighthouse clears storage before each run by default, but runs with `--disable-storage-reset` (or DevTools sessions that keep browser state) can benefit from cached service worker responses or the browser cache, especially if you've visited the page recently.
Extensions and background processes
Chrome extensions inject scripts and CSS. Local Lighthouse in DevTools runs in a separate profile, but other browser activity still affects CPU availability. PSI has none of these contamination issues.
Lab data vs field data
This is the most important distinction between the two tools.
Lab data (Lighthouse) tests a single page load under controlled conditions. It's reproducible, fast, and catches regressions — but it represents one synthetic user, not your actual audience.
Field data (CrUX in PSI) aggregates real Chrome users visiting your site over a 28-day rolling window. It uses the 75th percentile (P75), catching the experience of users with slow devices and poor connections — not just the median.
| | Lab (Lighthouse) | Field (CrUX/PSI) |
|---|---|---|
| Source | Simulated page load | Real Chrome users |
| Sample | 1 test run | Thousands of visits (P75) |
| Time window | Point-in-time | 28-day rolling average |
| Traffic requirement | None | ~1,000 page loads in 28 days |
| INP | Not available | Yes |
| Consistency | High | Varies with real conditions |
A site can score 95 in Lighthouse but show "Poor" LCP in CrUX field data. This happens when your real users are on slow mobile networks that lab throttling doesn't fully simulate.
How Lighthouse scores performance
Both tools use the same scoring weights (Lighthouse v10 and later). Understanding them helps explain why the same page gets different numbers:

| Metric | Weight |
|---|---|
| First Contentful Paint (FCP) | 10% |
| Speed Index | 10% |
| Largest Contentful Paint (LCP) | 25% |
| Total Blocking Time (TBT) | 30% |
| Cumulative Layout Shift (CLS) | 25% |

TBT dominates at 30%. Since PSI and local Lighthouse run on different hardware with different CPU profiles, TBT-heavy pages see the biggest score gaps between the two tools.
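Under the hood, each raw metric is first mapped to a 0-1 score via a log-normal curve fitted to real-site data; the category score is then just a weighted average. A minimal sketch of that final step, using the Lighthouse v10+ weights (the per-metric inputs here are illustrative 0-1 scores, not computed from raw timings):

```python
# Lighthouse v10+ performance category weights. Each metric arrives as a
# 0-1 score (already mapped through its log-normal curve); the category
# score is their weighted average, reported on a 0-100 scale.
WEIGHTS = {"fcp": 0.10, "si": 0.10, "lcp": 0.25, "tbt": 0.30, "cls": 0.25}

def performance_score(metric_scores: dict) -> int:
    """Weighted average of per-metric 0-1 scores, rounded to 0-100."""
    total = sum(WEIGHTS[m] * metric_scores[m] for m in WEIGHTS)
    return round(total * 100)

# A page that is perfect on everything except TBT still loses heavily,
# because TBT carries the single largest weight:
print(performance_score({"fcp": 1.0, "si": 1.0, "lcp": 1.0, "tbt": 0.5, "cls": 1.0}))
# 85
```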
INP: the biggest gap between PSI and Lighthouse
Interaction to Next Paint (INP) replaced First Input Delay as a Core Web Vital in March 2024. It measures responsiveness across the entire page lifecycle — every tap, click, and keypress.
PSI shows real INP field data from CrUX. This is the single biggest reason to check PSI even if you primarily use Lighthouse CLI — it's the only place you'll see actual interaction responsiveness without adding your own Real User Monitoring.
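If you script against the PSI API, the field INP sits alongside the other CrUX metrics under `loadingExperience.metrics` in the v5 response. A sketch assuming that response shape (the `INTERACTION_TO_NEXT_PAINT` key is the API's; the sample values are made up):

```python
def field_inp(psi_response: dict):
    """Return (P75 INP in ms, category) from a PSI API v5 response,
    or None when the URL has no CrUX field data yet."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    inp = metrics.get("INTERACTION_TO_NEXT_PAINT")
    if inp is None:
        return None
    return inp["percentile"], inp["category"]

# The relevant slice of a response looks like this (values illustrative):
sample = {
    "loadingExperience": {
        "metrics": {
            "INTERACTION_TO_NEXT_PAINT": {"percentile": 185, "category": "FAST"}
        }
    }
}
print(field_inp(sample))  # (185, 'FAST')
```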
When to use which
| Scenario | Use |
|---|---|
| Development | Lighthouse CLI — fast feedback, consistent scores, CI integration |
| Production monitoring | PSI / CrUX — real user data, INP, field metrics |
| CI/CD checks | Lighthouse CI — automated regression detection |
| Debugging score drops | Lighthouse CLI — isolate variables, test specific throttling |
| Checking INP | PSI — only source of real INP data without RUM |
| Site-wide audits | Unlighthouse — crawl every page, spot patterns across routes |
| Stakeholder reports | PSI — field data carries more weight, reflects real users |
How Unlighthouse helps
Both Lighthouse and PSI test one URL at a time. Performance issues rarely affect just one page — a slow font, a heavy third-party script, or a layout shift pattern can be site-wide.
Unlighthouse crawls your entire site and runs Lighthouse on every discovered page. Instead of spot-checking your homepage, you catch the /checkout page with a 3.2s LCP or the /blog layout causing CLS on 40 posts.
```shell
npx unlighthouse --site https://your-site.com
```
Combine with PSI for field data on your critical pages, and Lighthouse CI for automated regression checks in your pipeline. Each tool fills a different gap.
FAQ
Why is my PageSpeed Insights score different from Lighthouse?
PSI runs Lighthouse on Google's servers with different hardware, network conditions, and server locations than your local machine. PSI also overlays CrUX field data that can affect the reported metrics. A gap of ±5-15 points between the two is normal.
Which score should I trust — PSI or Lighthouse?
For production sites with enough traffic: trust PSI's field data. It reflects what real users experience. For development and debugging: trust local Lighthouse, where you control variables and get consistent results.
Does PageSpeed Insights use Lighthouse?
Yes. PSI runs Lighthouse on Google's servers and adds CrUX field data on top. The lab scores in PSI come from the same Lighthouse engine you run locally.
Why does my score change every time I run PageSpeed Insights?
PSI runs on shared Google infrastructure where network conditions, server load, and routing vary between runs. Score variance of ±5 points is normal. Run 3-5 tests and use the median for accurate comparisons.
What is field data in PageSpeed Insights?
Field data comes from the Chrome User Experience Report (CrUX) — real metrics from Chrome users who visited your site over the past 28 days, reported at the 75th percentile. A URL needs roughly 1,000 page loads in 28 days before field data appears.
Should I optimize for Lighthouse or PageSpeed Insights?
Optimize for real users. Use Lighthouse to diagnose and fix lab issues during development. Use PSI field data to verify improvements reach your actual audience. Don't chase a perfect 100 — diminishing returns hit hard above 90.
Can I get a 100 on PageSpeed Insights?
In lab data, yes. But field data may still show issues because real users have diverse devices, network conditions, and geographic locations that lab tests can't replicate. A 100 lab score with "Poor" field CWV means your real users are still suffering.