INP is the hardest Core Web Vital for mobile users. While 97% of desktop pages pass, only 74% of mobile pages achieve good INP. That 23-point gap matters—mobile users make up over 60% of web traffic.
INP replaced FID as a Core Web Vital on March 12, 2024. The bar got higher: 93% of sites passed FID, but FID only measured input delay on the first interaction. INP measures everything—every click, tap, and keypress throughout the entire page session.
Interaction to Next Paint measures responsiveness—how long it takes for the page to visually respond after a user interacts. Google uses this definition from the Lighthouse audit:
"Interaction to Next Paint measures page responsiveness, how long it takes the page to visibly respond to user input."
INP Thresholds:
| Score | Rating | What It Feels Like |
|---|---|---|
| ≤200ms | Good | Instant, responsive |
| 200–500ms | Needs Improvement | Noticeable lag |
| >500ms | Poor | Frustratingly slow |
Score Weight: INP doesn't directly affect Lighthouse Performance scores in lab mode. Instead, Lighthouse uses Total Blocking Time (TBT) as a proxy, which accounts for 30% of the score. In field data (CrUX), INP is measured directly.
INP only measures discrete interactions—events that have a clear start and end:
| Interaction Type | Examples |
|---|---|
| Clicks | Button clicks, link clicks, checkbox toggles |
| Taps | Mobile screen taps, touch inputs |
| Key Presses | Typing in inputs, keyboard shortcuts |
Not measured: Scrolling, hovering, and pinch-to-zoom. These are continuous interactions with different performance characteristics.
The browser tracks all qualifying interactions during the page session. Your INP score is the single worst interaction on most pages; on pages with many interactions, the browser discards one top outlier per 50 interactions, which approximates the 98th percentile.
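That selection rule can be sketched in a few lines. This is an illustration of the "discard one outlier per 50 interactions" idea, not the browser's exact algorithm:

```javascript
// Estimate a page's INP from a list of interaction durations (ms).
// One of the highest interactions is discarded for every 50
// interactions, approximating a high percentile on busy pages.
function estimateINP(durations) {
  if (durations.length === 0) return 0
  const sorted = [...durations].sort((a, b) => b - a) // longest first
  const discard = Math.min(Math.floor(durations.length / 50), sorted.length - 1)
  return sorted[discard]
}

// 10 interactions: fewer than 50, so the single worst one is reported.
console.log(estimateINP([40, 60, 80, 120, 90, 75, 55, 65, 300, 85])) // → 300
```

This is why one slow interaction, like an expensive menu open, can fail an otherwise snappy page.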
Users expect instant feedback. Research shows that delays over 200ms start to feel sluggish. Past 500ms, users assume something is broken.
The consequences of poor INP are direct: laggy interactions erode trust, and interactions that feel broken drive abandonment. Since 90% of user time on a page happens after it loads, INP captures the experience that matters most.
INP is one of Google's Core Web Vitals ranking factors. Sites with good INP may rank higher than competitors with poor INP, especially when other ranking signals are similar.
Google uses field data (CrUX) at the 75th percentile—meaning 75% of your users need good INP for your URL to pass.
Real-world improvements show direct business correlation:
| Company | INP Change | Business Impact |
|---|---|---|
| Trendyol | 50% reduction | 1% higher click-through rate |
| The Economic Times | 3x faster | 43% lower bounce rate |
| redBus | 72% reduction | Higher engagement metrics |
| Taboola | Multiple optimizations | 5.5% higher ad click-through |
Every interaction breaks down into three phases. Presentation delay accounts for ~42% of total INP on average—often the overlooked bottleneck.
| Phase | What Happens | What Causes Delays |
|---|---|---|
| Input Delay | Time waiting for main thread to become free | Long tasks blocking the thread |
| Processing Duration | Time running your event handlers | Complex handler logic |
| Presentation Delay | Time for browser to render the result | DOM updates, layout, paint |
The critical insight: Most developers focus on processing duration because that's their code. But input delay and presentation delay often account for 60%+ of total INP. A perfectly efficient event handler still fails INP if the main thread was blocked or the DOM update triggered expensive layout.
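The three phases can be read straight off an Event Timing API entry. A minimal sketch (the field names match the `PerformanceEventTiming` interface; in a real page you'd receive these entries from a `PerformanceObserver` observing `'event'` entries):

```javascript
// Split an interaction into its three phases from a
// PerformanceEventTiming-shaped entry.
function breakdownInteraction(entry) {
  return {
    inputDelay: entry.processingStart - entry.startTime,
    processingDuration: entry.processingEnd - entry.processingStart,
    presentationDelay: entry.startTime + entry.duration - entry.processingEnd,
  }
}

// Synthetic entry: a click that took 250ms end to end.
const entry = { startTime: 1000, processingStart: 1080, processingEnd: 1130, duration: 250 }
console.log(breakdownInteraction(entry))
// → { inputDelay: 80, processingDuration: 50, presentationDelay: 120 }
```

Here the handler itself only ran for 50ms; the other 200ms came from waiting for the main thread and rendering the result, which is exactly the split the paragraph above warns about.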
Poor INP stems from a handful of causes. Here are the most impactful:
| Issue | Impact | Difficulty | Fix Time |
|---|---|---|---|
| Long-Running JavaScript | High | Medium | Hours |
| Heavy DOM Operations | High | Medium | Hours |
| Third-Party Scripts | High | Low | Minutes–Hours |
| Total Blocking Time | High | Medium | Hours |
| DOM Size | Medium | Medium | Hours |
| Event Handler Delays | Medium | Low | Minutes |
| Hydration Issues | Medium | High | Hours–Days |
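For the most common culprit, long-running JavaScript, the standard fix is breaking work into chunks that yield back to the main thread between batches. A sketch using `setTimeout` as the yield point, preferring the newer `scheduler.yield()` where the browser supports it:

```javascript
// Yield to the main thread so pending input can be handled.
function yieldToMain() {
  if (typeof scheduler !== 'undefined' && scheduler.yield) {
    return scheduler.yield()
  }
  return new Promise((resolve) => setTimeout(resolve, 0))
}

// Process a large list without one long task blocking interactions.
async function processInChunks(items, processItem, chunkSize = 50) {
  const results = []
  for (let i = 0; i < items.length; i++) {
    results.push(processItem(items[i]))
    // End the current task periodically so a click arriving mid-loop
    // waits at most one chunk, not the whole list.
    if ((i + 1) % chunkSize === 0) await yieldToMain()
  }
  return results
}
```

Each `await yieldToMain()` ends the current task, directly reducing the input-delay phase for any interaction that arrives while the loop runs.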
INP requires real user interaction—you can't measure it with an automated page load. Lab tools use Total Blocking Time (TBT) as a proxy because TBT correlates with INP roughly twice as well as FID did.
Chrome DevTools: Record a trace in the Performance panel while interacting with the page. The Interactions track shows each interaction's duration and its input delay, processing, and presentation breakdown.
Lighthouse: Check the TBT score as your INP proxy. The thresholds are ≤200ms (good), 200–600ms (needs improvement), and >600ms (poor).
Field data is what Google uses for ranking. It represents actual user experience.
PageSpeed Insights: Enter your URL to see CrUX (Chrome User Experience Report) INP data. This shows real-world performance from Chrome users.
Search Console: The Core Web Vitals report shows which URLs pass or fail INP, grouped by similar pages.
Web Vitals JavaScript Library:
```javascript
// The attribution build is required for metric.attribution details
import { onINP } from 'web-vitals/attribution'

onINP((metric) => {
  console.log('INP:', metric.value, 'ms')
  console.log('Element:', metric.attribution.interactionTarget)
  console.log('Type:', metric.attribution.interactionType)
})
```
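In production you would report the metric instead of logging it. A minimal sketch, assuming a hypothetical `/analytics` endpoint (the payload fields mirror the `web-vitals` metric object):

```javascript
// Serialize the parts of an INP metric worth keeping. Pure, so it
// runs anywhere; the endpoint below is an assumption, not a real API.
function toPayload(metric) {
  return JSON.stringify({
    name: metric.name,
    value: Math.round(metric.value),
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
    target: metric.attribution?.interactionTarget,
  })
}

function reportINP(metric) {
  // sendBeacon survives page unload, which is when INP is often final
  if (typeof navigator !== 'undefined' && navigator.sendBeacon) {
    navigator.sendBeacon('/analytics', toPayload(metric))
  }
}
```

Pass `reportINP` to `onINP` instead of the logging callback to collect field data from your own users.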
The INP gap between devices is massive:
| Device | Good INP Rate (2024) |
|---|---|
| Desktop | 97% |
| Mobile | 74% |
This 23-point gap means mobile optimization is essential. Mobile devices have slower CPUs, thermal throttling, and tighter memory budgets, so the same JavaScript takes far longer to run.
Test on real mobile devices, not just throttled desktop browsers.
Most tools test one page at a time. That's a problem for INP—different pages have different JavaScript, different event handlers, and different interaction patterns.
Your homepage might pass while your checkout fails. Product pages might be fast until someone opens the image gallery.
Unlighthouse scans your entire site and shows TBT (INP proxy) scores for every page. You'll see which templates have issues, which pages are outliers, and whether your fixes work at scale.
The CLI is free and runs locally. Cloud adds scheduled monitoring to catch regressions before users report them.
Animations & Transitions
Eliminate CLS from animations by using compositor-friendly properties like transform and opacity instead of layout-triggering properties.
Long-Running JavaScript
Reduce JavaScript execution time to improve INP. Learn code splitting, web workers, and task chunking to keep the main thread responsive.