PageSpeed Insights API Rate Limits

Handle PSI API quotas without breaking your integration. Rate limit detection, retry strategies, and quota monitoring.


Limits

Limit | Value | Scope
Per day | 25,000 | Per project
Per minute | ~240 | Per project
Per 100 seconds per user | 60 | Per API key

The daily limit resets at midnight Pacific Time. The per-minute limits are rolling windows.

Note: Google doesn't officially publish rate limits. These values are from third-party testing and community reports. Some developers report undocumented throttling with 500 errors after ~450 sustained requests.

Detecting Rate Limits

The API returns HTTP 429 when you hit the documented limit, but also returns 500 errors for undocumented throttling:

const response = await fetch(url)

if (response.status === 429) {
  const retryAfter = response.headers.get('Retry-After')
  console.log(`Rate limited. Retry after ${retryAfter} seconds`)
}

if (response.status === 500) {
  // Undocumented per-origin rate limit
  // Wait 60-180 seconds before retrying
  console.log('Possible throttling (500). Wait before retrying.')
}

The Retry-After header tells you how long to wait. If it's missing, default to 60 seconds.

Important: The API doesn't return rate limit headers (X-RateLimit-*). You must track usage client-side.
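
Since the API gives you no usage headers, a small rolling-window counter is one way to stay under the per-minute limit. A minimal sketch, using the ~240/min figure reported above (an unofficial number, so treat the threshold as configurable):

// Client-side tracker for the rolling per-minute window
// 240 is the community-reported limit, not an official number
const PER_MINUTE_LIMIT = 240
const requestTimestamps = []

function canSendRequest() {
  const cutoff = Date.now() - 60_000
  // Drop timestamps that have fallen out of the rolling 60-second window
  while (requestTimestamps.length && requestTimestamps[0] < cutoff) {
    requestTimestamps.shift()
  }
  return requestTimestamps.length < PER_MINUTE_LIMIT
}

function recordRequest() {
  requestTimestamps.push(Date.now())
}

Call canSendRequest() before each fetch and recordRequest() after it; if it returns false, queue or delay the request.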

Strategy 1: Respect Retry-After

The simplest approach—wait as long as the API tells you:

async function fetchWithRetryAfter(url, apiKey) {
  const response = await fetch(
    `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
  )

  if (response.status === 429) {
    const retryAfter = Number.parseInt(response.headers.get('Retry-After') || '60', 10)
    console.log(`Rate limited. Waiting ${retryAfter}s`)
    await new Promise(r => setTimeout(r, retryAfter * 1000))
    return fetchWithRetryAfter(url, apiKey)
  }

  return response.json()
}

Strategy 2: Exponential Backoff

For transient errors and rate limits, increase wait time on each retry:

async function fetchWithBackoff(url, apiKey, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch(
      `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
    )

    if (response.ok) {
      return response.json()
    }

    if (response.status === 429 || response.status >= 500) {
      const baseDelay = 1000
      const delay = baseDelay * 2 ** attempt + Math.random() * 1000
      console.log(`Attempt ${attempt + 1} failed. Waiting ${Math.round(delay)}ms`)
      await new Promise(r => setTimeout(r, delay))
      continue
    }

    // Non-retryable error
    throw new Error(`PSI API error: ${response.status}`)
  }

  throw new Error('Max retries exceeded')
}

The random jitter prevents thundering herd when multiple clients retry simultaneously.

Strategy 3: Queue with Throttling

Prevent rate limits by controlling request rate upfront:

import PQueue from 'p-queue'

// Stay under the reported ~240/min limit: 3 requests/second (180/min) leaves headroom
const queue = new PQueue({
  concurrency: 3,
  interval: 1000,
  intervalCap: 3,
})

async function fetchPSI(url, apiKey) {
  return queue.add(async () => {
    const response = await fetch(
      `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
    )

    if (!response.ok) {
      throw new Error(`PSI error: ${response.status}`)
    }

    return response.json()
  })
}

// Queue handles rate limiting automatically
const urls = ['https://example.com/page1', 'https://example.com/page2']
const results = await Promise.all(urls.map(url => fetchPSI(url, API_KEY)))

Strategy 4: Circuit Breaker

Stop making requests when the API is consistently failing:

class CircuitBreaker {
  constructor(threshold = 5, resetTime = 60000) {
    this.failures = 0 // consecutive failures seen
    this.threshold = threshold // failures before the breaker opens
    this.resetTime = resetTime // how long to stay open (ms)
    this.isOpen = false
    this.lastFailure = 0
  }

  async call(fn) {
    if (this.isOpen) {
      // Still cooling down: fail fast without hitting the API
      if (Date.now() - this.lastFailure < this.resetTime) {
        throw new Error('Circuit breaker open')
      }
      // Cool-down elapsed: let requests through again
      this.isOpen = false
      this.failures = 0
    }

    const result = await fn().catch((error) => {
      this.failures++
      this.lastFailure = Date.now()

      if (this.failures >= this.threshold) {
        this.isOpen = true
        console.log('Circuit breaker opened')
      }

      throw error
    })

    // A success resets the failure count
    this.failures = 0
    return result
  }
}

const breaker = new CircuitBreaker()

async function fetchPSI(url) {
  return breaker.call(() =>
    fetch(`https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${API_KEY}`)
      .then((r) => {
        if (!r.ok)
          throw new Error(`Status ${r.status}`)
        return r.json()
      })
  )
}

Monitoring Usage

Check your quota in Google Cloud Console:

  1. Go to APIs & Services → Enabled APIs
  2. Click PageSpeed Insights API
  3. View Quotas tab

The API itself won't report your usage, so track it in your own code:

// Track requests locally (the API exposes no usage headers)
let requestsToday = 0
let resetTime = getNextMidnightPT()

function trackRequest() {
  if (Date.now() > resetTime) {
    requestsToday = 0
    resetTime = getNextMidnightPT()
  }
  requestsToday++

  if (requestsToday > 24000) {
    console.warn(`Approaching daily limit: ${requestsToday}/25000`)
  }
}

// Next midnight Pacific Time, when the daily quota resets
// (approximate around DST transitions)
function getNextMidnightPT() {
  const now = new Date()
  // PT wall-clock time, rolled forward to the next midnight, shifted back to a real timestamp
  const pt = new Date(now.toLocaleString('en-US', { timeZone: 'America/Los_Angeles' }))
  const midnight = new Date(pt)
  midnight.setHours(24, 0, 0, 0)
  return now.getTime() + (midnight.getTime() - pt.getTime())
}

Quota Increase

Need more than 25,000 requests/day? You have options:

Multiple GCP Projects

Each project gets its own 25k/day quota, so splitting work across projects is a common workaround for large-scale operations. Rotate API keys across projects to scale beyond single-project limits.
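
One way to rotate is a simple round-robin over one key per project. A minimal sketch, assuming keys are stored in environment variables (PSI_KEY_PROJECT_A/B and fetchPSIRotated are illustrative names, not official tooling):

// Hypothetical round-robin rotation across API keys from separate GCP projects
const API_KEYS = [process.env.PSI_KEY_PROJECT_A, process.env.PSI_KEY_PROJECT_B]
let keyIndex = 0

function nextApiKey() {
  const key = API_KEYS[keyIndex]
  keyIndex = (keyIndex + 1) % API_KEYS.length
  return key
}

async function fetchPSIRotated(url) {
  const response = await fetch(
    `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${nextApiKey()}`
  )
  if (!response.ok)
    throw new Error(`PSI error: ${response.status}`)
  return response.json()
}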

Request a Quota Increase

  1. Go to APIs & Services → Quotas
  2. Find Queries per day
  3. Click Edit Quotas
  4. Submit request with justification

Google reviews requests manually. They're reportedly reasonable about increases for legitimate use cases, but approval isn't guaranteed. There's no paid tier — the API is 100% free.

Use CrUX API Instead

If you only need field data (real user Core Web Vitals), the CrUX API has no daily limit — just 150/minute. It's also faster since there's no Lighthouse analysis.
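
A minimal sketch of a CrUX API query for origin-level field data (the metric list here is trimmed to the Core Web Vitals; adjust as needed):

// Query the CrUX API for an origin's field data (no Lighthouse run involved)
async function fetchCrux(origin, apiKey) {
  const response = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        origin,
        metrics: ['largest_contentful_paint', 'interaction_to_next_paint', 'cumulative_layout_shift'],
      }),
    }
  )

  if (!response.ok)
    throw new Error(`CrUX error: ${response.status}`)

  return response.json()
}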

When Limits Aren't Enough

The math doesn't work for site-wide testing:

Site Size | Daily Quota | Days to Test
25,000 pages | 25,000 | 1+
50,000 pages | 25,000 | 2+
100,000 pages | 25,000 | 4+

And that's assuming you test each page once. Weekly monitoring? Monthly monitoring? The quota depletes fast.

Skip the Rate Limit Dance

Unlighthouse runs Lighthouse locally. No API keys, no quotas, no rate limits.

npx unlighthouse --site https://your-site.com

Test your entire site as often as you want.

Try Unlighthouse