Handle quotas without breaking your integration.
| Limit | Requests | Scope |
|---|---|---|
| Per day | 25,000 | Per project |
| Per 100 seconds | 400 | Per project |
| Per 100 seconds per user | 60 | Per API key |
The daily limit resets at midnight Pacific Time. The per-100-second limits are rolling windows.
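Translated into pacing, each limit implies a minimum average spacing between requests; a quick back-of-the-envelope calculation:

```js
// Minimum average milliseconds between requests implied by each limit
const msBetween = (requests, windowSeconds) => (windowSeconds / requests) * 1000

msBetween(400, 100)     // 250 ms (per-project, 100-second window)
msBetween(60, 100)      // ~1,667 ms (per-API-key, 100-second window)
msBetween(25000, 86400) // ~3,456 ms to spread the daily quota across 24 hours
```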
The API returns HTTP 429 when you hit a limit:
```js
const response = await fetch(url)
if (response.status === 429) {
  const retryAfter = response.headers.get('Retry-After')
  console.log(`Rate limited. Retry after ${retryAfter} seconds`)
}
```
The Retry-After header tells you how long to wait. If it's missing, default to 60 seconds.
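One wrinkle: per the HTTP spec, `Retry-After` can be either a number of seconds or an HTTP date. A small helper (hypothetical, not part of the API) covers both forms:

```js
// Parse Retry-After, which may be delay-seconds ("120") or an HTTP-date.
// Falls back to 60 seconds when the header is missing or unreadable.
function parseRetryAfter(headerValue, fallbackSeconds = 60) {
  if (!headerValue) return fallbackSeconds
  const seconds = Number.parseInt(headerValue, 10)
  if (!Number.isNaN(seconds)) return seconds
  const date = Date.parse(headerValue)
  if (!Number.isNaN(date)) return Math.max(0, Math.round((date - Date.now()) / 1000))
  return fallbackSeconds
}
```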
The simplest approach is to wait as long as the API tells you to:
```js
async function fetchWithRetryAfter(url, apiKey) {
  const response = await fetch(
    `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
  )
  if (response.status === 429) {
    const retryAfter = Number.parseInt(response.headers.get('Retry-After') || '60', 10)
    console.log(`Rate limited. Waiting ${retryAfter}s`)
    await new Promise(r => setTimeout(r, retryAfter * 1000))
    return fetchWithRetryAfter(url, apiKey)
  }
  return response.json()
}
```
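A call blocks until it gets a non-429 response; `API_KEY` here is assumed to be your own key:

```js
// Resolves with the parsed PSI JSON once the API stops returning 429
const report = await fetchWithRetryAfter('https://example.com', API_KEY)
console.log(report.lighthouseResult?.categories?.performance?.score)
```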
For transient errors and rate limits, increase wait time on each retry:
```js
async function fetchWithBackoff(url, apiKey, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch(
      `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
    )
    if (response.ok) {
      return response.json()
    }
    if (response.status === 429 || response.status >= 500) {
      const baseDelay = 1000
      const delay = baseDelay * 2 ** attempt + Math.random() * 1000
      console.log(`Attempt ${attempt + 1} failed. Waiting ${Math.round(delay)}ms`)
      await new Promise(r => setTimeout(r, delay))
      continue
    }
    // Non-retryable error
    throw new Error(`PSI API error: ${response.status}`)
  }
  throw new Error('Max retries exceeded')
}
```
The random jitter prevents thundering herd when multiple clients retry simultaneously.
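If many workers share one key, an alternative worth knowing is "full jitter", where the whole delay is randomized up to the exponential cap instead of adding a small fixed offset; a sketch:

```js
// Full jitter: pick a uniformly random delay between 0 and the exponential cap
function fullJitterDelay(attempt, baseDelay = 1000, maxDelay = 60000) {
  const cap = Math.min(maxDelay, baseDelay * 2 ** attempt)
  return Math.random() * cap
}
```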
Prevent rate limits by controlling request rate upfront:
```js
import PQueue from 'p-queue'

// Stay under the 400-per-100-seconds project limit: 4 per second = 400 per 100 s
const queue = new PQueue({
  concurrency: 4,
  interval: 1000,
  intervalCap: 4,
})

async function fetchPSI(url, apiKey) {
  return queue.add(async () => {
    const response = await fetch(
      `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${apiKey}`
    )
    if (!response.ok) {
      throw new Error(`PSI error: ${response.status}`)
    }
    return response.json()
  })
}

// The queue handles rate limiting automatically
const urls = ['https://example.com/page1', 'https://example.com/page2']
const results = await Promise.all(urls.map(url => fetchPSI(url, API_KEY)))
```
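Keep in mind that the per-API-key limit from the table above (60 requests per 100 seconds) is stricter than the per-project limit, so if everything runs through one key a slower queue is safer; the values below are illustrative:

```js
// Roughly 50 requests per 100 seconds, comfortably under the 60/100s per-key limit
const perKeyQueue = new PQueue({
  concurrency: 1,
  interval: 2000, // one slot every two seconds
  intervalCap: 1,
})
```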
Stop making requests when the API is consistently failing:
```js
class CircuitBreaker {
  constructor(threshold = 5, resetTime = 60000) {
    this.failures = 0
    this.threshold = threshold
    this.resetTime = resetTime
    this.isOpen = false
    this.lastFailure = 0
  }

  async call(fn) {
    if (this.isOpen) {
      // Stay open until resetTime has passed, then allow a trial request through
      if (Date.now() - this.lastFailure < this.resetTime) {
        throw new Error('Circuit breaker open')
      }
      this.isOpen = false
      this.failures = 0
    }
    const result = await fn().catch((error) => {
      this.failures++
      this.lastFailure = Date.now()
      if (this.failures >= this.threshold) {
        this.isOpen = true
        console.log('Circuit breaker opened')
      }
      throw error
    })
    this.failures = 0
    return result
  }
}

const breaker = new CircuitBreaker()

async function fetchPSI(url) {
  return breaker.call(() =>
    fetch(`https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}&key=${API_KEY}`)
      .then((r) => {
        if (!r.ok)
          throw new Error(`Status ${r.status}`)
        return r.json()
      })
  )
}
```
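These patterns compose; one way to wire them together, reusing the `queue`, `breaker`, and `fetchWithBackoff` defined above, is a wrapper like this sketch:

```js
// The queue paces outgoing requests, the breaker stops calling a failing API,
// and the backoff retries transient errors within a single request.
async function fetchPSIGuarded(url, apiKey) {
  return queue.add(() => breaker.call(() => fetchWithBackoff(url, apiKey)))
}
```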
You can check your current usage on the PageSpeed Insights API's quota page in the Google Cloud Console.
You can also approximate it in code by tracking requests locally:
```js
// Track requests locally
let requestsToday = 0
let resetTime = getNextMidnightPT()

function trackRequest() {
  if (Date.now() > resetTime) {
    requestsToday = 0
    resetTime = getNextMidnightPT() // roll the window forward to the next reset
  }
  requestsToday++
  if (requestsToday > 24000) {
    console.warn(`Approaching daily limit: ${requestsToday}/25000`)
  }
}
```
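`getNextMidnightPT` isn't defined above; one possible implementation steps forward until the wall clock in America/Los_Angeles reads midnight, so DST is handled by `Intl` rather than a hard-coded offset:

```js
// Returns the timestamp (ms) of the next midnight in Pacific Time
function getNextMidnightPT() {
  const fmt = new Intl.DateTimeFormat('en-US', {
    timeZone: 'America/Los_Angeles',
    hour: '2-digit',
    minute: '2-digit',
    hourCycle: 'h23',
  })
  const step = 15 * 60 * 1000 // PT offsets are whole hours, so 15-minute steps suffice
  let t = Math.ceil(Date.now() / step) * step
  while (fmt.format(new Date(t)) !== '00:00') t += step
  return t
}
```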
Need more than 25,000 requests per day? You can request a quota increase from the quota page in the Google Cloud Console, but Google reviews these requests manually and approval isn't guaranteed.
The math doesn't work for site-wide testing:
| Site Size | Daily Quota | Days to Test |
|---|---|---|
| 25,000 pages | 25,000 | 1+ |
| 50,000 pages | 25,000 | 2+ |
| 100,000 pages | 25,000 | 4+ |
And that's assuming you test each page once. Weekly monitoring? Monthly monitoring? The quota depletes fast.
Unlighthouse runs Lighthouse locally. No API keys, no quotas, no rate limits.
```bash
npx unlighthouse --site https://your-site.com
```
Test your entire site as often as you want.
Try Unlighthouse