Google Search Console API Rate Limits
GSC API allows 1,200 queries per minute per site. URL Inspection is stricter at 600 QPM. Learn backoff strategies and how to avoid 429 errors.
The GSC API enforces strict rate limits to prevent abuse and ensure fair access. Understanding these limits is critical for building reliable integrations.
Search Analytics Limits
Per Site/User:
- 1,200 queries per minute (QPM), equivalent to 20 queries per second
- Applies per property per user according to Google's quota documentation
Per Project:
- 30 million queries per day (QPD)
- 40,000 queries per minute (QPM)
If you exceed these limits, you'll receive a 429 Too Many Requests error. Google requires a 15-minute backoff after hitting load quotas.
URL Inspection Limits
The URL Inspection API has much stricter limits:
- 2,000 queries per day per site
- 600 queries per minute per site (10 QPS)
These limits make URL Inspection unsuitable for bulk operations. Use it sparingly for real-time indexing checks.
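With only 2,000 inspections per day per site, it helps to meter usage client-side before calls ever reach the API. A minimal sketch of a daily budget guard (the `DailyBudget` class and its interface are illustrative, not part of any Google library; the real quota is enforced server-side):

```typescript
// Client-side guard for the 2,000-inspections/day budget (illustrative;
// Google enforces the actual quota on the server).
class DailyBudget {
  private used = 0
  private day = ""

  constructor(private readonly limit: number) {}

  // Consumes one unit if budget remains for `date` (YYYY-MM-DD); returns
  // false once the day's budget is exhausted.
  tryConsume(date: string): boolean {
    if (date !== this.day) { // new day: reset the counter
      this.day = date
      this.used = 0
    }
    if (this.used >= this.limit) return false
    this.used++
    return true
  }

  remaining(): number {
    return this.limit - this.used
  }
}
```

Wrap each URL Inspection call in a `tryConsume` check so a runaway job stops spending quota before the API starts rejecting it.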
Row Limit
The API returns a maximum of 50,000 rows per day per search type, sorted by clicks. This means:
- Only the top 50K pages/keywords by clicks are exposed
- Long-tail queries with few clicks may be invisible
- Aggregate totals (impressions, clicks) remain accurate even if rows are truncated
See response limits for details on working with this constraint.
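One practical consequence: a single Search Analytics request returns at most 25,000 rows, so approaching the daily 50K ceiling means paging with the API's `startRow` and `rowLimit` parameters. A sketch of that loop, with the actual HTTP call abstracted behind an injected `QueryFn` so the paging logic stands on its own (the `Row` shape mirrors the Search Analytics response):

```typescript
// One page of Search Analytics rows, in the shape the API returns.
type Row = { keys: string[]; clicks: number; impressions: number }
type QueryFn = (startRow: number, rowLimit: number) => Promise<Row[]>

// Page through results with startRow until a short page signals the end.
// The API caps rowLimit at 25,000 per request; rows stop at the daily
// 50K-per-search-type ceiling no matter how far you page.
async function fetchAllRows(query: QueryFn, pageSize = 25000): Promise<Row[]> {
  const all: Row[] = []
  for (let startRow = 0; ; startRow += pageSize) {
    const rows = await query(startRow, pageSize)
    all.push(...rows)
    if (rows.length < pageSize) break // short (or empty) page: done
  }
  return all
}
```

In practice `query` would wrap `gsc.searchanalytics.query`, passing `startRow` and `rowLimit` in the request body alongside your dates and dimensions.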
Query Cost Factors
Not all queries are equal. Cost increases with:
- Dimensions: page + query grouping is most expensive
- Date range: 6-month requests cost significantly more than single-day queries
- Filter complexity: multiple filters increase processing time
When syncing historical data, prefer daily queries over large date ranges to reduce per-query cost and improve parallelization.
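Expanding a range into per-day requests is simple enough to sketch directly (the helper name is ours):

```typescript
// Expand an inclusive YYYY-MM-DD range into one-day [start, end] pairs,
// each of which becomes its own lightweight query.
function dailyRanges(startDate: string, endDate: string): Array<[string, string]> {
  const out: Array<[string, string]> = []
  const end = new Date(`${endDate}T00:00:00Z`)
  for (let d = new Date(`${startDate}T00:00:00Z`); d <= end; d.setUTCDate(d.getUTCDate() + 1)) {
    const day = d.toISOString().slice(0, 10) // back to YYYY-MM-DD
    out.push([day, day])
  }
  return out
}
```

Each pair then maps onto one `searchanalytics.query` call, which keeps per-query cost low and lets the calls run in parallel.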
Batch Requests
The API supports batching up to 1,000 calls per HTTP request. This doesn't increase your quota, but reduces overhead:
// Batch multiple single-day queries into one HTTP request
const batch = dates.map(date => ({
  siteUrl: 'https://example.com',
  requestBody: {
    startDate: date,
    endDate: date,
    dimensions: ['page']
  }
}))
// One HTTP round trip, up to 1,000 API calls
const results = await gsc.searchanalytics.batch({ requests: batch })
Batching is most useful when you have many lightweight queries (single dates, simple dimensions).
Exponential Backoff
When you hit a 429 error, retry with truncated exponential backoff, doubling the wait after each failed attempt up to a maximum number of retries:
async function fetchWithBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 5
): Promise<T> {
  let delay = 1000 // Start at 1 second
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn()
    } catch (error: any) {
      if (error.code !== 429 || attempt === maxRetries) {
        throw error
      }
      // Exponential backoff: 1s, 2s, 4s, 8s, 16s
      await new Promise(resolve => setTimeout(resolve, delay))
      delay *= 2
    }
  }
  throw new Error('Max retries exceeded')
}

// Usage
const data = await fetchWithBackoff(() =>
  gsc.searchanalytics.query({
    siteUrl: 'https://example.com',
    requestBody: {
      startDate: '2025-01-01',
      endDate: '2025-01-01',
      dimensions: ['page']
    }
  })
)
Add jitter to prevent thundering herd when multiple processes back off simultaneously:
const jitter = Math.random() * 1000 // 0-1s random delay
await new Promise(resolve => setTimeout(resolve, delay + jitter))
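A variant worth knowing is "full jitter", where each sleep is drawn uniformly between zero and the current exponential ceiling rather than added on top of it. A small, testable helper (the name and defaults are illustrative; `random` is injectable so the schedule can be verified):

```typescript
// Backoff delay for a given attempt with full jitter: uniform in
// [0, base * 2^attempt], capped at maxDelayMs.
function backoffDelay(
  attempt: number,
  baseMs = 1000,
  maxDelayMs = 32000,
  random: () => number = Math.random
): number {
  const ceiling = Math.min(maxDelayMs, baseMs * 2 ** attempt)
  return random() * ceiling
}
```

Full jitter spreads retries across the whole window, which tends to desynchronize competing workers faster than a fixed delay plus a small random offset.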
Rate Limit Strategy
For syncing large properties:
- Parallelize daily queries - Fetch one day per request to keep each call cheap and stay under QPM limits
- Queue jobs - Use a job queue with configurable concurrency (10-15 concurrent requests is safe)
- Respect 429s - Always implement backoff; don't hammer the API
- Monitor quotas - Track your project's daily quota usage in Google Cloud Console
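The queue point above can be sketched as a minimal promise pool; a production system would use a real job queue (BullMQ, Cloud Tasks, and the like), but the concurrency cap works the same way (the helper name is ours):

```typescript
// Run async jobs with at most `concurrency` in flight at once,
// preserving result order. Real job queues add retries and persistence.
async function runPool<T>(jobs: Array<() => Promise<T>>, concurrency: number): Promise<T[]> {
  const results: T[] = new Array(jobs.length)
  let next = 0
  async function worker(): Promise<void> {
    while (next < jobs.length) {
      const i = next++ // claim the next job index
      results[i] = await jobs[i]()
    }
  }
  const workers = Array.from({ length: Math.min(concurrency, jobs.length) }, worker)
  await Promise.all(workers)
  return results
}
```

With `concurrency` set around 10-15 per the guidance above, each job wrapping a single-day `fetchWithBackoff` call stays comfortably inside the per-minute limits.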
See authentication for project setup and querying data for efficient query patterns.
Related Articles
- GSC API Query Builder - Build queries with dimensions and filters
- GSC API Authentication - OAuth setup and token management
- Export Row Limits - Understanding the 25K/50K row caps
- GSC MCP Server - Skip rate limits with pre-synced data