Google Search Console API

Access GSC performance data programmatically. Learn API limits, authentication, and query patterns for SEO automation.

Harlan Wilton

The Google Search Console API lets you access search performance data programmatically (clicks, impressions, CTR, position) for any property you own. It's the foundation for SEO automation, custom dashboards, and AI-powered analysis.

What the API Provides

The Search Console API exposes three primary resources:

Performance Data - Search analytics (clicks, impressions, CTR, position) broken down by query, page, country, device, search appearance. This is what most developers use.

URL Inspection - Index status, crawl info, mobile usability, structured data validation for specific URLs. Useful for debugging indexing issues.

Sitemaps - Submit, list, and check sitemap status. Automate sitemap management for large sites.

API vs UI Capabilities

The GSC web interface shows 1,000 rows max. The API returns up to 25,000 rows per request and allows 50,000 rows per day per property according to Google's documentation.

For large sites, this matters. A lot.

The UI also rounds numbers at scale. The API gives you exact counts (within Google's sampling limitations).

Common Use Cases

SEO Dashboards - Pull GSC data into your own analytics stack. Combine with GA4, rank tracking, CMS data.

Automated Reporting - Schedule Python scripts to export daily metrics. Email stakeholders without manual exports.

AI Workflows - Feed search performance into Claude or GPT for content recommendations, keyword gap analysis, opportunity detection.

Large Site Management - For sites with 10k+ indexed pages, the UI's 1k row limit is unusable. The API is the only way to see full data.

Historical Backfill - GSC UI shows 16 months. The API lets you archive data permanently before it disappears.
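For the backfill use case, a common pattern is to split the full date span into calendar-month chunks and issue one request per chunk. A minimal sketch (`monthChunks` is a hypothetical helper, not part of any Google library):

```javascript
// Hypothetical helper: split a date span into calendar-month chunks so each
// backfill request stays well under the API's per-request row limits.
function monthChunks(startDate, endDate) {
  const chunks = []
  let cursor = new Date(startDate + 'T00:00:00Z')
  const end = new Date(endDate + 'T00:00:00Z')
  while (cursor <= end) {
    // Last day of the cursor's month (day 0 of the following month).
    const monthEnd = new Date(Date.UTC(cursor.getUTCFullYear(), cursor.getUTCMonth() + 1, 0))
    const chunkEnd = monthEnd < end ? monthEnd : end
    chunks.push({
      startDate: cursor.toISOString().slice(0, 10),
      endDate: chunkEnd.toISOString().slice(0, 10)
    })
    // Advance to the first day of the next month.
    cursor = new Date(Date.UTC(cursor.getUTCFullYear(), cursor.getUTCMonth() + 1, 1))
  }
  return chunks
}
```

Each `{startDate, endDate}` pair can then be dropped straight into the request body shown in the Quick Start below.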

Quick Start Example

// Fetch last 7 days of clicks/impressions by query.
// accessToken must be an OAuth 2.0 token with the
// https://www.googleapis.com/auth/webmasters.readonly scope.
const siteUrl = 'sc-domain:example.com' // or 'https://example.com/'
const response = await fetch(
  `https://searchconsole.googleapis.com/webmasters/v3/sites/${encodeURIComponent(siteUrl)}/searchAnalytics/query`,
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      startDate: '2025-01-20',
      endDate: '2025-01-26',
      dimensions: ['query'],
      rowLimit: 25000
    })
  }
)

if (!response.ok) throw new Error(`GSC API error: ${response.status}`)
const data = await response.json()
console.log(data.rows) // Array of {keys: ['query'], clicks, impressions, ctr, position}

Note: GSC data has a 2-3 day lag; data for a given day typically becomes available about 48 hours later.

API Limitations You Should Know

Rate Limits - 1,200 queries per minute per site/user (20 QPS). URL Inspection is stricter: 600 QPM. See Rate Limits for details.
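When you bump into those limits the API responds with HTTP 429, so bulk scripts usually wrap requests in retry-with-exponential-backoff. A minimal sketch (`doRequest` and `sleep` are illustrative names, not part of any Google library):

```javascript
// Sketch: retry a request-producing function with exponential backoff whenever
// it reports rate limiting (HTTP 429).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

async function withBackoff(doRequest, maxRetries = 5, baseDelayMs = 1000) {
  for (let attempt = 0; ; attempt++) {
    const res = await doRequest()
    if (res.status !== 429) return res // success, or a non-retryable error
    if (attempt >= maxRetries) throw new Error('rate limited: retries exhausted')
    await sleep(2 ** attempt * baseDelayMs) // 1s, 2s, 4s, ...
  }
}
```

Usage is just `await withBackoff(() => fetch(url, options))`; non-429 errors pass through for the caller to handle.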

Row Limits - 25k rows per request, 50k rows per day per property. For large sites, this means strategic date/dimension chunking.
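To collect more than 25k rows in one pass, the query endpoint accepts a `startRow` offset in the request body; you page forward until a short page signals the end. A minimal sketch (`fetchPage` is an illustrative callback that POSTs to `searchAnalytics/query` with the given `startRow` and resolves to the parsed response body):

```javascript
// Sketch: page through search analytics results with startRow until a page
// comes back shorter than rowLimit, meaning there is no more data.
async function fetchAllRows(fetchPage, rowLimit = 25000) {
  const rows = []
  for (let startRow = 0; ; startRow += rowLimit) {
    const page = await fetchPage(startRow)
    const pageRows = page.rows ?? []
    rows.push(...pageRows)
    if (pageRows.length < rowLimit) break // short (or empty) page: done
  }
  return rows
}
```

Keep the daily 50k-row-per-property cap in mind: deep pagination plus wide date ranges can exhaust it quickly, which is where date chunking comes in.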

Data Sampling - When grouping by page + query, Google drops data to reduce cardinality. Large sites lose ~66% of impression data. This is documented but often surprising.

No Backfill - BigQuery export only syncs forward from connection date. Historical data requires manual API calls before it expires.


Why gscdump Exists

The API's 25k row limit and 16-month retention mean you're constantly fighting data loss. gscdump pulls your full GSC dataset daily, stores it permanently, and exposes it via MCP for Claude Code workflows.

No row limits. No retention windows. Query your complete search history with AI.

Try gscdump free: gscdump.com

© 2026 GSCDUMP.COM - BUILT FOR DEVELOPERS