Google Search Console Export Limits

GSC UI exports 1,000 rows max. The API allows 25,000 per request, 50,000 per day. Learn workarounds and why large sites lose 66% of their data.

Harlan Wilton

Google Search Console has strict export limits that prevent most sites from accessing their complete search data. The UI shows 1,000 rows maximum, the API allows 25,000 rows per request, and there's a hard daily limit of 50,000 rows per property.

The Export Limits

UI Export: 1,000 Rows Maximum

The Search Console interface caps exports at 1,000 rows. Click "Export" on any report (pages, queries, countries) and you'll get exactly 1,000 rows, even if your site has millions of keywords.

For context: Amazon ranks for 71.8 million keywords according to singrain.com's Amazon keyword database. The UI export would show 0.0014% of their data.

API: 25,000 Rows Per Request

The Search Analytics API allows 25,000 rows per request. This is a per-request limit: you can make multiple requests to fetch more data.
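As a sketch of how that pagination works (the body fields follow the documented `searchanalytics.query` request format; the dimension list is just an example, and the authenticated HTTP call itself is omitted), each page steps `startRow` forward by 25,000:

```python
# Build paged request bodies for the Search Analytics API.
# Fields (startDate, endDate, dimensions, rowLimit, startRow) follow
# the documented searchanalytics.query request format.
ROW_LIMIT = 25_000  # per-request maximum

def paged_bodies(start_date: str, end_date: str, pages: int) -> list:
    """Return one request body per page, stepping startRow by 25k."""
    return [
        {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query", "page"],
            "rowLimit": ROW_LIMIT,
            "startRow": page * ROW_LIMIT,
        }
        for page in range(pages)
    ]

bodies = paged_bodies("2025-01-01", "2025-01-31", pages=2)
```

In practice you stop paginating when a response returns fewer rows than `rowLimit`; the 50,000-row daily cap described below still applies across all pages combined.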

Daily Hard Limit: 50,000 Rows

Google enforces a 50,000 row per day limit per property per search type (web, image, video). From Google's documentation:

"There's a limit of 50,000 rows per day per property per search type for all queries combined."

This is the actual bottleneck. Even with pagination, you can't exceed 50,000 total rows daily from a single property.

Why Large Sites Lose Data

The Sampling Problem

When your site generates more than 50,000 rows of data for a date range, Google samples the results. A similar.ai analysis found that large sites lose ~66% of GSC impression data to sampling.

Case Study: A 12X Discrepancy

Industry analysis has documented extreme examples of data loss. In one verified case study:

  • Total clicks in GSC: 480 clicks
  • Clicks with query detail: 41 clicks (8.5%)
  • Missing data: 439 clicks (91.5%)

This is a nearly 12X discrepancy (480 / 41 ≈ 11.7) between aggregate metrics and exportable detail data, caused by Google's privacy filtering and row limits.

Enterprise vs. Reality

The gap is massive for large sites:

  • Amazon: 71.8M ranking keywords
  • GSC UI export: 1,000 rows
  • 99.99% of data invisible in exports

Workarounds

1. Date Range Filtering

The 50,000-row cap applies per day of data, not per API session. Splitting a large range into smaller windows lets you paginate each window separately:

Days 1-3:  50,000 rows
Days 4-6:  50,000 rows
Days 7-9:  50,000 rows

Pagination math:

  • 50k/day × 30 days = 1.5M rows possible per month
  • 50k/day × 90 days = 4.5M rows possible per quarter

This spreads the 50k limit across multiple API calls.
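The date-chunking above is pure date arithmetic and can be sketched as follows (no GSC calls; the 3-day window size is just an example):

```python
from datetime import date, timedelta

def date_chunks(start: date, end: date, days: int = 3):
    """Yield (chunk_start, chunk_end) pairs covering start..end inclusive."""
    cur = start
    while cur <= end:
        chunk_end = min(cur + timedelta(days=days - 1), end)
        yield (cur, chunk_end)
        cur = chunk_end + timedelta(days=1)

chunks = list(date_chunks(date(2025, 1, 1), date(2025, 1, 9)))
# three 3-day windows: Jan 1-3, Jan 4-6, Jan 7-9
```

Each yielded pair becomes the `startDate`/`endDate` of its own sequence of paginated API requests.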

2. Multiple Properties

The similar.ai analysis showed that adding 50 properties (splitting by subdomain and path) reduced impression loss from 67% → 11%.

Each property gets its own 50,000/day quota. If you operate example.com, blog.example.com, and shop.example.com, you get 150,000 rows/day total.

Limitation: each property must be separately verified in GSC, and you can't retroactively split data that was only collected under the parent property.
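One way to exploit the per-property quotas is a small ledger that caps each fetch at the property's remaining daily allowance (a sketch; the property identifiers are examples, and the fetch itself is out of scope):

```python
DAILY_QUOTA = 50_000  # rows per property per day

class QuotaLedger:
    """Track rows fetched per property so a sync job stops at each cap."""

    def __init__(self, properties):
        self.remaining = {p: DAILY_QUOTA for p in properties}

    def take(self, prop: str, rows: int) -> int:
        """Consume up to `rows` from a property's quota; return what's allowed."""
        allowed = min(rows, self.remaining[prop])
        self.remaining[prop] -= allowed
        return allowed

ledger = QuotaLedger(["sc-domain:example.com", "https://blog.example.com/"])
got = ledger.take("sc-domain:example.com", 60_000)  # capped at 50,000
```

With 50 properties, the same ledger gives a combined allowance of 2.5M rows/day.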

3. BigQuery Export

Google's BigQuery export bypasses row limits entirely. A FindLaw white paper found BigQuery showed 8.75x more data than the API:

  • API queries: 40,000
  • BigQuery queries: 350,000

For the same site, same date range.

Limitations:

  • Requires Google Cloud billing
  • Data has a 2-3 day delay
  • Queries cost ~$5 per TB scanned
  • No pre-2024 historical data for most sites
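Once the bulk export is running, querying it is plain SQL. A sketch of a query builder (the `searchconsole.searchdata_url_impression` table and its `data_date`, `query`, `clicks`, `impressions` columns follow the documented export schema; the project ID is a placeholder):

```python
def gsc_export_query(project: str, start: str, end: str) -> str:
    """Build a SQL query over the GSC bulk-export table in BigQuery."""
    return f"""
        SELECT query, SUM(clicks) AS clicks, SUM(impressions) AS impressions
        FROM `{project}.searchconsole.searchdata_url_impression`
        WHERE data_date BETWEEN '{start}' AND '{end}'
        GROUP BY query
        ORDER BY clicks DESC
    """

sql = gsc_export_query("my-project", "2025-01-01", "2025-01-31")
```

Run it with the google-cloud-bigquery client. No LIMIT clause is needed, since there is no row cap; the trade-off is that each scan is billed by the terabyte.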

4. Continuous Syncing

The row limits are per-query, not cumulative. If you fetch 1 day of data (50k rows max) every day, you avoid hitting limits because each query requests fresh data.

This is why automated tools work: they stay under the 50k threshold by syncing narrow date ranges continuously.
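Continuous syncing reduces to "fetch each day I haven't synced yet, one day per request." A sketch of the scheduling logic (pure date arithmetic; the 2-day buffer reflects GSC's usual data freshness lag):

```python
from datetime import date, timedelta

def days_to_sync(last_synced: date, today: date) -> list:
    """Return each unsynced day, oldest first, to fetch one per request.
    GSC data lags by a couple of days, so stop short of today."""
    latest_available = today - timedelta(days=2)
    n = (latest_available - last_synced).days
    return [last_synced + timedelta(days=i + 1) for i in range(max(n, 0))]

pending = days_to_sync(date(2025, 3, 1), date(2025, 3, 10))
```

Each date in `pending` becomes a single-day `startDate`/`endDate` request, which stays comfortably under the 50k cap for most sites.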

How gscdump Solves This

gscdump syncs your GSC data daily using date-filtered API requests. Each sync fetches one day at a time, staying well under the 50,000 row limit. Your data accumulates in a dedicated database with no export caps.

Results:

  • Full history: access all historical data, not just GSC's 16-month window
  • No sampling: complete detail rows for every keyword, page, and country
  • Unlimited exports: query millions of rows via API or MCP

The raw data stays in D1 indefinitely. Query it however you need.

Alternatives Comparison

Method            Row Limit              Historical Data    Cost
GSC UI            1,000                  16 months          Free
GSC API (manual)  25k/request, 50k/day   16 months          Free
BigQuery          Unlimited              From setup date    ~$5/TB
gscdump           Unlimited              From signup date   Free beta

When Limits Don't Matter

If your site generates less than 50,000 unique query×page×country×device combinations per day, you'll get complete data from the API. This covers most sites under 100k monthly visitors.

The limits only hurt larger sites with:

  • Thousands of indexed pages
  • Diverse keyword rankings
  • International traffic
  • Multiple device types

For these sites, the 66% data loss is unavoidable without workarounds.

Bottom Line

GSC export limits:

  • 1,000 rows via UI
  • 25,000 rows per API request
  • 50,000 rows per day per property (hard limit)

Large sites lose 66% of detail data to sampling. Workarounds exist (date filtering, multiple properties, BigQuery, or continuous syncing) but all require technical implementation.

For complete data access without sampling, use BigQuery or an automated sync tool like gscdump.
