Google Search Console Data Retention: The 16-Month Limit Explained
GSC permanently deletes search data after 16 months. Learn exactly what disappears, the real cost of lost data, and how to preserve your search history.
Google Search Console permanently deletes all search data after 16 months. No archive. No recovery. No exceptions.
What Gets Deleted
After 16 months, GSC purges:
- Query-level data (what keywords drove traffic)
- Page-level performance (clicks, impressions, position per URL)
- All click and impression counts
- Historical position tracking
- Country and device breakdowns
Everything older than 16 months vanishes completely.
Historical Context
GSC originally limited data to 90 days. In January 2018, Google increased retention to 16 months in response to user feedback.
However, data within that 16-month window is also subject to "resets." In September 2025, multiple site owners reported that historical impression data disappeared without warning: it "fell off a cliff."
According to Google and industry analysis, this was a major AI bot filtering update targeting automated scrapers and rank trackers. Google purged impressions generated by LLM scrapers and bot traffic. While this makes the data "cleaner," it creates a baseline shift: comparing October 2025 impressions (human only) against October 2024 (human + bot) shows a massive "loss" that isn't real. Without permanent historical storage to analyze these shifts, your YoY reporting becomes misleading.
The Algorithm Recovery Problem
Algorithm recoveries typically take 3-6 months. Search Logistics case studies show sites need 90-180 days post-update to measure the full impact.
Here's the timeline problem:
March 2024: Algorithm hit, traffic drops 40%
Apr-Aug 2024: Recovery period, slow climb back
September 2024: Full recovery confirmed: back to baseline
August 2025: March 2024 baseline data DELETED
By month 17, you can't measure recovery because your baseline disappeared. Can't prove you recovered. Can't show clients progress. Can't document what worked.
Year-Over-Year Comparisons Break
YoY analysis is standard for seasonal businesses. The 16-month limit kills it.
Real example:
- April 2024: You want to compare vs April 2023
- Problem: By August 2024 (month 17), April 2023 data expires
- Result: Can't compare April 2024 performance after August
For seasonal businesses, this is devastating:
- Black Friday 2023 data gone by March 2025
- Holiday 2023 baseline deleted before Holiday 2024 planning
- Summer 2023 trends unavailable for Summer 2024 strategy
You lose the ability to identify multi-year patterns.
The BigQuery "Solution" Doesn't Help
Google offers BigQuery export, but it has a critical limitation: no historical backfill.
BigQuery only captures data from the day you enable export forward. Google confirmed "no plans to add historical data exporting."
Starting BigQuery today means:
- All data before today: lost after 16 months
- All data after today: preserved permanently
The earlier you start, the more you save. But you can't recover what's already aging toward deletion.
Why 16 Months?
Google hasn't published technical reasons. Likely factors:
- Storage costs at Google's scale (billions of rows per property)
- Query performance (smaller datasets = faster UI)
- Data freshness priority (recent data matters most for active optimization)
From Google's perspective, GSC is a real-time monitoring tool, not a data warehouse. The 16-month window balances historical context with operational efficiency.
The Manual Export Workaround
Some developers export CSVs monthly via the GSC UI or API. This works, but:
- Requires remembering every month (miss one month = permanent gap)
- CSV exports lack granularity (1000 row limit on queries)
- API exports are tedious (paginate through dates, tables, dimensions)
- Storage/organization becomes your problem
It's technically possible but operationally fragile.
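The pagination tedium described above can be sketched in a few lines. This is a minimal sketch, not a complete backup tool: `query_fn` is a hypothetical stand-in for an authenticated call to the Search Analytics API's `searchanalytics.query()` method, which pages results via `startRow` and `rowLimit` in the request body.

```python
def fetch_all_rows(query_fn, start_date, end_date, dimensions, page_size=25000):
    """Page through Search Analytics results until a short page signals the end.

    `query_fn` is any callable shaped like the GSC API's query method:
    it takes a request-body dict and returns a response dict containing
    a "rows" list (an assumption standing in for the real API client).
    """
    all_rows = []
    start_row = 0
    while True:
        response = query_fn({
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": dimensions,
            "rowLimit": page_size,
            "startRow": start_row,
        })
        rows = response.get("rows", [])
        all_rows.extend(rows)
        if len(rows) < page_size:  # short (or empty) page: no more data
            break
        start_row += page_size
    return all_rows
```

With the real `googleapiclient` client, `query_fn` would wrap `service.searchanalytics().query(siteUrl=..., body=...).execute()`, and you would repeat this for every date range and dimension combination you want to preserve, which is exactly why the manual approach is fragile.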
The Better Approach
gscdump syncs your GSC data daily and stores it permanently in your own database:
- Full granularity: every query, page, date combination
- Permanent retention: data never expires
- Automatic: no manual exports, no gaps
- API access: query your own data programmatically
Data older than 16 months stays accessible because it's in your database, not Google's.
What You Should Do
If you care about long-term data:
- Start preserving now: Every month you wait is another month of data that will age out
- Don't rely on manual exports: They fail the moment you forget
- Consider automated sync: Tools like gscdump handle daily backups automatically
The 16-month limit is the most significant constraint in GSC. Once data is deleted, it's gone forever. The only solution is preservation before deletion.
The Real Cost: A Worked Example
Consider a mid-size e-commerce site with 200 pages and 15,000 ranking keywords.
Data generated per month:
- ~15,000 unique query×page combinations
- ~50 countries with traffic
- 3 device types (desktop, mobile, tablet)
- Daily granularity = ~450,000 data points/month
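The monthly figure above is straightforward arithmetic; a round 30-day month is an assumption for clean numbers, and the country/device dimensions add further rows on top:

```python
query_page_combos = 15_000  # unique query x page combinations (from the example site)
days_per_month = 30         # assumption: round 30-day month

points_per_month = query_page_combos * days_per_month
points_lost = points_per_month * 16  # everything older than the 16-month window

print(points_per_month)  # 450000
print(points_lost)       # 7200000
```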
After 16 months, this site permanently loses:
- 7.2 million data points (16 months × 450k)
- Every seasonal pattern before that window
- All pre-algorithm-update baselines
- Complete keyword discovery history (new terms that appeared and disappeared)
What this prevents:
- Can't identify which keywords you ranked for 2 years ago but lost
- Can't measure multi-year content decay
- Can't compare holiday seasons across years
- Can't prove ROI of SEO work completed 18+ months ago to stakeholders
For agencies, the last point is critical. A client asks "what did our SEO investment 2 years ago achieve?" and the data no longer exists.
Retention by Report Type
Not all GSC data follows the same 16-month rule. According to Google's Data Retention documentation:
| Report | Retention Period |
|---|---|
| Search Performance (queries, pages, clicks, impressions) | 16 months |
| Indexing / Coverage (historical trends) | 16 months |
| Crawl Stats | 90 days |
| Removals | 6 months |
| Core Web Vitals | Rolling 28 days (CrUX) |
| URL Inspection | Current state only (no history) |
| Links | Current snapshot only |
| Sitemaps | Current state + last crawl |
| Manual Actions | Until resolved |
The 16-month limit specifically affects Search Performance data, the metrics most valuable for long-term SEO analysis.
FAQ
How exactly is the 16-month window calculated?
GSC retains data on a rolling basis. Each day, the oldest day of data beyond the 16-month window is deleted. So on March 4, 2026, any data from before November 4, 2024 is gone. This happens automatically with no warning.
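The rolling cutoff described above can be computed directly. This sketch subtracts 16 calendar months from a given date; clamping the day-of-month is our own edge-case assumption (e.g. 16 months before a 31st may land in a 30-day month), since Google hasn't documented that detail.

```python
import calendar
import datetime

def retention_cutoff(today, months=16):
    """Return the date before which GSC data has already been deleted."""
    # Walk back `months` calendar months from `today`.
    month_index = (today.year * 12 + today.month - 1) - months
    year, month = divmod(month_index, 12)
    month += 1
    # Clamp the day for shorter target months (assumption, e.g. a 31st -> 30th).
    day = min(today.day, calendar.monthrange(year, month)[1])
    return datetime.date(year, month, day)

print(retention_cutoff(datetime.date(2026, 3, 4)))  # 2024-11-04
```

Running this for March 4, 2026 reproduces the November 4, 2024 cutoff from the example above.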
Can I request my old data from Google?
No. Google does not offer data recovery, export services, or extended retention for any GSC account. Once data ages past 16 months, it is permanently deleted from Google's systems. There is no paid tier, enterprise plan, or support request that recovers it.
Does BigQuery export preserve data beyond 16 months?
Yes, but only data captured from the day you enable the export forward. BigQuery does not backfill historical data. If you enable BigQuery today, everything before today still ages out of GSC normally. The earlier you start any form of data preservation, the more you save.
Was the retention limit always 16 months?
No. GSC originally limited data to 90 days. In January 2018, Google increased retention to 16 months. There has been no further increase since, and Google has not indicated plans to extend it.
Does the 16-month limit apply to the API too?
Yes. The Search Analytics API accesses the same data as the UI. The 16-month retention limit is server-side — it applies regardless of how you access the data (UI, API, BigQuery export, or third-party tools).
Related Articles
- GSC Export Row Limits — Why the API caps rows at 25,000
- BigQuery Alternative — Compare gscdump to Google's BigQuery export
- 1000 Row Limit — Getting your full keyword data
- Data Delay — Normal processing delays vs actual problems
- GSC API Authentication — Set up OAuth for automated data sync