Primitive for Google Search Console data · GSCDump


§ contents

- [1.0→NAME](#name)
- [2.0→SYNOPSIS](#synopsis)
- [3.0→INSTALL](#install)
- [4.0→OPTIONS](#options)
- [5.0→EXAMPLES](#examples)
- [6.0→COMPATIBILITY](#compat)
- [7.0→SEE ALSO](#seealso)


# gscdump(1)

1.0/NAME

gscdump — a primitive for adding Google Search Console data anywhere. Read sixteen months of GSC from any AI agent through MCP. Open source on npm.

2.0/SYNOPSIS

gscdump syncs up to 450 days of Google Search Console data into a local DuckDB store. Twenty-one analyzers ship with the CLI, and the bundled MCP server exposes the same store to any agent that speaks the protocol. Read-only, credentials encrypted at rest, no data leaves your machine.

SHELL · the entire CLI surface · 5 lines

gscdump init                                 # oauth round-trip with google
gscdump sync     --site <url>                # 16 months to local parquet
gscdump query    --site <url> [--live]       # typed query, optional live api
gscdump analyze  <tool> --site <url>         # 21 built-in seo analyzers
gscdump mcp                                  # run local mcp server

local · duckdb + parquet · `@gscdump/cli` on npm

3.0/INSTALL

One npm install, one OAuth round-trip. Data lives in a local DuckDB/Parquet store; nothing leaves your machine.

SHELL · install + first-run

# 1. install
$ npm install -g @gscdump/cli

# 2. authorise (browser opens)
$ gscdump init

[→ npm package](https://www.npmjs.com/package/@gscdump/cli) [Full install guide](https://gscdump.com/mcp)

works in: Claude Desktop · Claude Code · Cursor · Codex · Windsurf

- [!]Read-only access. We never write to your Google account.
- [*]OAuth scoped to Search Console. You can revoke at any time.
- [?]Tokens encrypted at rest with per-account keys. Open source codebase.
- [§]No tracking, no ads, no third-party cookies. DPA on request.

4.0/OPTIONS

gscdump separates the boring half from the interesting half. _read_ covers the GSC primitives — sync, query, inspect, MCP — and is the open-source CLI. _analyze_ dispatches to twenty-one built-in analyzers ranging from simple SEO heuristics to statistical change-point detection.

4.1

## READ

cli + mcp · open source

Sync up to 450 days of Google Search Console data into a local DuckDB/Parquet store. Run typed queries against it. Expose the same store to any MCP-compatible agent. Cross-process locking; idempotent sync; pagination walks past GSC's 25k-row cap.

→init

OAuth round-trip with Google; credentials stored locally.

→sites

List verified properties on the connected account.

→sitemaps

List/manage sitemaps for a site.

→inspect

URL Inspection API — live indexing status.

→sync

Pull GSC into the local store (default 7 days; --full = 450).

→query

Typed query over the local store; --live for fresh API.

→mcp

Run the local MCP server over stdio.
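The sync above is idempotent: re-running it only fetches what is missing. A minimal sketch of how a date-partitioned Parquet layout makes that cheap — the directory layout and file naming here are illustrative assumptions, not gscdump's actual on-disk format:

```python
from datetime import date, timedelta
from pathlib import Path

def missing_partitions(store: Path, site: str, days: int) -> list[date]:
    """Return the dates in the trailing window with no local partition.

    A repeated sync diffs the wanted window against what is already on
    disk and fetches only the gap, which is what makes it idempotent.
    """
    today = date.today()
    wanted = [today - timedelta(days=i) for i in range(1, days + 1)]
    have = {p.stem for p in (store / site).glob("*.parquet")}
    return [d for d in wanted if d.isoformat() not in have]
```

One partition per day; already-synced days are skipped, so the call is safe to repeat.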

4.2

## ANALYZE

analyst · 21 tools

Run any of 21 built-in analyzers against the local store. Three families: core SEO (striking-distance, cannibalization, movers, decay…), statistical (CTR anomalies, change-point, STL decompose, survival, Bayesian CTR), and structural (long-tail, intent-atlas, clustering, concentration, query migration).

→striking-distance

Queries ranked positions 4–20 with impression weight.

→cannibalization

URLs competing for the same query intent.

→movers

Largest position deltas, weighted by traffic.

→decay

Keywords losing ground over the trailing window.

→ctr-anomaly

Outlier CTRs vs the rolling baseline.

→change-point

Statistical break-points in time series.

→intent-atlas

Map queries to intent clusters.

→…

14 more — see analyze --help for the full list.

Open-source CLI; `@gscdump/cli` on npm. Analyzers ship in `@gscdump/analysis`. Storage engine is `@gscdump/engine`. [See github(1)](https://github.com/harlan-zw/gscdump).
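To make the first family concrete, here is a hedged sketch of what a striking-distance filter computes — queries ranked just off page one (positions 4–20), surfaced by impression volume. The row shape is an assumption for illustration; the real analyzer ships in `@gscdump/analysis`:

```python
def striking_distance(rows, lo=4.0, hi=20.0):
    """Keep queries whose average position sits just off page one (4-20),
    then sort by impressions so high-volume opportunities surface first."""
    hits = [r for r in rows if lo <= r["position"] <= hi]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)

rows = [
    {"query": "nuxt seo meta tags", "position": 8.7, "impressions": 2012},
    {"query": "nuxt config",        "position": 2.1, "impressions": 9000},
    {"query": "nuxt content auth",  "position": 9.2, "impressions": 4128},
]
```

The page-one query is excluded (nothing to win there); the remainder is ordered by how much traffic a small rank improvement could unlock.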

5.0/EXAMPLES

§ 5.1 · sync, query, analyze

### Pull GSC into a local store, then dig.

SHELL · gscdump sync + analyze · session

# default sync — last 7 days, idempotent
$ gscdump sync --site https://example.com
▶ syncing 7 days · 3 tables
  pages      ━━━━━━━━━━━━  12,840 rows
  keywords   ━━━━━━━━━━━━   3,217 rows
  sitemaps   ━━━━━━━━━━━━      42 rows
✓ done · 1.2 MB on disk · 2.1s

# run an analyzer over the local store
$ gscdump analyze striking-distance --site https://example.com

  KEYWORD                          POS    IMPR     CTR
  ─────────────────────────────    ────   ──────   ─────
  nuxt seo meta tags                8.7   2,012   1.10%
  nuxt content auth                 9.2   4,128   0.40%
  nuxt sitemap xml                 11.4   1,847   0.22%
  vue 3 composition api            12.1   3,605   0.31%
  …                                                  18 more

local · duckdb + parquet · 21 analyzers

Sync is idempotent; `--full` backfills 450 days, and `--start`/`--end` set a custom window. Pagination walks past GSC's 25k-row cap automatically.
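The Search Analytics API caps each response at 25,000 rows, so walking past the cap means re-issuing the same query with an advancing start row. A minimal sketch of that loop with the API call stubbed out (the `fetch_page` signature is an assumption, not gscdump's internal API):

```python
def fetch_all(fetch_page, page_size=25_000):
    """Drain a row-capped API by advancing the start row until a short page.

    `fetch_page(start_row, row_limit)` returns one page of rows; a page
    shorter than `row_limit` signals there is no more data to pull.
    """
    rows, start = [], 0
    while True:
        page = fetch_page(start, page_size)
        rows.extend(page)
        if len(page) < page_size:  # short page => end of result set
            return rows
        start += page_size
```

The same loop works for any endpoint that exposes an offset-style cursor.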

§ 5.2 · expose to an agent

### One JSON block, the agent sees the store.

FILE  ~/.config/claude/claude_desktop_config.json

{
  "mcpServers": {
    "gscdump": {
      "command": "npx",
      "args": ["@gscdump/cli", "mcp"]
    }
  }
}

stdio transport · local mcp · 7 lines

- ▸"What pages lost traffic this week?"
- ▸"Find keywords in striking distance (position 4–20)."
- ▸"Which queries have cannibalization issues?"
- ▸"Compare this month vs last month for /blog/ pages."

6.0/COMPATIBILITY

Reading GSC from an agent is now table stakes. Open-source MCPs are fragmented across a dozen self-host packages; SEO Gets is Claude-only and dashboard-led. gscdump is the hosted primitive built for any agent.

TABLE · feature comparison4 vendors

| feature | gscdump | oss mcps | seo gets | gsc ui |
| --- | --- | --- | --- | --- |
| 16mo stored history, no row cap | ■ | · | ■ | · |
| Hosted MCP for any agent | ■ | self | claude | · |
| Pre-rolled deep analyses | ■ | · | partial | · |
| URL Inspection at scale | ■ | partial | · | manual |
| Hosted, no install | ■ | · | ■ | ■ |
| Free tier | ■ | ■ | · | ■ |

■ supported · · absent · partial / self / claude / manual · fig. 6.1

7.0/SEE ALSO

<dl>

<dt>mcp(1)</dt>
<dd>[Install guide for Claude, Cursor, Codex, Windsurf](https://gscdump.com/mcp)</dd>

<dt>pricing(7)</dt>
<dd>[Free tier & credit packs](https://gscdump.com/pricing)</dd>

<dt>tools(7)</dt>
<dd>[Hosted analyses you can run now](https://gscdump.com/tools)</dd>

<dt>learn(7)</dt>
<dd>[A primer on Google Search Console](https://gscdump.com/learn-google-search-console)</dd>

<dt>github(1)</dt>
<dd>[Source · MIT · open issues](https://github.com/harlan-zw/gscdump)</dd>

</dl>

FIG. 7.1 · read flow · schematic

  ┌─────────────┐
  │  AI agent   │
  │  (claude…)  │
  └──────┬──────┘
         │ MCP
         ▼
  ┌─────────────┐
  │  gscdump    │
  │  /mcp       │  ◀── api key
  └──────┬──────┘
         │ oauth
         ▼
  ┌─────────────┐
  │ Search      │
  │ Console API │
  └─────────────┘

read mode · 1 hop · ~120ms p50

Free for any agent. Bring your own OAuth.

Sign in with Google, drop the key into your MCP config, query sixteen months of Search Console from the tool you already use.

[→ Get an API key](https://gscdump.com/auth/google) [MCP install guide](https://gscdump.com/mcp)