KEY TAKEAWAYS
  • Claude SEO v1.7 integrates 10 Google APIs with 21 commands across 4 credential tiers
  • All APIs operate within free-tier quotas - no billing account required
  • Generates professional PDF reports with charts, tables, and recommendations
  • Replaces $29-$139/month SaaS subscriptions with direct Google data access

Ahrefs starts at $29/month. Semrush charges $117/month. SE Ranking costs $52/month. Yet the underlying performance data they display comes from the same source: Google's own APIs. When I built Claude SEO v1.7, the goal was to cut out the middleman entirely - pulling data directly from 10 Google APIs, generating PDF reports, and costing exactly $0. The 11 Python scripts totaling 7,000 lines of production code passed a security review at 95/100 using the Claude SEO audit framework itself.

Why Google's own APIs beat paid SEO tools

Google provides free API access to the same data that powers Search Console, PageSpeed Insights, and Chrome User Experience Report. According to Google's PageSpeed documentation, the PSI API allows 25,000 queries per day at no cost. The CrUX API provides real field data from Chrome users with a 150 queries-per-minute rate limit. When you use Ahrefs or Semrush for Core Web Vitals analysis, you are paying a subscription to view data that Google gives away for free.
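To show how little plumbing a direct call takes, here is a minimal sketch using only the Python standard library. The endpoint is from Google's PageSpeed Insights reference; the helper names are illustrative, not the skill's actual code:

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build the PageSpeed Insights v5 request URL for a page and strategy."""
    query = urllib.parse.urlencode(
        {"url": url, "key": api_key, "strategy": strategy}  # "mobile" or "desktop"
    )
    return f"{PSI_ENDPOINT}?{query}"

def fetch_pagespeed(url: str, api_key: str, strategy: str = "mobile") -> dict:
    """Call the API and return the parsed JSON (Lighthouse scores + field data)."""
    with urllib.request.urlopen(build_psi_request(url, api_key, strategy)) as resp:
        return json.load(resp)
```

No OAuth, no client library - an API key in the query string is all the free tier requires.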

[Chart: monthly cost comparison, cheapest plan per tool. Sources: ahrefs.com/pricing, semrush.com/prices, seranking.com/pricing, moz.com/products/pro]

Tool | Monthly cost | Data source | Source
Semrush Pro | $117.33/mo (annual) | Google APIs + proprietary crawl | semrush.com/prices
SE Ranking Essential | $52/mo (annual) | Google APIs + proprietary data | seranking.com/pricing
Moz Pro Starter | $39/mo (annual) | Google APIs + Mozscape index | moz.com/pricing
Ahrefs Starter | $29/mo (annual) | Google APIs + proprietary crawl | ahrefs.com/pricing
Claude SEO | $0 forever | Direct Google APIs | MIT license
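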

The difference is not just cost. Paid tools batch and cache API data, so the numbers you see can be days old. Claude SEO calls the APIs in real time, so your PageSpeed scores and CrUX field data are always current. The 11 scripts handle authentication, rate limiting with exponential backoff, and structured JSON output for downstream processing.
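A backoff loop along these lines is all the rate-limit handling requires. This is a sketch of the pattern, not the skill's actual implementation, and `ApiError` is a stand-in for whatever exception type the scripts raise:

```python
import random
import time

class ApiError(Exception):
    """Stand-in for an HTTP error raised by an API wrapper."""
    def __init__(self, status: int, message: str = ""):
        super().__init__(message or f"HTTP {status}")
        self.status = status

def with_backoff(call, max_retries=5, base_delay=1.0, retryable=(429, 500, 503)):
    """Retry `call` with exponential backoff plus jitter on quota/server errors."""
    for attempt in range(max_retries):
        try:
            return call()
        except ApiError as err:
            # Re-raise immediately for non-retryable codes or on the final attempt.
            if err.status not in retryable or attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

A 429 from PSI or CrUX then costs a short pause instead of a failed report.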

What 10 Google APIs does Claude SEO use?

Version 1.7 integrates 10 distinct Google APIs into a single Claude Code skill with 21 commands. In testing, the combined output from PageSpeed Insights, CrUX, and Search Console provided the same Core Web Vitals data that Ahrefs and Semrush display in their dashboards. Each API serves a specific role in the SEO analysis pipeline, from real-time Lighthouse scoring to natural language content classification.

API | What it provides | Free quota
PageSpeed Insights v5 | Lighthouse scores, lab metrics (LCP, FCP, SI, TBT, CLS) | 25,000/day
Chrome UX Report (CrUX) | Real field data from Chrome users (p75 percentiles) | 150/min
CrUX History | 25-week Core Web Vitals trends with direction | 150/min
Search Console | Clicks, impressions, CTR, position, sitemaps | 1,200/min
URL Inspection | Index status, coverage, canonical, rich results | 2,000/day/site
Indexing API v3 | Submit URLs for crawling (update/delete) | 200/day
GA4 Data API | Organic traffic, sessions, landing pages | ~25K tokens/day
YouTube Data v3 | Video search, metrics, top comments | 10,000 units/day
Natural Language | Entity extraction, sentiment, classification | 5,000/month
Knowledge Graph | Brand/entity verification in Google's graph | 100,000/day

Full quota documentation is available in the Google Search Console API usage limits and PageSpeed Insights API reference. For GA4 token budgeting, see the Google Analytics Data API documentation.

Official Google documentation for each API:
PageSpeed Insights API v5 | Chrome UX Report API | CrUX History API | Search Console API | Indexing API v3 | GA4 Data API | YouTube Data API v3 | Natural Language API | Knowledge Graph Search API
[Chart: API coverage by SEO category. Performance: PSI + CrUX + CrUX History; Indexation: URL Inspection + Indexing API; Rankings: GSC Search Analytics; Traffic: GA4 Data API; Content: NLP + Knowledge Graph; Keywords: Keyword Planner (Tier 3)]

The 4-tier credential system

Not every user needs all 10 APIs. The tier system lets you start with just an API key and progressively unlock more capabilities as needed. Each tier takes 2-15 minutes to set up and is strictly additive - a Tier 2 user gets all Tier 0 and Tier 1 commands automatically.

Tier 0 - API key only (2 min setup): PSI, CrUX, CrUX History, YouTube, Knowledge Graph, NLP, Web Risk
Tier 1 - + service account (5 min setup): adds Search Console, URL Inspection, Sitemaps, Indexing API
Tier 2 - + GA4 property (10 min setup): adds GA4 organic traffic, top landing pages
Tier 3 - + Ads credentials (15 min setup): adds Keyword Planner ideas, search volume data

[Chart: cumulative API access per tier. Tier 0: 7 APIs; Tier 1: 11 APIs; Tier 2: 12 APIs; Tier 3: all APIs. Each tier is additive - Tier 2 includes all Tier 0 and Tier 1 commands.]

How do you set up Google API access?

Start with Tier 0 - it takes 2 minutes and gives you Core Web Vitals analysis immediately. When I set up a fresh Google Cloud project for testing, the entire process from creating the project to running the first PageSpeed check took under 3 minutes. No billing account, no credit card, no OAuth complexity at this level.

Step 1: Create a Google Cloud project

Go to console.cloud.google.com and create a new project. Name it something like "claude-seo". No billing account is needed for the free tier.

Step 2: Enable APIs and get your key

In the API Library, enable "PageSpeed Insights API" and "Chrome UX Report API". Then go to Credentials and create an API key. Save it somewhere safe.

Step 3: Run your first check

/seo google pagespeed https://yoursite.com

This returns Lighthouse performance scores, Core Web Vitals (LCP under 2.5s, INP under 200ms, CLS under 0.1), and field data from real Chrome users when available. The command combines data from both the PageSpeed Insights API and CrUX API in a single call.
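Interpreting those numbers is a simple thresholding exercise. The cutoffs below are Google's published Core Web Vitals bands; the function itself is an illustrative sketch, not the command's internal code:

```python
# Google's Core Web Vitals thresholds: (good, needs-improvement) upper bounds.
CWV_THRESHOLDS = {
    "lcp_ms": (2500, 4000),   # Largest Contentful Paint, milliseconds
    "inp_ms": (200, 500),     # Interaction to Next Paint, milliseconds
    "cls": (0.10, 0.25),      # Cumulative Layout Shift, unitless
}

def rate_metric(name: str, value: float) -> str:
    """Bucket a p75 field value into good / needs improvement / poor."""
    good, poor = CWV_THRESHOLDS[name]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```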

Step 4: Check your credential tier

/seo google quotas

This shows which tier you are on, which APIs are accessible, and current rate limit usage. To upgrade to Tier 1, create a service account in Google Cloud and grant it access to your Search Console property. Full setup instructions are in the Claude SEO GitHub repository.
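Tier detection can be as simple as checking which credential files are present. The file names below are hypothetical - the skill's actual credential layout may differ:

```python
from pathlib import Path

def detect_tier(config_dir: str) -> int:
    """Infer the credential tier from which credential files exist.
    Returns -1 if no credentials are found. File names are illustrative."""
    d = Path(config_dir)
    tier = -1
    if (d / "api_key.txt").exists():            # Tier 0: bare API key
        tier = 0
    if tier == 0 and (d / "service_account.json").exists():  # Tier 1
        tier = 1
    if tier == 1 and (d / "ga4_property.txt").exists():      # Tier 2
        tier = 2
    if tier == 2 and (d / "ads_credentials.json").exists():  # Tier 3
        tier = 3
    return tier
```

Because each tier strictly requires the one below it, a missing lower-tier file caps the result even if higher-tier credentials exist.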

Step 5: Explore 25-week trends

/seo google crux-history https://yoursite.com

This pulls 25 weeks of Core Web Vitals history from the CrUX History API, showing trend direction and percentage changes for every metric. You can see whether your LCP is improving or degrading over time without any third-party dashboard.
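Deriving a trend direction from the weekly series is straightforward. This sketch assumes the p75 values have already been extracted from the CrUX History response, and uses a first-versus-last comparison with a 5% tolerance band, which may differ from the skill's actual method:

```python
def classify_trend(p75s, tolerance_pct=5.0):
    """Classify a weekly p75 series (lower is better for every CWV metric).
    Returns (direction, percent change from first week to last)."""
    series = [float(v) for v in p75s if v is not None]  # weeks with no data are None
    if len(series) < 2:
        return "insufficient data", 0.0
    change = (series[-1] - series[0]) / series[0] * 100
    if abs(change) < tolerance_pct:
        return "flat", change
    return ("degrading" if change > 0 else "improving"), change
```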

How does the PDF report generator work?

Raw API data is useful but not presentable to clients or stakeholders. Claude SEO includes a 2,273-line PDF report generator built with WeasyPrint and matplotlib that turns API output into professional A4 documents. In testing across 15 client sites, the full combined report averaged 8-12 pages with embedded gauge charts, trend timelines, and prioritized recommendations that matched the quality of reports from paid tools.

/seo google report --type full

Four report types are available:

  • CWV audit - Gauge charts for each Core Web Vital, trend timelines, histogram distributions
  • GSC performance - Query tables, CTR analysis, quick-win keyword opportunities
  • Indexation status - Coverage donut chart, URL status breakdown, errors and warnings
  • Full combined - Title page, table of contents, executive summary, all data sections

Reports are generated using WeasyPrint and matplotlib at 200 DPI. The built-in brand design system uses a navy and cream color palette with forest green for pass states and deep red for failures. Every report goes through an automatic quality review that checks for empty images, thin sections, and duplicate content before presenting.
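The pipeline boils down to assembling styled HTML and handing it to WeasyPrint. The sketch below shows the HTML-assembly half; the hex colors approximate the navy/cream palette and are not the skill's exact brand values:

```python
def render_report_html(site: str, sections: list[tuple[str, str]]) -> str:
    """Assemble the HTML document that would be handed to WeasyPrint
    for A4 PDF rendering. `sections` pairs a heading with an HTML body."""
    css = (
        "@page { size: A4; margin: 2cm; }"
        "body { font-family: sans-serif; color: #1b2a4a; background: #faf6ec; }"
        ".pass { color: #1e5c3a; } .fail { color: #7a1f1f; }"
    )
    body = "".join(
        f"<section><h2>{title}</h2>{content}</section>"
        for title, content in sections
    )
    return (
        f"<html><head><style>{css}</style></head>"
        f"<body><h1>SEO Report: {site}</h1>{body}</body></html>"
    )
```

WeasyPrint would then render it with `HTML(string=html).write_pdf("report.pdf")`; matplotlib charts are embedded as pre-rendered images in the section bodies.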

Security built in from the start

Every script that accepts URLs validates them against SSRF attacks before making API calls. Private IP ranges (10.x, 172.16-31.x, 192.168.x), loopback addresses, and GCP metadata endpoints are all blocked. OAuth token files never contain client secrets - secrets are read from client_secret.json at runtime only. The .gitignore is hardened with 8 credential patterns to prevent accidental commits of API keys or service account files.
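A URL guard of this shape, built on the standard library's `ipaddress` module, covers the ranges listed above. It is a sketch of the approach, not the repository's exact validator:

```python
import ipaddress
import socket
from urllib.parse import urlparse

BLOCKED_HOSTS = {"metadata.google.internal"}  # GCP metadata endpoint

def is_safe_url(url: str) -> bool:
    """Reject URLs that resolve to private, loopback, link-local,
    or reserved addresses before any API call is made."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    if parsed.hostname in BLOCKED_HOSTS:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False  # unresolvable hosts are rejected, not retried
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    return True
```

Resolving the hostname first means an attacker cannot smuggle a private address behind a public-looking DNS name.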

The full codebase passed a security audit at 95/100. All 11 scripts use relative path resolution, with no hardcoded user paths anywhere in the code.