The full leaderboard

Scores below come from Google's PageSpeed Insights API in mobile mode, which runs the same Lighthouse engine under the hood. Three runs per URL, median reported. Color tiers follow Google's standard: green for 90 and above, amber for 50 to 89, coral for 0 to 49.
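
The pipeline above can be sketched in a few lines. This is a minimal illustration, assuming the public PageSpeed Insights API v5 endpoint; `psi_request_url`, `performance_score`, and `median_score` are hypothetical helper names, not part of any library.

```python
# Sketch of the scoring pipeline: build a PSI v5 request, pull the
# Performance score from the response body, take the median of runs.
import statistics
import urllib.parse

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the API request URL for one Lighthouse run."""
    query = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

def performance_score(psi_json: dict) -> int:
    """Pull the 0-100 Performance score out of a v5 response body."""
    raw = psi_json["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)  # the API reports the score as 0.0-1.0

def median_score(scores: list[int]) -> int:
    """Median across back-to-back runs, as reported in the table."""
    return round(statistics.median(scores))
```

Fetching `psi_request_url(...)` several times per URL, scoring each JSON body, and reporting `median_score` reproduces the table's numbers, modulo PSI's normal run-to-run variance.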

Click VERIFY on any row to open Google PageSpeed Insights with that tool's URL pre-filled. Don't trust these numbers; check them yourself.

// CHECK YOUR OWN SITE

Curious how your homepage compares? Drop the URL below and we'll deep-link you straight into Google PageSpeed Insights. No tracking, no signup. The form just builds the URL and opens it in a new tab.
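
Building that deep link is a one-liner. A minimal sketch, assuming pagespeed.web.dev accepts a `?url=` query parameter on its `/analysis` path (the exact deep-link scheme is an assumption); `psi_deep_link` is an illustrative name.

```python
import urllib.parse

def psi_deep_link(raw: str) -> str:
    """Turn whatever the user typed into a PageSpeed Insights deep link.
    The /analysis?url= format is an assumption about pagespeed.web.dev."""
    page = raw.strip()
    if not page.startswith(("http://", "https://")):
        page = "https://" + page  # mirror the form's https:// prefix
    return "https://pagespeed.web.dev/analysis?url=" + urllib.parse.quote(page, safe="")
```

Because the form only builds a URL and opens it, there is genuinely nothing to track: no request ever touches this site's servers.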


Powered by pagespeed.web.dev - Google's official Lighthouse front end.

#  Tool                            Mobile  Desktop  LCP    TBT     Verify
1  Claude SEO (open source)          99      99     1.7s      0ms  VERIFY
2  MarketMuse (content strategy)     98      98     1.9s      0ms  VERIFY
3  Semrush (SEO suite)               53      68     4.8s   1116ms  VERIFY
4  Frase (AI content briefs)         50      67     6.7s   1778ms  VERIFY
5  RankIQ (blogger SEO)              39      57    17.7s    925ms  VERIFY
6  DataForSEO (API-first)            34      52    17.7s   1051ms  VERIFY
7  Surfer SEO (SERP optimization)    31      60    18.7s   1326ms  VERIFY
8  Ahrefs (backlink analysis)        26      56    17.5s   2880ms  VERIFY

Last run: - methodology - CC-BY-4.0

Methodology

Every URL is tested with the same engine, same throttling, same device profile. The methodology stays constant across quarterly refreshes so historical comparisons remain valid.

Test engine: Google PageSpeed Insights API v5
Device profile: Moto G Power, mobile
Network: Simulated slow 4G
Runs per URL: 3 runs, median reported
Refresh cadence: Quarterly
License: CC-BY-4.0, raw data on request

The full URL list and raw data live in the Claude SEO GitHub repo. Republish, quote, or rebuild this report freely - attribution required.

What a slow site actually costs

Speed isn't a feature; it's a revenue lever. Google's own research and the Web Almanac's annual analysis converge on the same conclusion: every additional second on mobile loses customers.

  • Bounce probability rises 32% when mobile load time goes from 1 to 3 seconds, and 90% from 1 to 5 seconds (Think with Google).
  • Pages meeting Core Web Vitals thresholds show a 24% lower abandonment rate during loading (web.dev).
  • Conversion rate drops by an average of 4.42% with each additional second of load time between 0 and 5 seconds (Portent industry benchmark study).

For a SaaS pricing page, that's the difference between a trial signup and a closed tab. For a publisher, it's the difference between a subscriber and a bounce. Half the tools ranked here are leaving money on the table before the visitor even reads the headline.

Largest Contentful Paint - the "is this page even loading" metric

LCP measures how long the largest above-the-fold element takes to render. Google's threshold for "good": 2.5 seconds or less. "Poor" starts at 4 seconds.

In this dataset only 2 of 8 tools hit the good threshold. The bottom half of the table sits at 17-19 seconds, roughly seven times Google's good cut-off. The culprit on those pages is almost always the same: a large hero video, an unoptimized hero image, or a render-blocking third-party script (chat widgets, A/B tests, analytics) competing for the network.

The fix is pedestrian. Resize hero images to actual display dimensions. Serve them as WebP. Add fetchpriority="high". Defer everything that isn't above-the-fold. None of this requires a rewrite. It requires someone running a Lighthouse test and acting on the four "Opportunities" it reports.
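
The fixes above fit in a handful of attributes. A hedged sketch: the file names and dimensions are placeholders, but the attributes are standard HTML.

```html
<!-- Hero image: sized to its display dimensions, served as WebP,
     fetched at high priority so it paints first. -->
<img src="/assets/hero.webp" width="1200" height="630"
     alt="Product screenshot" fetchpriority="high" decoding="async">

<!-- Below-the-fold images wait until they approach the viewport. -->
<img src="/assets/feature.webp" width="600" height="400"
     alt="Feature detail" loading="lazy">

<!-- Third-party scripts defer so they stop blocking first paint;
     the widget URL is a placeholder. -->
<script src="https://example.com/widget.js" defer></script>
```

Explicit `width` and `height` also reserve layout space, which protects Cumulative Layout Shift as a side effect.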

Total Blocking Time - the tap-and-wait metric

TBT measures how long the main thread is locked up parsing JavaScript while the user taps and gets no response. Google's threshold for "good": 200ms or less. The bottom of this leaderboard sits at over 1 second of pure tap-and-wait.

TBT correlates almost perfectly with how many marketing pixels a site loads on first paint. Every analytics SDK, every retargeting tag, every chat widget adds tens to hundreds of milliseconds. Most of them could load lazily after the user has scrolled or interacted - but they don't, because the marketing team set them up and the engineering team didn't push back.
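
Deferring those tags until first interaction is a few lines of inline script. A sketch of the pattern, not any vendor's loader; the pixel URL is a placeholder.

```html
<script>
  // Load a marketing pixel only after the first scroll, tap, or
  // keypress, so it never competes with first paint for the main thread.
  let loaded = false;
  const loadPixel = () => {
    if (loaded) return;  // guard: three listeners, one load
    loaded = true;
    const s = document.createElement("script");
    s.src = "https://example.com/pixel.js";  // placeholder URL
    document.head.appendChild(s);
  };
  ["scroll", "pointerdown", "keydown"].forEach((evt) =>
    addEventListener(evt, loadPixel, { once: true, passive: true })
  );
</script>
```

The trade-off is losing attribution for visitors who bounce without interacting, which is usually exactly the traffic those pixels were mismeasuring anyway.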

Mobile is where Google actually ranks

Some of the worst-scoring homepages here look fine on desktop. That's the trap. Google has been mobile-first since 2020 and went mobile-only for indexing in 2024. The desktop column on this leaderboard is informational. Only mobile counts for ranking.

This is why the report ranks by mobile and lists desktop as a secondary column. A tool with 96 desktop and 47 mobile is failing the version of the page Google indexes.

Why claude-seo.md ranks #1

We hit a stable median of 99 across three back-to-back PSI runs. MarketMuse hit a stable 98. The other six tools all dropped below 60. The 1-point lead at the top isn't the headline. The 40+ point cliff to the rest of the field is. Here's exactly what this site does. None of it requires a framework upgrade or a CDN migration.

  1. All CSS and JavaScript inlined. Zero external bundles. Zero render-blocking <link> tags beyond two web fonts loaded with display=optional. The browser has everything it needs to paint after the HTML arrives.
  2. WebP for every image. Hero images use fetchpriority="high" and decoding="async". Below-fold images use loading="lazy".
  3. Cache-Control: max-age=31536000, immutable on the /assets/** directory via vercel.json. Returning visitors get every asset from disk cache.
  4. No analytics scripts. No chat widgets. No marketing automation. No A/B testing SDK. The page that's shown to a visitor is the page the developer wrote, not the page after the marketing team's fifteen pixels finish booting.
  5. System fonts as the fallback path. If Google Fonts is slow, the browser renders Space Grotesk's system fallback in zero milliseconds. No FOIT, no FOUT, no waiting.
  6. No JavaScript framework. Single-file static HTML. The entire site builds in zero seconds because there's nothing to build. Every byte is intentional.
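
Item 3 above is a short config block. A sketch of the shape Vercel's `headers` syntax takes in `vercel.json`; the `/assets/(.*)` pattern is an assumption standing in for the `/assets/**` directory mentioned above.

```json
{
  "headers": [
    {
      "source": "/assets/(.*)",
      "headers": [
        {
          "key": "Cache-Control",
          "value": "public, max-age=31536000, immutable"
        }
      ]
    }
  ]
}
```

`immutable` tells the browser not to revalidate at all within the year-long `max-age`, so hashed asset file names are effectively required for this to be safe.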

The full source for this site is on GitHub under MIT license. Copy it. Strip out the SEO content. Put your own product on top. The skeleton is the point.

Daniel Agrici
AI Automation Specialist - Chisinau, Moldova
Creator of Claude SEO and 20+ open-source repos with 4,800+ combined GitHub stars. GovTech Hackathon Moldova 1st place. Builds developer tools for AI-native SEO workflows. Find him on YouTube and LinkedIn.

Frequently asked questions

How are the scores measured?
Each homepage is tested with Google Lighthouse via the PageSpeed Insights API in mobile mode. The Performance score is a weighted aggregate of Largest Contentful Paint, Total Blocking Time, Cumulative Layout Shift, First Contentful Paint, and Speed Index. We run each URL three times back-to-back and report the median to reduce PSI variance, which is the standard methodology recommended by web performance researchers. Color tiers follow Google's standard: green for 90 and above, amber for 50 to 89, coral for 0 to 49.

Why rank by mobile instead of desktop?
Google has used mobile-first indexing as the primary ranking signal since 2024. Many of these tools score well on desktop but collapse on mobile, where their actual customers and Google's crawler land. Desktop scores are included as a secondary column for context only.

How often does the leaderboard refresh?
Quarterly. Each refresh re-runs Lighthouse against every URL, captures new median scores, and updates the publish date. The methodology, tool, and color thresholds stay constant across refreshes.

Can I get my tool added?
Yes. Open an issue on the Claude SEO GitHub repo with your tool's homepage URL. We add tools that fit the AI SEO category and have a public marketing page. We do not exclude or favor any tool by score - the methodology is identical for every URL.

Why do established SEO tools score so poorly?
Marketing homepages often carry tracking scripts, hero videos, third-party widgets, and large feature animations that hurt Lighthouse scores. The score reflects only the homepage, not the product itself. A slow homepage does not mean a slow product, but it does mean visiting customers and Google's mobile crawler see a slow page first.

How do the top two sites score in the high 90s?
For Claude SEO, all CSS and JavaScript are inlined into each HTML file - zero external bundles, zero render-blocking resources beyond two fonts. Images use WebP. Fonts load with display=optional so the page never waits on the network. Cache-Control: immutable on the assets directory. No analytics scripts, no chat widgets, no marketing automation. The full source is published on GitHub for anyone to copy. MarketMuse achieves a similar score with a different stack: aggressive Cloudflare caching, lean hero, restrained third-party scripts. Two paths, same destination - prioritize the critical render path.