‘Google Zero’ misses the real problem: Your next visitor isn’t human


Barry Adams recently published “Google Zero is a Lie” in his SEO for Google News newsletter, arguing that the narrative of Google traffic disappearing is false and dangerous.

His data backs it up. Similarweb and Graphite data show only a 2.5% decline in Google traffic to top websites globally. Google still accounts for nearly 20% of all web visits.

The widely cited Chartbeat figure showing a 33% decline? It’s skewed by a handful of large publishers hit by algorithm updates. Publishers who abandon SEO in response to the panic create a self-fulfilling prophecy, ceding traffic to competitors who keep optimizing.

He’s right. And he’s looking at the wrong problem.

Humans are still clicking Google results. What has changed is that a growing share of your visitors isn’t human at all.

The tipping point already happened

Automated traffic surpassed human activity for the first time in a decade, per the 2025 Imperva Bad Bot Report. Bots now account for 51% of all web traffic. Not “soon.” Not “by 2027.” Now.

That number includes everything from scrapers to brute-force login bots. But the fastest-growing segment is AI crawlers.

AI crawlers now represent 51.69% of all crawler traffic, surpassing traditional search engine crawlers at 34.46%, Cloudflare’s 2025 Year in Review found. AI bot crawling grew more than 15x year over year. Cloudflare observed roughly 50 billion AI crawler requests per day by late 2025.

Akamai’s data tells a similar story: AI bot activity surged 300% over the past year, with OpenAI alone accounting for 42.4% of all AI bot requests.

[Chart 1: Bot vs. human traffic]

So while Adams is correct that human Google traffic hasn’t collapsed, something else is happening on the other side of the server logs.

The take-versus-give ratio

Cloudflare published crawl-to-referral ratios for AI bots. Look at these numbers.

Anthropic’s ClaudeBot crawls 23,951 pages for every single referral it sends back to a website. OpenAI’s GPTBot: 1,276 to 1. Training now drives nearly 80% of all AI bot activity, up from 72% the year before.

[Chart: Crawl-to-referral ratios]

Compare that to traditional Googlebot, which has always operated on a crawl-and-send-traffic-back model. Google crawls your site, indexes it, and sends 831x more visitors than AI systems. The deal was simple: let me read your content, and I’ll send you people who want it.

That deal is fraying even on Google’s own turf. Queries where Google shows an AI Overview see 58-61% lower organic click-through rates, according to Ahrefs and Seer Interactive studies covering millions of impressions through late 2025.

Google’s newer AI Mode is worse. Semrush data shows a 93% zero-click rate in those sessions. AI Overviews now trigger on roughly 25-48% of U.S. searches, depending on the dataset, and that number keeps climbing.

And when Google’s AI features do cite sources, they’re increasingly citing themselves. Google.com is the No. 1 cited source in 19 of 20 niches, accounting for 17.42% of all citations, an SE Ranking study of over 1.3 million AI Mode citations found. That tripled from 5.7% in June 2025. Add YouTube and other Google properties, and they make up roughly 20% of all AI Mode sources.

So the old deal is being rewritten even by Google. AI crawlers from other companies skip the pretense entirely: let me read your content so I can answer questions about it without ever sending anyone your way.

The agentic shift

The bot traffic numbers are already here. The next wave is bigger: AI agents acting on behalf of humans.

In 2024, Gartner predicted that traditional search engine traffic would drop 25% by 2026 as AI chatbots and agents handle queries. That prediction is on track. Gartner’s October 2025 strategic predictions go further: 90% of B2B buying will be AI-agent intermediated by 2028, pushing over $15 trillion in B2B spend through AI agent exchanges.

This isn’t theoretical.

  • Salesforce reported that AI agents influenced 20% of all global orders during Cyber Week 2025, driving $67 billion in sales.
  • Retailers with AI agents saw 13% sales growth compared to 2% for those without.
  • Google is building for this with initiatives like the Universal Commerce Protocol for agent-led shopping.

Gartner says 40% of enterprise applications will have task-specific AI agents by the end of 2026, up from less than 5% in 2025. eMarketer projects AI platforms will drive $20.9 billion in retail spending in 2026, nearly 4x 2025 figures.

[Chart: Agentic commerce trajectory]

Think about what that looks like in practice. An AI agent researches vendors for a procurement team. It doesn’t see your hero banner. It doesn’t notice your trust badges. It reads your structured data, compares your specs to those of three competitors, and builds a shortlist.

That “visit” might show up in your analytics as a bot hit with a zero-second session duration. Or it might not show up at all.

What agentic SEO actually looks like

So what do you optimize for when the visitor is a machine making decisions for a human?

It’s not the same as traditional SEO. And it’s not the same as the AI Overviews optimization most people are focused on right now. AI Overviews are still Google. Still one search engine, still largely the same ranking infrastructure, still (mostly) one answer format.

Agentic SEO is about being useful to software that’s pulling from search APIs, crawling directly, and using LLM reasoning to make recommendations. That software doesn’t care about your page layout. It cares about whether it can extract what it needs.

I think a few things start to matter a lot more.

Structured data becomes load-bearing

Schema markup has always been a “nice to have” for rich snippets. But when an AI agent compares your product to three competitors, structured data is what lets it read your specs without guessing. Think product schema, FAQ schema, and pricing tables in clean HTML. These go from SEO hygiene to core infrastructure.
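
As a minimal sketch, here is what a Product snippet in JSON-LD can look like (the product name, price, and URLs are placeholders, not a real listing):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example CRM Pro",
  "description": "CRM for small teams with QuickBooks integration and an offline-capable mobile app.",
  "brand": { "@type": "Brand", "name": "ExampleCo" },
  "offers": {
    "@type": "Offer",
    "price": "49.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/crm-pro"
  }
}
</script>
```

An agent comparing offers doesn’t have to infer your price from a styled pricing table; it reads it as a field.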

Dig deeper: How schema markup fits into AI search — without the hype

Content needs to answer compound questions

AI agents don’t search for “best CRM for small business.” They ask compound questions: “Which CRM under $50/user/month integrates with QuickBooks and has a mobile app with offline capability?” If your content only answers the first version, you’re invisible to the second.
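
One way to cover the compound version, sketched here with an illustrative question and answer, is to state it verbatim on the page and mirror it in FAQ markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Which CRM under $50/user/month integrates with QuickBooks and works offline on mobile?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Example CRM Pro costs $49/user/month, syncs with QuickBooks Online, and its mobile app supports offline editing."
    }
  }]
}
</script>
```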

Freshness and accuracy get audited differently

A human might not notice your pricing page is 8 months stale. An AI agent cross-referencing your pricing against competitors will flag the discrepancy. Or worse, use the outdated number in its recommendation and cost you the deal.

Your robots.txt policy is now a business decision

Blocking AI crawlers feels protective, but it means AI agents can’t recommend you. Allowing them means your content trains models that may never send you traffic. There’s no clean answer.

But pretending it’s just a technical setting is a mistake. New IETF standards are emerging to give publishers more granular control, but they’re not widely adopted yet.
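
If you do split the difference, here is a sketch of what that can look like, assuming you want to block training-focused crawlers while staying visible to search- and answer-oriented ones (verify each vendor’s current user-agent tokens before shipping anything like this):

```
# Block crawlers used primarily for model training
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Allow search- and answer-oriented AI crawlers
User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default for everyone else
User-agent: *
Allow: /
```

Remember that robots.txt is advisory: well-behaved bots honor it, but it is a request, not an access control.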

Dig deeper: Technical SEO for generative search: Optimizing for AI agents

The measurement gap

Most analytics setups can’t tell the difference between a human visit, a bot crawl, and an AI agent evaluating your site on someone’s behalf. GA4 filters most bot traffic. Server logs show the raw picture, but take work to parse. Even then, figuring out whether an AI agent’s visit led to an actual sale is basically impossible right now.
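
As a rough sketch of that parsing, assuming a combined-format access log at access.log and an illustrative, not exhaustive, list of AI user agents:

```python
import re
from collections import Counter

# Substrings that identify known AI crawlers/agents in the User-Agent header.
# Illustrative, not exhaustive; check vendor docs and your own logs.
AI_BOTS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User", "ClaudeBot",
           "PerplexityBot", "Google-Extended", "Bytespider", "CCBot"]

# In combined log format, the user agent is the last quoted field.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
total = 0

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        total += 1
        match = UA_PATTERN.search(line)
        if not match:
            continue
        user_agent = match.group(1).lower()
        for bot in AI_BOTS:
            if bot.lower() in user_agent:
                counts[bot] += 1
                break

print(f"{total} requests total")
for bot, hits in counts.most_common():
    print(f"{bot}: {hits} requests ({hits / total:.1%})")
```

Run against a day or a week of logs, this gives a first-pass answer to how much of your traffic is already non-human.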

This is where the “Google Zero” framing does real damage.

If you’re only measuring organic sessions from Google, you’re blind to a channel that doesn’t show up in that number. Your traffic could look stable while an AI agent steers $50,000 in annual spend to your competitor because their product schema was more complete.

I don’t think we have good measurement for this yet. Nobody does. But ignoring the problem because Google sessions look fine is like checking your print ad response rate in 2005 and deciding the web wasn’t worth paying attention to.

What to do about it

I don’t have a playbook for this. It’s too new. But I can tell you what we’re doing at our agency.

  • Audit your structured data like it’s your storefront: Check that product, service, FAQ, and organization markup is present, well-formed, complete, and current, and review content structure and technical health alongside it. This is table stakes. (A quick audit sketch follows this list.)
  • Answer compound questions: Look at your top landing pages. Do they answer the specific, multi-variable questions an AI agent would ask? Or just the broad keyword query a human would type?
  • Check your server logs: Look for GPTBot, ClaudeBot, PerplexityBot, and other AI user agents to understand how much of your traffic is already non-human; the log-parsing sketch in the measurement section above is one starting point. If you’re on Cloudflare, its bot analytics dashboard makes this easy without parsing raw logs. The number will probably surprise you.
  • Make a conscious robots.txt decision: Understand the trade-offs, and make it a business decision with your leadership team.
  • Start tracking AI citations: Tools like Semrush, Scrunch, DataForSEO, and others can show when AI platforms mention your brand. The data is directional, not precise. But it’s better than nothing.
  • Don’t abandon Google SEO: Adams is right that Google traffic is still massive and still valuable. The agentic web doesn’t replace Google. It adds a new layer. You need both.
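
For the first item on that list, a minimal audit sketch in Python, assuming the requests and extruct libraries and a placeholder URL:

```python
import extruct   # pip install extruct
import requests  # pip install requests

URL = "https://www.example.com/products/crm-pro"  # page to audit (placeholder)

response = requests.get(URL, timeout=30)
response.raise_for_status()

# Extract whatever structured data the page exposes in the syntaxes
# AI agents are most likely to read.
data = extruct.extract(response.text, base_url=URL,
                       syntaxes=["json-ld", "microdata"])

for syntax, items in data.items():
    print(f"{syntax}: {len(items)} item(s)")
    for item in items:
        # JSON-LD items carry "@type"; microdata items carry "type".
        print(f"  - {item.get('@type') or item.get('type', 'unknown')}")

if not any(data.values()):
    print("No structured data found; an AI agent has nothing to extract.")
```

Anything missing, malformed, or stale in that output is exactly what an agent will miss or misreport.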

The real question

The “Google Zero” argument pits one extreme against another, even as the actual shift is quieter and more important.

The web is becoming a place where the majority of visitors are machines. Some send traffic back. Most don’t. Some of them make purchasing decisions on behalf of humans. That number is growing fast.

The SEOs who do well here won’t be the ones arguing about whether Google traffic moved 2.5%. They’ll be the ones who figured out how to be useful to both human visitors and the AI agents acting on their behalf.

We’ve spent 25 years optimizing for how humans find things. Now we need to figure out how machines find things for humans.

That’s not Google Zero. We don’t have a name for it yet. But it’s already here.

If you want to go deeper on GEO and agentic SEO, I’m teaching an SMX Master Class on Generative Engine Optimization on April 14. It covers structured data implementation, AI visibility measurement, content optimization for AI systems, and the practical side of everything in this article.
