Is there still a long-term game for SEO in AI search?


SEO sits at an interesting crossroads. One camp insists on optimizing for large language models (LLMs) and AI engines, and the other insists on doing SEO the same way we’ve always done it.

But there’s another way to approach it: combining the fundamentals of SEO with an understanding of how LLMs operate and why.

With this approach, you can keep what’s always worked — like on-page SEO and backlinks from reputable sources. Yet you can also look ahead to new tactics, such as optimizing for query fan-out and emerging prompt intents.

Since 2023, with the rise of tools like ChatGPT, Gemini, Claude, and Perplexity, I’ve been researching how AI engines display search results and where SEO is headed.

Here’s what I’ve found, and how you can use it to rethink your approach to a future where AI SEO considers human behavior at its core.

How the Red Queen theory applies to AI search

The Red Queen evolutionary model says that for everything to stay the same over time, everything must change. But as you adapt to the changing environment, so does the competition.

As a result, you and your competitors remain the same distance apart. In your attempt to become the predator, your prey adapts in equal measure, leaving the status quo firmly in place.

Essentially, if you don’t adapt, you’ll get eaten.

How to apply the Red Queen principle to your AI SEO strategy

Along the same lines, AI search is a natural progression of what has existed for at least a decade. A hybrid search model has been in place since 2015, with the introduction of RankBrain.

That’s why many of the same SEO tactics still work now. Instead of a fundamental change, a series of big and small shifts has taken place over time.

For example:

  • LLMs still use retrieval-based search engines.
  • Content quality and freshness still matter.
  • Site speed remains crucial for performance.
  • Intent matching across the major categories is still relevant.

“Stop optimizing for ‘AI,’” says Britney Muller via LinkedIn. “Optimize for search engines (so retrieval-based AI can cite you) + earn third-party coverage (so the model already knows you before the prompt is typed).”

So, what makes a worthy source for LLMs? What are people using AI assistants to accomplish? Is it to find information, analyze an issue, or create a list of recommendations?

Research from Moz shows that only 12% of AI Mode citations mirror the URLs in organic results. This means AI engines only somewhat follow the traditional rules of SEO. And over time, these changes will likely become more extensive.

While Google denies that Search will become entirely generative, my prediction is that Google will continue along a generative path that encompasses AI assistant behaviors such as questions, actions, analysis, and creation.

As a result, your short- and long-term strategies must work together to remain innovative yet grounded.

Focusing on human behavior and traditional search while working to understand LLMs is how you keep pace with the Red Queen.

Why RAG is essential to understanding AI search

The most effective approach is focusing on where LLMs fall short: their static training data. Their systems rely on retrieval-augmented generation (RAG) to address those gaps without requiring constant retraining.

AI assistants like Google AI Mode and Gemini need RAG to prevent hallucinations and to continue surfacing relevant answers for consumers.
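Conceptually, a RAG pipeline retrieves fresh documents at prompt time and feeds them to the model alongside the user’s question. Here’s a minimal sketch of that flow; the corpus, keyword-overlap scoring, and prompt format are illustrative placeholders, not any specific engine’s implementation:

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Real systems use learned embeddings and much larger indexes;
# the keyword-overlap retriever here is purely illustrative.

def retrieve(query, corpus, top_k=2):
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, corpus):
    """Stuff the retrieved context into the prompt the LLM will answer."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Retinol is a common anti-aging skincare ingredient.",
    "Sunscreen prevents premature skin aging.",
    "Our store hours are 9 to 5 on weekdays.",
]
prompt = build_prompt("anti-aging skincare routine", corpus)
```

The point for SEO: whatever ends up in that `context` block is what the model can cite, which is why being easy to retrieve matters as much as being well written.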

Here, I gave Google AI Mode and ChatGPT the same prompt:

  • “I am looking for a skincare routine that prioritizes anti-aging. What routines and products should I use?”

Both returned relevant results, but the specifics differed. Google AI Mode returned anti-aging tips and routines, while ChatGPT sourced anti-aging products.

They also used different sources for their information. Where ChatGPT preferred a fresh Today.com source, Google referenced dermatology websites and even Google Shopping listings. 

In both instances, the AI assistants needed external sources. 

How to optimize for AI search vs. traditional search

For SEO, you need to understand how your content aligns with the limitations of AI engines. They do the searching for themselves and then generate a response for the user, only showing external sources some of the time. 

It’s a subtle shift in thinking. Optimizing for search is less about crafting SEO content and more about becoming a trusted supplier for these LLMs — so when people enter a prompt, your brand shows up in the answer.

In that way, the Red Queen evolution involves studying AI answers, learning their quirks, comparing their preferences, and evaluating their most common intents.

Then, you can feed the database. Make sure Google, which draws on the largest search index of any AI system, has sufficient data to keep you in the pool of trusted sources.

Without people, AI assistants have no power. That’s why you have to put people first.

Where are people using AI assistants to create, achieve, build, search, and prompt? And where does it make sense for your brand to be?

Now that the AI search landscape is more competitive, you have to think like a social media professional or a traditional marketer.



Short-term SEO tactics rely on topical authority

A short-term SEO strategy can work now, in the overlap between traditional and AI search. It uses topical authority to deliver results immediately, shortening clients’ time to success. Here’s the short-term plan. 

Use internal links to build entity relationships

As Kevin Indig explains:

  • “Today, internal links aren’t just distributing authority. They’re defining the semantic structure of your site.”

Internal links help search engines understand your site’s overall structure. AI Mode, for example, is built with vector search models, and entities are crucial to their operation.

Vector search maps your website’s information into a high-dimensional vector space, allowing algorithms to go beyond keywords and determine the intent behind someone’s search. Internal links help strengthen these signals.
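To picture how vector search groups related pages, here’s a toy sketch using hand-written embedding vectors (an assumption for illustration only; real engines learn embeddings with thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hand-written toy embeddings; a real system learns these from content.
pages = {
    "anti-aging serums": [0.9, 0.8, 0.1],
    "retinol guide":     [0.85, 0.75, 0.2],
    "store locations":   [0.1, 0.2, 0.95],
}

query = [0.88, 0.8, 0.15]  # imagine this embeds "best anti-aging routine"
ranked = sorted(pages, key=lambda p: cosine_similarity(query, pages[p]), reverse=True)
```

Pages about related topics sit close together in this space, which is why tightly interlinked topical clusters reinforce each other as entities.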

As Gianluca Fiorelli suggests:

  • “We should link internally and externally to content that reinforces entity connections, because this helps LLMs map embeddings to a wider network of connected entities, hence increasing our authority in the knowledge graph.”

Links have long mattered for search, and they still do. As you develop your long-term SEO strategy, they become increasingly important for surfacing your content in LLMs and AI assistants. 

Think in terms of topical coverage versus keyword research

Plan your topical authority through these four lenses:

  • Topical coverage: Develop pages that cover the overall topic and its subtopics in a relevant, useful way.
  • Query fan-out: Study the query fan-out behavior for your most valuable search terms to identify gaps in your website content.
  • Intent: Be ruthless in determining intent by breaking down the categories in your niche that do or don’t have AI visibility potential.
  • Content quality: Make sure your content demonstrates strong experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) and is optimized for AI SEO.

These are all based on traditional SEO tactics. However, they consider a hybrid or LLM-based approach versus focusing solely on organic search. 

Optimize and maintain your site’s technical health

Technical health is rooted in what works for search now: site speed, schema markup, and optimized titles and descriptions. 

After all, LLMs are expensive to maintain and run. It’s in their best interest to use resources that are fast and easy to extract information from. 

Consider recent site speed findings from Mike King, who notes, “Slow responses can trigger 499 errors, where the AI stops waiting.”

These three short-term goals — topical coverage, internal links, and technical health — are all important for visibility in LLMs and AI engines.  

But search has evolved because human behavior has changed. So, the long-term play involves adapting to human behavior. 

The long-term future of SEO relies on human behavior

Long-term SEO strategies should focus on the intent and actions of human behavior surrounding AI.

Identify search intent

The four traditional search intents (informational, navigational, commercial, and transactional) are still relevant. But AI search has added a few more.

According to MIT, examples include zero-shot, instructional, and contextual prompts. Grammarly considers other intents, including educational, opinion-based, and problem-solving.

I tend to break down intent into multiple categories of SEO opportunity based on the clients I’m working with. Some common examples include directional, recommendation, local, booking, and shopping.
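To make that breakdown concrete, a naive rule-based tagger can sort prompts into those intent buckets. The trigger phrases below are illustrative assumptions; real intent classification needs far richer signals:

```python
# Naive prompt-intent tagger. The categories mirror the buckets above;
# the trigger phrases are illustrative assumptions, not a production model.
INTENT_KEYWORDS = {
    "local":          ["near me", "in my city", "nearby"],
    "booking":        ["book", "reserve", "appointment"],
    "shopping":       ["buy", "price", "best products"],
    "recommendation": ["recommend", "suggest", "should i use"],
    "directional":    ["how do i get", "directions to"],
}

def tag_intents(prompt):
    """Return every intent bucket whose trigger phrase appears in the prompt."""
    text = prompt.lower()
    return [
        intent
        for intent, phrases in INTENT_KEYWORDS.items()
        if any(phrase in text for phrase in phrases)
    ]

intents = tag_intents("Recommend a dermatologist near me and book an appointment")
```

Note that a single prompt can carry several intents at once, which is exactly why ruthless categorization of your niche pays off.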

Consider query fan-out

Once you identify the most relevant search intents, you can hypothesize what people are looking for the generative engine to do. From there, you can do one of two things:

  • Rule a subset of topics out of your strategy. For example, if you don’t have a local business but the results have local intent, you don’t need to focus on those topics.
  • Create web pages optimized for LLMs. For example, you can break down a topical category, study its query fan-out results, and reverse engineer what answer engines find valuable based on their behavior.

Say your target customers are U.S. home buyers. They want to know: “Is now a good time to buy a house?”

Plug the prompt into an AI engine and study the AI-generated answer. In AI Mode, for example, you can infer that Google fans out across multiple topics, including market conditions and pros and cons. 


ChatGPT, in contrast, looks at trends, forecasts, and seasonality.


Based on the data, develop a content strategy that supports query fan-out behavior. 

As Aleyda Solis explains:

  • “By ‘fanning out’ the original query, the system can explore various facets and subtopics simultaneously based on semantic understanding, user behavior patterns, and logical information architecture around the topic, leading to a more complete and contextually rich understanding of the user’s need.”
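As a rough mental model, fan-out expands one prompt into several facet-specific subqueries, retrieves for each, and synthesizes the results. A sketch of that expansion step; the facets and templates are my own illustration, not Google’s actual system:

```python
# Toy query fan-out: expand one prompt into a subquery per facet.
# Facets and templates are illustrative, modeled on the home-buying example.
FACETS = {
    "market conditions": "{q} current market conditions",
    "pros and cons":     "{q} pros and cons",
    "forecasts":         "{q} price forecast next year",
    "seasonality":       "{q} best time of year",
}

def fan_out(query):
    """Expand a query into one subquery per facet for separate retrieval."""
    return {facet: template.format(q=query) for facet, template in FACETS.items()}

subqueries = fan_out("buying a house in the US")
```

Each subquery is retrieved independently, so a page that answers only one facet can still earn a citation in the synthesized answer.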

For example, you can break down the complexities of buyer’s markets, buyer and seller perspectives, or the changes in rising inventories. You could even build a useful tool around mortgage rates or national home price trends. 

I use a variety of tools to help with analyzing query fan-out. But the most popular options include Semrush, Ahrefs, and Profound.

Prepare for the future of AI search

Prompting may not even be a concern in the future if AI assistants become more sophisticated at solving problems rather than responding to prompts.

Instead, AI engines may be able to anticipate searchers’ needs and intentions, according to Harvard Business Review. That means it may be increasingly helpful to focus less on prompts and more on problems.

In the absence of keyword research, it will be more important than ever to analyze human behavior, evaluating and pivoting based on how people use AI assistants. 

It’s helpful to consider how social media professionals and brand experts think creatively about where their audiences are and how to attract attention while building brand power and recognition. 

For example, Rare Beauty and Rhode have both grown their brands with creativity and consumer listening, especially in the last six years. 

They’ve put considerable effort into brand campaigns, public relations (PR) campaigns, TikTok content, and in real-life (IRL) experiences that have gone viral globally.

Looking at ChatGPT, the first product recommended for “best makeup gifts for Gen Z” is Rare Beauty. 


Google makes similar recommendations, with Rare Beauty and Rhode leading the list. The results are influenced by PR coverage and social media virality. 


SEO’s role in the future of search

SEO will have a future as long as there are search engines with AI experiences. While it might look like SEO has become the prey, it’s evolved just as much as the predator has.

Everything’s changed. Yet everything’s the same.





