
3.27 · Intermediate · 5 min read

AI Overviews and AI Mode: The 2025–2026 Shift

Generative AI is now embedded in the SERP. The block at the top changes click-through, citation flow, and what 'rank 1' even means.

What you’ll learn

  • Explain what AI Overviews are and how they differ from Featured Snippets.
  • Read AI Overview JSON: generated text, citations, follow-ups.
  • Track AI Overview ownership as an SEO KPI.
  • Understand the implications for scraping, SEO, and traffic.

The biggest change to the SERP since the Knowledge Graph rolled out a decade ago: Google now puts a generated AI answer at the top of many queries, with citations. Bing has the same pattern (Copilot). Perplexity built the entire UX around it.

If you're scraping SERPs in 2026 and you ignore AI Overviews, you're scraping yesterday's web.

What an AI Overview is

A multi-sentence paragraph (or list) generated by a large model, synthesizing content from multiple cited sources, positioned above any organic results.

Visually:

  • Block at the very top of the SERP.
  • "AI Overview" or "Generative AI" label.
  • One or more cited source links inline or as a sidebar.
  • Often a "show more" expansion and a "follow-up" question chip.

What it changes

For users:

  • The answer is on the SERP; for many queries, no click-through to a source is needed.
  • Trust signals shift to the cited sources, not page positions.

For SEO:

  • Featured snippets still exist, but AI Overviews steal share of attention.
  • The new game: getting cited in an AI Overview, not just ranking #1.
  • A new term, "GEO" (Generative Engine Optimization): optimizing content to be cited by AI answer engines.

For scrapers:

  • The AI Overview block is itself a new data source to track.
  • "Rank 1 organic" no longer = "most prominent answer."
  • Tracking which sources are cited tells you whose content the model trusts.

SERP-API shape

{
  "ai_overview": {
    "text": "API scraping is the practice of extracting data directly from a website's underlying JSON or GraphQL APIs, rather than parsing rendered HTML. It is generally faster, more reliable, and more cost-effective than scraping HTML, though it may require authentication or reverse-engineering of request signatures.",
    "type": "paragraph",
    "sources": [
      {
        "title": "API Scraping Guide",
        "link": "https://scrapingcentral.com/learn/api-scraping",
        "domain": "scrapingcentral.com",
        "rank": 1
      },
      {
        "title": "What is API scraping?",
        "link": "https://example.com/api-scraping",
        "domain": "example.com",
        "rank": 2
      }
    ],
    "follow_up_questions": [
      "Is API scraping legal?",
      "How is API scraping different from web scraping?",
      "What is the most efficient way to scrape APIs?"
    ]
  }
}

The sources array is the key new data point: who got cited, in what order, on what query.

Tracking AI Overview "ownership"

Like featured-snippet ownership but more nuanced:

  • Cited or not? Is your domain in the sources list for query Q?
  • Citation rank? First cited source or third?
  • Citation density? How many sources does the AI Overview cite total?
  • Overlap with organic? Are the cited sources also the top organic results, or does the AI Overview reach beyond them?

def track_ai_overview(serp_data: dict, my_domain: str) -> dict:
    ao = serp_data.get("ai_overview")
    if not ao:
        return {"present": False}

    sources = ao.get("sources", [])
    my_rank = next(
        (i + 1 for i, s in enumerate(sources) if my_domain in s["domain"]),
        None,
    )
    return {
        "present": True,
        "cited": my_rank is not None,
        "citation_rank": my_rank,
        "total_sources": len(sources),
        "domains_cited": [s["domain"] for s in sources],
    }

Run that across hundreds of queries daily and you have a real GEO dashboard.
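Aggregating those per-query results into headline numbers is only a few more lines. A sketch, assuming each entry has the shape returned by track_ai_overview:

```python
def citation_share(results: list[dict]) -> dict:
    """Roll per-query AI Overview stats up into dashboard metrics."""
    with_ao = [r for r in results if r.get("present")]
    cited = [r for r in with_ao if r.get("cited")]
    return {
        "queries": len(results),
        # How often an AI Overview appeared at all:
        "ao_trigger_rate": len(with_ao) / len(results) if results else 0.0,
        # Of those, how often your domain was cited:
        "citation_share": len(cited) / len(with_ao) if with_ao else 0.0,
        "avg_citation_rank": (
            sum(r["citation_rank"] for r in cited) / len(cited) if cited else None
        ),
    }

sample = [
    {"present": True, "cited": True, "citation_rank": 1},
    {"present": True, "cited": False},
    {"present": False},
]
print(citation_share(sample))
```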

Variants: AI Mode, Perplexity, ChatGPT Search

Google also has "AI Mode" (a separate tab where the entire results page is an AI conversation). Bing has Copilot answers. Perplexity is AI-first by design, with no traditional SERP at all. ChatGPT Search (released in late 2024) generates a similar block.

Each has its own JSON shape but the same conceptual structure:

  • Generated answer.
  • Cited sources.
  • (Sometimes) follow-up questions.

A SERP-API that supports multiple AI surfaces returns each under its own key. Pick the one(s) you need to track.
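If you track more than one surface, a thin normalization layer keeps downstream code engine-agnostic. A sketch; the key names here are assumptions, not any specific provider's schema:

```python
# Hypothetical per-provider key names for the AI answer block:
AI_BLOCK_KEYS = ("ai_overview", "copilot_answer", "answer")

def extract_ai_sources(serp: dict) -> list[dict]:
    """Return the cited-sources list regardless of which AI surface
    produced it, or an empty list if no AI block is present."""
    for key in AI_BLOCK_KEYS:
        block = serp.get(key)
        if block and "sources" in block:
            return block["sources"]
    return []
```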

Block availability

Not every query gets an AI Overview. Triggers:

  • Informational queries (how-to, what is, comparisons).
  • Specific verticals (health, finance, technology, though Google is more conservative in regulated spaces).
  • Not yet for many transactional queries (Google preserves ad inventory there).

Geography matters too: AI Overviews rolled out US-first, and coverage in non-English markets remains uneven through 2026.
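Because triggering is query-dependent, it's worth measuring the trigger rate per query category rather than in aggregate. A sketch; the category labels are whatever you assign to your own query set:

```python
from collections import defaultdict

def trigger_rate_by_category(observations: list[dict]) -> dict[str, float]:
    """observations: [{"category": str, "has_ao": bool}, ...]
    Returns the fraction of queries in each category that triggered
    an AI Overview."""
    counts = defaultdict(lambda: [0, 0])  # category -> [with_ao, total]
    for obs in observations:
        counts[obs["category"]][1] += 1
        if obs["has_ao"]:
            counts[obs["category"]][0] += 1
    return {cat: with_ao / total for cat, (with_ao, total) in counts.items()}
```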

Implications for HTML scraping

If your scraper parses Google's HTML directly:

  • AI Overview blocks have their own DOM structure that changes frequently.
  • The "cited sources" links use different anchor patterns than organic.
  • Direct HTML scraping of the AI Overview is brittle even by Google-scraping standards.

A SERP-API is the right abstraction here: providers maintain the parsing as Google iterates.

A worked tracking workflow

import requests
from datetime import date

def daily_ai_overview_audit(queries: list[str], my_domain: str, api_key: str):
    today = date.today().isoformat()
    results = []
    for q in queries:
        r = requests.get("https://api.example-serp.com/search", params={
            "q": q,
            "engine": "google",
            "gl": "us",
            "hl": "en",
            "api_key": api_key,
        }, timeout=30)
        data = r.json()
        stats = track_ai_overview(data, my_domain)
        stats["query"] = q
        stats["date"] = today
        results.append(stats)
    return results

Run this nightly, persist to a database, plot "share of citation" over time. That's a 2026 SEO dashboard.
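The persistence step can be as light as SQLite. A sketch that stores the audit rows and computes share-of-citation per day; the table and column names are illustrative:

```python
import sqlite3

def save_audit(db: sqlite3.Connection, rows: list[dict]) -> None:
    """Persist one night's audit rows (shape from daily_ai_overview_audit)."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS ao_audit "
        "(date TEXT, query TEXT, present INTEGER, cited INTEGER)"
    )
    db.executemany(
        "INSERT INTO ao_audit VALUES (?, ?, ?, ?)",
        [(r["date"], r["query"], int(r["present"]), int(bool(r.get("cited"))))
         for r in rows],
    )
    db.commit()

def share_of_citation(db: sqlite3.Connection) -> list[tuple]:
    """Per day: of queries that triggered an AI Overview, the fraction
    that cited your domain."""
    return db.execute(
        "SELECT date, 1.0 * SUM(cited) / COUNT(*) AS share "
        "FROM ao_audit WHERE present = 1 GROUP BY date ORDER BY date"
    ).fetchall()
```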

Caveats

  • Hallucinations. AI Overviews occasionally cite sources that don't actually support the claim. Don't trust the AI's summary; verify against the cited page if accuracy matters.
  • Rapid iteration. Google ships changes to the format weekly. Your parser may need updates. (SERP-APIs absorb most of this churn.)
  • Volatility. The same query might show different AI Overviews hours apart, depending on the model version and prompt routing.
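The volatility point is measurable: fetch the same query twice, hours apart, and compare the cited-domain sets. A sketch using Jaccard similarity over the hypothetical SERP-API shape from this lesson:

```python
def citation_overlap(serp_a: dict, serp_b: dict) -> float:
    """Jaccard similarity of the cited-domain sets in two SERP snapshots.
    1.0 means identical citations, 0.0 means no overlap."""
    def domains(serp: dict) -> set[str]:
        ao = serp.get("ai_overview") or {}
        return {s["domain"] for s in ao.get("sources", [])}

    a, b = domains(serp_a), domains(serp_b)
    if not a and not b:
        return 1.0  # neither snapshot had an AI Overview
    return len(a & b) / len(a | b)
```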

Hands-on lab

This is a conceptual lesson; Catalog108 doesn't host a mock AI Overview. Instead: in a SERP-API playground (most providers offer one with a free trial), run several queries and observe whether each returns an ai_overview block. Note the citation list. Pick a domain you care about (your own site, a competitor) and check whether it's cited. This is exactly the dashboard a 2026 SEO team checks every morning.

Quiz: check your understanding

Pass mark is 70%. Pick the best answer; you’ll see the explanation right after.

Question 1 of 8

What is the primary difference between an AI Overview and a Featured Snippet?
