How Do I Know if AI Search is Sending People to My Website?

The big idea

AI search is already sending traffic to websites. The gap isn’t whether it’s happening — it’s whether you have a system to see it.

Most brands have no measurement in place for AI-driven traffic. This post gives you a practical three-step system: check if you’re showing up, connect it to real data in GA4, then make it a monthly habit — no expensive tools required.

Step 1: Run buyer prompts across ChatGPT, Perplexity, and Google AI Overviews.

Step 2: Find AI traffic in GA4 using source filters and utm_source=chatgpt.com.

Step 3: Build a monthly cadence so citation decay doesn't catch you off guard.

Key Takeaways

  • AI traffic is already real and measurable — 63% of websites were receiving LLM traffic as of early 2025.

  • The fastest visibility check requires no tools: run buyer-style prompts across ChatGPT, Perplexity, and Google AI Overviews, and score what you find.

  • ChatGPT is the only major AI tool that consistently appends utm_source=chatgpt.com. Most others show up as referrals or direct traffic.

  • GA4 already has the data — you just need to know where to look and what to filter for.

  • AI citation decay is real. A one-time check is a snapshot, not a strategy. Build a monthly cadence.

If someone asked ChatGPT or Perplexity about the problem your business solves, would your name come up?

Most people I talk with don't know the answer. And that's a gap, because AI search isn't some kind of “future” concept for marketing.

It’s already driving traffic and, in some categories, converting at rates that rival those of organic search.

So you need visibility into how your brand and your industry are showing up in AI search.

There are ways to get insight into how you are (or aren't) showing up. And while expensive tools exist, you don't need to invest much to get a read on where you stand.

Here’s my basic process:

  1. Check whether you're showing up at all

  2. Connect that to real traffic data in GA4

  3. Turn it into a repeatable process so you're not flying blind every month

If you've been following my posts on what to include in your content to get cited by AI or how GPT-5.4 changed citation behavior across ChatGPT's models, this is the natural next step — the feedback loop that tells you whether any of it is working.

63% of websites were already receiving LLM traffic as of February 2025, based on an Ahrefs sample of 150,000 sites. If you're not tracking it, you're not seeing it, but it's likely already there. (Source: adpushup.com)

Am I showing up in AI search at all?

The fastest check is manual: brainstorm 10–15 prompts your buyers would actually use, run them across ChatGPT, Perplexity, and Google AI Overviews, and score whether your brand appears — mentioned, linked, or absent.

Before you touch GA4, start here. This takes about an hour and tells you whether you have a presence problem, a tracking problem, or both.

The key is using buyer-language prompts — not how you would describe your own business, but how a potential customer would describe their problem.

For example:

  • "Best HOA software for self-managed communities" instead of "community association management SaaS."

  • "How do I fix content that sounds too generic after AI edits it?" instead of "AI copywriting brand voice services."

That distinction matters because AI systems answer questions people actually ask — not landing page copy.

Here’s how you run the check:

Pick one buyer persona. List 10–15 questions they would genuinely ask an AI tool. Run each one across all three engines and record what you find.

  • ChatGPT (with search/browsing on): Ask your prompt, then follow up: "List the sources you used and include direct URLs." Check both the answer text and the citation list for your domain and your competitors.

  • Perplexity: Run the same prompts and check the citations panel and "View sources" area. Perplexity is generally more transparent about its sourcing than ChatGPT.

  • Google AI Overviews: Search your prompts in Google. Note whether an AI Overview appears and whether your brand or pages show up in the overview or follow-on links.

Then track the results in a simple spreadsheet that includes:

  • The prompt

  • "Brand mentioned?" — (yes / no)

  • "Link included?" — (yes / no)

  • "Competitors cited?" — (list them)

  • Notes on how you're positioned when you do appear

Score each result 0–2: not mentioned (0), mentioned without a link (1), mentioned with a link (2).

That spreadsheet is your baseline — the number you're trying to move.
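If you'd rather keep the scorecard in code than in a spreadsheet, the 0–2 scale above is easy to script. A minimal sketch (the prompts and results below are hypothetical examples, not real data):

```python
# Score AI-visibility checks on the 0-2 scale described above:
# 0 = not mentioned, 1 = mentioned without a link, 2 = mentioned with a link.

def score_result(brand_mentioned, link_included):
    """Return the 0-2 visibility score for one prompt result."""
    if not brand_mentioned:
        return 0
    return 2 if link_included else 1

def baseline(results):
    """Total score across all prompts, as a fraction of the maximum possible."""
    scores = [score_result(r["mentioned"], r["linked"]) for r in results]
    return sum(scores) / (2 * len(results))

# Hypothetical month-one results for three buyer prompts:
results = [
    {"prompt": "Best HOA software for self-managed communities", "mentioned": True,  "linked": True},
    {"prompt": "How do I fix content that sounds too generic?",  "mentioned": True,  "linked": False},
    {"prompt": "Affordable GEO content audit",                   "mentioned": False, "linked": False},
]

print(baseline(results))  # 3 of a possible 6 points -> 0.5
```

The fraction is the single number to watch month over month; the per-prompt scores tell you where to dig in.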

Your AI visibility scorecard:

  • 0 = Not mentioned. Your brand doesn't appear in the answer or the citation list. Presence problem.

  • 1 = Mentioned, no link. You're in the answer but not cited with a clickable URL. Visibility without traffic.

  • 2 = Mentioned + linked. Cited with a URL. This is the version that shows up in GA4. Target state.

Track for each prompt: the prompt text, brand mentioned (yes/no), link included (yes/no), competitors cited, and how you're positioned.

Run across all three engines: ChatGPT (search on, then ask for source URLs), Perplexity (check the citations panel), and Google AI Overviews (search, then note whether an overview appears).

A note on paid tools: platforms like Profound, Rankscale AI, AirOps Insights, and Semrush's AI Visibility module automate this prompt testing at scale.

But the manual spreadsheet gets you 80% of the insight at zero cost.

Start there. Add tools when the spreadsheet breaks.

How do I track traffic that's actually coming from AI?

ChatGPT is the only major AI tool that consistently appends utm_source=chatgpt.com to outbound links. Other tools pass a referrer domain or nothing — meaning some AI traffic lands as Direct and disappears into the noise.


Once you know whether you're being cited, the next question is whether those citations are actually driving clicks.

The answer lives in GA4 — but AI traffic doesn't always show up cleanly, and the reason matters.

How each AI tool shows up in GA4 (signal strength by source):

  • ChatGPT (strong): often appends utm_source=chatgpt.com, so it shows as a distinct, filterable source, though it's not appended to every link. Filter: source = chatgpt.com.

  • Perplexity (referral): passes its referrer domain (perplexity.ai) and shows as a standard referral. No UTM injected by Perplexity. Filter: source contains "perplexity".

  • Copilot / others (inconsistent): referrer domain when present (copilot.microsoft.com), but some clicks land as Direct. Filter: source contains "copilot".

  • Google AI Overviews (blended): arrives as regular Google organic. No AI-specific identifier is available in GA4 yet. Watch landing pages tied to AI Overview queries.

  • Copy-paste clicks (dark traffic): no referrer, no UTM. Lands as Direct regardless of which AI it came from. Cannot be isolated reliably.

The honest caveat: not every AI click gets tagged. ChatGPT doesn't append utm_source=chatgpt.com to every link — direct URL requests go untagged.

And when users copy-paste a URL from an AI response into their browser, it arrives as Direct with no attribution at all. Your GA4 numbers are directional, not exhaustive — but they're still worth having.
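Those attribution patterns can be sketched as a small classifier. This is an illustration of the bucketing logic above, not any official GA4 feature; it assumes you have each session's landing URL and referrer (from a GA4 export or your own logs):

```python
from urllib.parse import urlparse, parse_qs

def classify_ai_source(landing_url, referrer):
    """Bucket a session the way AI sources tend to appear in GA4."""
    params = parse_qs(urlparse(landing_url).query)
    utm_source = params.get("utm_source", [""])[0]
    if utm_source == "chatgpt.com":
        return "chatgpt"          # ChatGPT's appended UTM tag
    ref_host = urlparse(referrer).netloc if referrer else ""
    if "perplexity" in ref_host:
        return "perplexity"       # plain referral, no UTM injected
    if "copilot" in ref_host:
        return "copilot"          # only visible when a referrer survives
    if not ref_host:
        return "direct"           # copy-paste clicks: dark traffic
    return "other"                # incl. Google organic, where AI Overviews blend in

print(classify_ai_source("https://example.com/post?utm_source=chatgpt.com", None))
print(classify_ai_source("https://example.com/post", "https://www.perplexity.ai/"))
```

Note what the last branch concedes: Google AI Overview clicks fall into "other" alongside ordinary organic, because there is no identifier to split them on.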

The fastest GA4 check — no setup required

You don't need to build a custom report first. Start here:

  1. Go to Reports → Acquisition → Traffic acquisition

  2. Change the primary dimension to "Session source / medium"

  3. Use the search box to filter for: chatgpt, openai, perplexity, gemini, copilot

  4. Note which sources appear, how many sessions they're driving, and which pages they're landing on

If you see nothing — no chatgpt.com, no perplexity.ai — that's still useful information.

Either you're not being cited with clickable links, or the citations are happening without trackable attribution. The manual prompt check from the first section tells you which.

Building a dedicated GA4 segment for AI traffic

For ongoing tracking, create an exploration or segment filtering session where the source contains your AI referrer strings. Run it monthly and track: total sessions, top landing pages, conversion rate, and engagement for that traffic slice.

This is also where understanding how GPT-5.4 works with search pays off: GPT-5.4 appends UTM parameters to roughly 87% of its citations.

As premium ChatGPT adoption grows, that segment becomes a meaningful, measurable channel — one you want baseline data for now.

LLM traffic vs. organic search, conversion efficiency: LLM traffic is smaller in volume but converts at roughly 2x its share of total traffic; organic search converts at roughly 1.7x its share.

The takeaway: the volume is smaller, but the intent behind it is high. Someone who found you through a ChatGPT answer already did their research. (Source: adpushup.com, cross-industry dataset)

For SaaS specifically: in that same dataset, conversion rates from LLM traffic (6.69%) and organic search (6.71%) were nearly identical — separated by just 0.02 percentage points.

The volume is smaller, but the quality isn't.
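"Converts at roughly 2x its share" can be expressed as a simple index: the channel's share of conversions divided by its share of sessions. A sketch with made-up numbers (only the rates quoted above come from the cited dataset):

```python
def conversion_index(channel_sessions, channel_conversions, total_sessions, total_conversions):
    """Share of conversions relative to share of sessions.
    2.0 means the channel converts at twice its traffic share."""
    traffic_share = channel_sessions / total_sessions
    conversion_share = channel_conversions / total_conversions
    return conversion_share / traffic_share

# Hypothetical site: LLM traffic is 5% of sessions but 10% of conversions.
print(conversion_index(500, 100, 10_000, 1_000))  # -> 2.0
```

Computing this for your own GA4 segment each month tells you whether AI traffic is punching above its weight on your site, not just in someone else's dataset.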

Is my AI visibility getting better or worse over time? What is AI citation decay?

AI citation decay is the rate at which AI search results rotate out your content as models update and fresher sources get indexed. A mention or link that exists this month may not exist next month. A one-time check is a snapshot. A monthly cadence is a measurement system.


This is the part most people skip — and it's the part that actually matters for making decisions.

AI search doesn't work like Google rankings, where a page can hold a position for months. Citation sources shift constantly.

Research shows 40–60% of cited sources change month-to-month across major AI engines. A mention you earned this week can disappear next week when a model updates or a fresher source gets indexed. This is AI citation decay — and it makes one-time tracking meaningless.

A simple monthly cadence helps here

This doesn't need to be complicated. Once a month, do three things:

  1. Re-run your saved prompt set across ChatGPT, Perplexity, and Google AI Overviews. Re-score each query on your 0–2 scale. Note which competitors appear alongside you now versus last month.

  2. Open your GA4 AI traffic segment. Log sessions, top landing pages, and any conversion activity for the month. Look for directional trends — growing, flat, or dropping.

  3. Compare to last month. If visibility is improving but traffic is flat, you're being mentioned without links. If traffic grows while mentions stay stable, your cited pages are working harder. Both tell you something different about what to fix.
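Once your scorecard lives in code, the month-over-month comparison in step 3 can be automated too. A sketch that flags citation decay between two months (the prompt names and scores are hypothetical):

```python
def month_over_month(last_month, this_month):
    """Compare two {prompt: 0-2 score} dicts and flag what changed."""
    report = {"gained": [], "decayed": [], "churn_rate": 0.0}
    changed = 0
    for prompt, before in last_month.items():
        after = this_month.get(prompt, 0)   # missing = no longer mentioned
        if after > before:
            report["gained"].append(prompt)
        elif after < before:
            report["decayed"].append(prompt)  # citation decay in action
        if after != before:
            changed += 1
    report["churn_rate"] = changed / len(last_month)
    return report

last = {"best hoa software": 2, "fix generic ai content": 1, "geo audit": 0}
now  = {"best hoa software": 1, "fix generic ai content": 1, "geo audit": 2}
print(month_over_month(last, now))
```

A churn rate in the 40–60% range would match the published month-to-month source rotation; the "decayed" list is your to-do list for the month.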

What the trends mean:

  • Visibility up, traffic flat: you're being mentioned but not linked. AI is citing your brand in the answer text; your URL just isn't making the cut. Focus on page structure and citable content formatting.

  • Traffic up, visibility flat: your cited pages are pulling harder, with more volume or more clicks per citation. Protect them with fresh data and answer-first copy so they stay in rotation.

The monthly prompt test also catches something GA4 can't: how you're being described when you appear.

Are you cited as a trusted resource or as one option among ten? Is the AI framing your service accurately?

That qualitative read can tell you a lot, and you can only get it by running the prompts yourself.

Your monthly GEO pulse check (~20 min/month):

  1. For each target prompt: am I cited?

  2. For each citation: am I linked?

  3. Scan the answers: are competitors cited instead?

  4. Compare to last month: what changed?

When the manual system gets unwieldy — when you're tracking 50+ prompts or need share-of-voice charts across multiple brands — that's when purpose-built tools like Profound, Rankscale AI, or Semrush's AI Visibility module earn their subscription.

These tools can automate the prompt testing and store results over time. But the manual system gets you further than most people expect, and it keeps you close to the actual answers your customers are seeing.

If you're not confident your content is set up to earn citations in the first place, the measurement system will surface the gap — it just won't close it. That's where a content audit for AI citability is useful: it tells you which pages have the structure, specificity, and signals that earn mentions, and which ones are getting skipped over.

Work with Brad (Fiverr Pro vetted · 4.9 stars · 1,600+ clients)

Not sure if your site is set up to be seen — or tracked — by AI?

The manual check in this post tells you where you stand. A GEO Content Audit tells you why — and what to do about it. I run your site through the same prompt-testing framework across ChatGPT, Perplexity, and Google AI Overviews, then map out exactly what’s working, what’s missing, and what to fix first.

What the audit covers

Cross-engine visibility check across your target prompts
GA4 AI traffic setup and baseline measurement
Page-level citation readiness review
Prioritized action plan — what to fix first and why
Get in touch to get started →

Frequently Asked Questions

  • How do I check whether ChatGPT is sending traffic to my site? Open GA4 and go to Reports → Acquisition → Traffic acquisition. Change the primary dimension to "Session source / medium" and search for "chatgpt." 

    If ChatGPT has cited your site with clickable links, you'll see sessions attributed to utm_source=chatgpt.com

    If you see nothing, either you're not being cited with trackable links, or the citations are happening without attribution — arriving as Direct traffic. The manual prompt check (running buyer prompts in ChatGPT with search on, then asking for source URLs) tells you which it is.

  • Does Perplexity traffic show up in GA4? Yes, but without UTM parameters. Perplexity passes a standard referrer domain (perplexity.ai) rather than appending its own tracking tags. 

    In GA4, look for sessions where the source contains "perplexity" under Traffic acquisition. Any UTM tags you see on Perplexity-driven traffic will be ones you placed on the URL yourself — Perplexity doesn't inject them.

  • What is AI citation decay? It's the way AI search results shift over time as models update, training data changes, and fresher sources get indexed. 

    Research shows 40–60% of cited sources change month-to-month across major AI engines. A mention earned today is not guaranteed next month. 

    This is why a one-time visibility check is a snapshot, not a strategy — you need a monthly cadence of prompt testing and GA4 review to catch the shifts and respond to them.

  • Do I need paid tools to track AI traffic? No. GA4 already captures most of what you need — ChatGPT referrals via utm_source=chatgpt.com, Perplexity referrals via the perplexity.ai domain, and Copilot via its referrer. 

    The manual prompt check covers visibility across ChatGPT, Perplexity, and Google AI Overviews. 

    Paid tools like Profound, Rankscale AI, or Semrush's AI Visibility module are useful when you're tracking many prompts at scale or need multi-client dashboards — but the manual system gets you further than most people expect.

  • How often should I re-check my AI visibility? Once a month is sufficient for most brands. Re-run your saved prompt set across ChatGPT, Perplexity, and Google AI Overviews and re-score each for mention and link presence. 

    Pull your GA4 AI traffic segment for the month and note sessions, top landing pages, and conversion trends. 

    Compare to the prior month and flag changes — new competitors appearing, pages dropping out, or traffic trends shifting. That's your GEO pulse check, and it takes about 20 minutes once the system is set up.

Brad Bartlett — Copywriter and Content Strategist based in Kansas City

Written by

Brad Bartlett

Brad is a copywriter and content strategist who helps creators, brands, and organizations build content that's actually worth reading — and built to be found. He specializes in conversion-focused copy, brand voice, and SEO and AI search optimization, with a straightforward philosophy: great content has to be authentic before it can perform. He works comfortably across the AI content space, helping clients use the tools without losing the voice. Fiverr Pro vetted, 4.9 stars out of 5 across 1,600+ clients.

Next

GPT-5.4 Changed Who Gets Cited in ChatGPT — And Most Brands Are Optimizing for the Wrong Model