The 3 Ways AI Agents & Bots Break Your Web Analytics (And Why JavaScript Tracking Fails)

Your analytics are lying to you. Snowplow's Nick Stanchenko breaks down three distinct challenges that arise when AI agents, scrapers, and bots interact with traditional JavaScript-based analytics tools like GA4.

The three visibility problems:

1️⃣ Skewed metrics – Scrapers run JavaScript, polluting your human behavior data

2️⃣ Invisible AI traffic – Most AI agents don't run JavaScript, creating massive blind spots

3️⃣ Agentic browsing – Human + AI agent activity in the same session breaks the old binary model

The assumption that sessions are either human OR bot is obsolete. Agentic browsers mean you need to detect and separate behaviors within a single session, not just filter entire sessions as "bot traffic."
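What per-event (rather than per-session) classification could look like, as a minimal sketch: this is an illustrative toy heuristic, not Snowplow's method. It flags stretches of a session where inter-event gaps are far faster than human interaction; the threshold and the single-signal approach are assumptions, and a real system would combine many behavioral signals.

```python
# Toy heuristic (illustrative only): classify each event inside a session
# instead of labeling the whole session. Here the one signal is the gap
# to the previous event -- sub-human gaps get flagged as agentic.

HUMAN_MIN_GAP_S = 0.5  # assumed threshold; real detectors use many signals

def flag_events(timestamps: list[float]) -> list[str]:
    """Label each event 'human' or 'agentic' by the gap to the prior event."""
    labels = ["human"]  # first event has no preceding gap to judge
    for prev, cur in zip(timestamps, timestamps[1:]):
        labels.append("agentic" if cur - prev < HUMAN_MIN_GAP_S else "human")
    return labels

# Mixed session: a human browses for a few seconds, then an agent
# fires off a rapid burst of events.
print(flag_events([0.0, 3.1, 6.4, 6.45, 6.5, 6.55]))
# → ['human', 'human', 'human', 'agentic', 'agentic', 'agentic']
```

The point of the sketch is the output shape: one label per event, so agentic stretches can be separated out without discarding the human portion of the session.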

What gets measured:

- Humans (mostly, minus opt-outs)

- JavaScript-enabled bots (scrapers, some search bots)

What goes invisible:

- Most AI agents (GPTBot, Claude, Perplexity crawlers)

- Server-side bot traffic

- Agentic portions of hybrid sessions
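Because the invisible traffic above never executes the JavaScript tracker, the only place it surfaces is in server-side request logs. A minimal sketch of a first-pass classifier (not Snowplow's implementation) is matching the request's User-Agent against known AI crawler tokens; the token list below is illustrative, not exhaustive, and vendors rename bots over time.

```python
# Known User-Agent substrings for major AI crawlers (illustrative list).
AI_CRAWLER_TOKENS = (
    "GPTBot",          # OpenAI's crawler
    "ClaudeBot",       # Anthropic's crawler
    "PerplexityBot",   # Perplexity's crawler
)

def classify_request(user_agent: str) -> str:
    """Label a server-side hit as 'ai_crawler' or 'other' by User-Agent."""
    ua = user_agent.lower()
    if any(token.lower() in ua for token in AI_CRAWLER_TOKENS):
        return "ai_crawler"
    return "other"

print(classify_request("Mozilla/5.0; compatible; GPTBot/1.2"))   # → ai_crawler
print(classify_request("Mozilla/5.0 (Windows NT 10.0) Chrome/126.0"))  # → other
```

User-Agent matching is only a starting point, since agents can spoof or omit these tokens, which is why server-side behavioral tracking matters rather than JS-only collection.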

Catch the full session here: https://snowplow.io/events/ai-agent-bot-behavioral-intelligence-in-the-agentic-era