How to Track and Optimize for LLM Bot Traffic
About This Episode
In this episode of the AI Agents Podcast, host Demetri Panici speaks with Kevin White, Head of Marketing at Scrunch, about how to identify AI bot traffic, analyze it, and optimize your content to ensure accurate representation in AI-driven results.
Here’s what you’ll learn:
- How to identify different types of AI bot traffic (training agents, retrieval bots, search crawlers)
- Where to detect bot traffic — including at the CDN or CMS layer
- How analytics tools reveal which models are crawling your site and which pages they visit most
- Why bot traffic volume may surprise you
Ready to take control of your AI traffic strategy? Watch now to learn how.
Subscribe to AI Agents Podcast Channel: https://link.jotform.com/subscribe-to-podcast
#LLMAgents #AICrawlers #AISEO #AIContentStrategy
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Sign up for free ➡️ https://link.jotform.com/ZrYYrqTs3L
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Follow us on:
Twitter ➡️ https://x.com/aiagentspodcast
Instagram ➡️ https://www.instagram.com/aiagentspodcast
TikTok ➡️ https://www.tiktok.com/@aiagentspodcast
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Transcript
A few things there. One is just identifying that traffic in general. We have an analytics product that will tell you what types of LLM agents are crawling your site: whether it's a training agent, a retrieval bot, or just your normal search index crawler. The way you identify that is you look at the traffic, and there are unique identifiers for each of those different types of bot traffic. And if you can identify it at the CDN layer or your CMS layer (we have partnerships with Vercel, Cloudflare, and all these sorts of CDNs), then you can, say, create analytics for it
so that you know which are the top models crawling your site and which are the top pages being crawled, and you get a real-time view of how much bot traffic you have. If you look at it, most people are surprised by how much bot traffic they're getting. So that is insightful in itself. Then, once you're able to identify that traffic, you can serve different pages, or a different experience, to that user, which in this case is a bot. And the way to surface that content, right now at least, is typically through markdown. So you can take the human-optimized page, which maybe has JavaScript and video and all this kind of
stuff on it that's dense with code, and convert it into markdown so that it's much more crawlable for these AI bots. You can also take the intent of the page and add additional content to it, like an FAQ section, that you maybe don't need to surface to the human visitor. It's just giving the AI bots more context and more accuracy about the intent of the page. So that's the next step after identifying the traffic: providing the most accurate representation of what the intent of that page is, so that the bot can take it, return it back to the user, and accurately represent what you're trying to express on that page.
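The identification step Kevin describes — spotting the unique identifiers in traffic and bucketing them into training agents, retrieval bots, and search crawlers — can be sketched roughly like this. The user-agent tokens below (GPTBot, ChatGPT-User, OAI-SearchBot, ClaudeBot, PerplexityBot, CCBot, Google-Extended) are ones the vendors have published, but the exact lists and which bucket each belongs in change over time, so treat the grouping as an assumption, not Scrunch's actual classification:

```python
from collections import Counter

# Assumed user-agent signatures per bot category; vendors update these,
# so a real deployment would keep this list current.
BOT_SIGNATURES = {
    "training": ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"],
    "retrieval": ["ChatGPT-User", "Claude-Web"],
    "search_index": ["OAI-SearchBot", "PerplexityBot", "Bingbot"],
}

def classify_agent(user_agent: str) -> str:
    """Return the bot category for a User-Agent header, or 'human'."""
    ua = user_agent.lower()
    for category, signatures in BOT_SIGNATURES.items():
        if any(sig.lower() in ua for sig in signatures):
            return category
    return "human"

def summarize(requests):
    """Build the analytics view from (user_agent, path) pairs:
    hit counts per (bot category, page), ignoring human traffic."""
    counts = Counter()
    for user_agent, path in requests:
        category = classify_agent(user_agent)
        if category != "human":
            counts[(category, path)] += 1
    return counts
```

Sorting the resulting counter answers the two questions from the episode: which models crawl your site the most, and which pages they hit hardest.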
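The second step — serving a different experience once the bot is identified — can be sketched as a simple content-negotiation function. This is a hypothetical illustration, not Scrunch's implementation: `html_page` and `markdown_page` stand in for whatever your CMS renders for each representation, and the FAQ block models the extra context added only for bots:

```python
# Assumed lowercase user-agent tokens for AI crawlers; real lists evolve.
AI_BOT_TOKENS = ("gptbot", "chatgpt-user", "oai-searchbot",
                 "claudebot", "perplexitybot", "ccbot")

def is_ai_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in AI_BOT_TOKENS)

# Extra context served only to bots, never to human visitors.
FAQ_MARKDOWN = "\n## FAQ\n- **What does this page cover?** A summary of its intent.\n"

def respond(user_agent: str, html_page: str, markdown_page: str):
    """Return (content_type, body) for a request.

    Bots get the lightweight markdown version plus the FAQ block,
    which gives them the page's intent without the JavaScript-heavy
    markup; humans get the normal HTML page unchanged.
    """
    if is_ai_bot(user_agent):
        return ("text/markdown", markdown_page + FAQ_MARKDOWN)
    return ("text/html", html_page)
```

In practice this branch would live at the CDN or CMS layer mentioned in the episode (e.g. edge middleware on Vercel or a Cloudflare Worker), so the bot never downloads the script-heavy page at all.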