As AI-powered search tools gain adoption (ChatGPT, Perplexity, Google’s AI Overviews, Claude, and others), we wanted to answer a straightforward question: Are automotive dealer websites ready for these systems to access and understand them?
We wanted something measurable. Can AI crawlers actually reach the site? Once there, can they parse what the dealership sells and where it’s located?
We couldn’t find a tool that answered this for automotive, so we built one.
What the AI Compatibility Analyzer Measures
The tool evaluates two dimensions:
Access
Can AI crawlers reach the site? The tool sends HTTP requests using actual AI bot user-agent strings: GPTBot (OpenAI), Claude-Web (Anthropic), PerplexityBot, Google-Extended, Bytespider (ByteDance), CCBot (Common Crawl), and others. It checks whether these requests succeed, get blocked, or return errors.
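For illustration, here is a minimal sketch of what that kind of access check can look like in Python, using the requests library. The user-agent strings are abbreviated, the example-dealer.com URL is hypothetical, and the logic is a simplification rather than the analyzer’s actual implementation.

```python
import requests

# Hypothetical target; replace with the site you want to test.
SITE = "https://www.example-dealer.com/"

# Abbreviated user-agent strings for a few of the crawlers named above.
AI_USER_AGENTS = {
    "GPTBot": "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)",
    "PerplexityBot": "Mozilla/5.0 (compatible; PerplexityBot/1.0; +https://perplexity.ai/perplexitybot)",
    "CCBot": "CCBot/2.0 (https://commoncrawl.org/faq/)",
}

def check_access(url: str) -> dict:
    """Map each bot name to the HTTP status it received, or the error it hit."""
    results = {}
    for bot, user_agent in AI_USER_AGENTS.items():
        try:
            resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
            # 200 suggests the crawler can reach the page; 403 or 451 usually means blocked.
            results[bot] = resp.status_code
        except requests.RequestException as exc:
            results[bot] = f"error: {type(exc).__name__}"
    return results

if __name__ == "__main__":
    print(check_access(SITE))
```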
Understanding
Can AI systems parse the content? Even if a crawler can access a page, it needs structured data to understand what the page represents. The tool evaluates JSON-LD schema markup: specifically, whether the site includes LocalBusiness or AutoDealer schema, Vehicle or Product schema on inventory pages, and proper Organization markup.
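As a sketch rather than the analyzer’s actual code, the snippet below shows one way to pull JSON-LD out of a page source and check for automotive-relevant types, assuming Python with BeautifulSoup. The AUTOMOTIVE_TYPES set and the helper names are illustrative.

```python
import json
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Schema types treated as automotive-relevant for this sketch.
AUTOMOTIVE_TYPES = {"AutoDealer", "LocalBusiness", "Vehicle", "Product", "Organization"}

def find_schema_types(html: str) -> set:
    """Return the set of JSON-LD @type values declared in the page source."""
    found = set()
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            continue  # malformed JSON-LD is effectively invisible to AI systems
        for item in (data if isinstance(data, list) else [data]):
            if not isinstance(item, dict):
                continue
            types = item.get("@type", [])
            found.update([types] if isinstance(types, str) else types)
    return found

def has_automotive_schema(html: str) -> bool:
    return bool(find_schema_types(html) & AUTOMOTIVE_TYPES)
```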
Note: This tool was designed specifically for automotive dealer websites. The structured data scoring evaluates automotive-relevant schema types. Results shouldn’t be extrapolated to other industries.
How Scoring Works
Sites receive a score out of 100, broken into three categories:
Blocking Prevention (25 points): Whether AI crawlers can access the site
Structured Data (60 points): JSON-LD schema (LocalBusiness, AutoDealer, Vehicle)
Discoverability (15 points): Robots.txt, sitemaps, crawl directives
A score of 80+ indicates strong AI compatibility. A score below 60 indicates significant optimization opportunities in access, structured data, or both.
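To make the rubric concrete, here is a hypothetical sketch of how the three categories could roll up to a score out of 100. Only the category maximums (25, 60, 15) come from the breakdown above; how points are split inside each category (the 30/30 and 8/7 splits, the per-bot proration) is assumed for illustration and is not necessarily how the analyzer weights things.

```python
def compatibility_score(blocked_bots: int, total_bots: int,
                        has_local_business: bool, has_vehicle_schema: bool,
                        has_robots_txt: bool, has_sitemap: bool) -> int:
    """Combine the three categories into a score out of 100 (illustrative weights)."""
    blocking = round(25 * (1 - blocked_bots / total_bots))                              # max 25
    structured = (30 if has_local_business else 0) + (30 if has_vehicle_schema else 0)  # max 60
    discoverability = (8 if has_robots_txt else 0) + (7 if has_sitemap else 0)          # max 15
    return blocking + structured + discoverability

# A site that allows all 8 tested bots, has LocalBusiness schema but no Vehicle
# schema, and publishes robots.txt but no sitemap: 25 + 30 + 8 = 63.
print(compatibility_score(0, 8, True, False, True, False))
```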
The Research
We conducted initial research on 100 dealer websites, selected to represent a cross-section of OEMs, website providers, and U.S. regions. After publishing the findings, we made the tool publicly available. Since then, more than 628 unique dealer websites have been scanned through community submissions.
Methodology: The community-submitted data is self-selected, not a random sample. We disclose this distinction for transparency. Both datasets showed similar patterns.
What the community data showed:
Average score: 33 out of 100
83.4% scored below 60
46.3% blocked at least one AI crawler
9.6% scored 80 or above
Among sites that don’t block AI crawlers, 70.6% still scored below 60
That last stat is important. Access alone isn’t enough. A site can allow every AI crawler in the world, but if there’s no structured data telling those systems what the business is and what it sells, the access doesn’t accomplish much.
The Technical Case for Structured Data Over Whitelisting
When we shared this research, some website providers responded with: “Our firewall verifies legitimate bots and lets them through. We whitelist the ones that matter.”
Historically, this is a reasonable security posture. But as an AI accessibility strategy, it has a fundamental scaling problem.
The AI landscape today includes crawlers from OpenAI, Anthropic, Google, Perplexity, ByteDance, Meta, Amazon, Apple, Microsoft, and Common Crawl, among others. But this is just the beginning. The number of AI systems that will want to crawl, index, and reference web content is growing, not shrinking.
Whitelisting works when you have a handful of known crawlers. It becomes a maintenance burden when you have dozens. It becomes untenable when you have hundreds.
The more sustainable approach is to focus on what you can control:
Structured data that any AI system can parse. JSON-LD schema markup is a standardized format. When a site includes proper LocalBusiness, AutoDealer, and Vehicle schema, any AI system, current or future, can understand what the business is, where it’s located, and what it sells. You implement it once; it works for every system that follows the standard.
Sensible default access policies. Rather than blocking by default and whitelisting exceptions, consider allowing by default with appropriate rate limiting and abuse prevention. This is the approach we’ve implemented for our own dealer websites at Savvy Dealer. It doesn’t mean removing security; it means configuring security so it doesn’t require constant updates as new legitimate crawlers emerge.
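As an illustration of that default-allow posture, the sketch below expresses it as a robots.txt policy and verifies it with Python’s urllib.robotparser. The disallowed paths and the example-dealer.com domain are hypothetical, and this is not a recommendation of specific rules for any particular site.

```python
from urllib.robotparser import RobotFileParser

# An illustrative default-allow policy: everything is crawlable except a few
# sensitive paths, so there is no per-crawler whitelist to maintain.
ROBOTS_TXT = """\
User-agent: *
Disallow: /checkout/
Disallow: /my-account/
Allow: /

Sitemap: https://www.example-dealer.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Every crawler, including ones that don't exist yet, gets the same answer.
for bot in ["GPTBot", "Claude-Web", "PerplexityBot", "SomeFutureBot"]:
    print(bot, parser.can_fetch(bot, "https://www.example-dealer.com/inventory/used/"))
```

The design point is that the policy names paths rather than crawlers, so it does not need updating each time a new legitimate bot appears.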
Why JSON-LD Matters for AI Systems
Traditional search engines built sophisticated systems to infer meaning from unstructured HTML. Google’s algorithms can figure out that a page is about a car dealership based on content patterns, even without explicit markup.
AI systems work differently. Large language models (LLMs) are trained on structured and unstructured data, but they perform better when information is explicitly defined. JSON-LD provides that explicit definition.
Consider the difference:
Without Schema
An AI crawler visits a page, but it doesn’t see the page the way you do in a browser. Most crawlers don’t load images, run animations, or execute the JavaScript that assembles much of a modern website. They see a simpler version of the page and often miss the inventory, pricing, and business details that only appear after that client-side code has run. Even when content is visible, the crawler has to guess what it means.
With Schema
JSON-LD markup is embedded directly in the page source, so crawlers see it immediately, with no client-side rendering required. It explicitly tells them: this is an auto dealer at this address, part of this organization, selling these specific vehicles. No guessing. No waiting for the page to build itself.
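Here is a hypothetical example of that kind of markup, built in Python and printed as the script tag that would sit in the page source. The dealership name, address, and vehicle details are invented; the schema.org types used (AutoDealer, PostalAddress, Offer, Vehicle) are standard vocabulary.

```python
import json

# Invented dealership and inventory details, for illustration only.
dealer_schema = {
    "@context": "https://schema.org",
    "@type": "AutoDealer",
    "name": "Example Motors",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
    "parentOrganization": {"@type": "Organization", "name": "Example Auto Group"},
    "makesOffer": {
        "@type": "Offer",
        "price": "27995",
        "priceCurrency": "USD",
        "itemOffered": {
            "@type": "Vehicle",
            "name": "2022 Honda CR-V EX",
            "vehicleModelDate": "2022",
            "mileageFromOdometer": {
                "@type": "QuantitativeValue",
                "value": 28500,
                "unitCode": "SMI",
            },
        },
    },
}

# Embed in the page head so the data is visible without any client-side rendering.
print(f'<script type="application/ld+json">{json.dumps(dealer_schema, indent=2)}</script>')
```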
This distinction matters more as AI systems become the interface through which consumers discover local businesses. When someone asks ChatGPT or Perplexity, “Where can I buy a used Honda CR-V near me?” the system that provides an accurate, structured answer will be the one with access to accurate, structured data.
The Third-Party Question
There’s a strategic dimension to this that goes beyond technical optimization.
When AI systems can’t access or understand a dealer’s website, they don’t give up. They look elsewhere. Dealer information exists across the web, Cars.com, DealerRater, Google Business Profile, Yelp, Edmunds, and OEM sites. AI tools will pull from whatever sources are accessible and well-structured.
If your website isn’t the source, a third party becomes the source.
That has implications:
You don’t control the accuracy of the information AI surfaces about your business
You don’t control the narrative: how your dealership is described or positioned
You risk deepening your dependency on third-party providers you’re likely already paying for leads and distribution
This isn’t a future problem. It’s happening now. Ask ChatGPT or Perplexity about a dealership and observe where the information comes from. In many cases, it’s not the dealer’s own website.
Dealers who want to be the primary source of truth about their own business, rather than having that controlled by third parties, should consider whether their website is positioned to serve that role.
What This Means for Dealers and Providers
We’re not suggesting that traditional SEO doesn’t matter. Google still drives the majority of traffic to dealer websites, and that won’t change overnight. But the data on AI adoption in car shopping is trending in one direction:
44% of car shoppers have already used AI tools, and 97% say it will influence their decisions. (Cars.com)
One in four car buyers is using AI to get a better deal, and 40% of future buyers plan to use AI in their search. (CarEdge)
AI referral traffic to retail sites is up 10x since mid-2024. (Adobe)
The sites that will perform well in this environment share two characteristics: AI systems can reach them, and once there, AI systems can understand them. That’s access plus structured data.
The good news is that these are solvable problems. Structured data implementation doesn’t require rebuilding a website. Access policies can be adjusted at the configuration level. Neither requires massive investment; they’re closer to website hygiene than website overhaul.
Try the Tool
We made the AI Compatibility Analyzer available for free. Enter a dealer website URL to see a breakdown of access issues, structured data presence, and specific recommendations.
Whether you’re a dealer evaluating your own site, a provider assessing your platform, or just curious about how AI compatibility works in practice, the tool provides a starting point for the conversation.
AI Compatibility Analyzer for Dealer Websites
©2026 DK New Media, LLC. All rights reserved. | Originally published on Martech Zone: We Built a Tool to Measure AI Compatibility for Dealer Websites. Here’s What 700+ Scans Revealed.