
We ran AI visibility audits on 24 B2B SaaS companies over the past three months.
The companies ranged from Series A to Series C. They included Dreamdata, Avoma, metadata.io, Sybill, Arrows, Warmly, Siena AI, Coldreach, Sendspark, REGAL, Visto, and others across revenue intelligence, sales engagement, and marketing automation.
Every single one invests in content marketing. Most have solid SEO fundamentals. Some are even selling AI-powered tools themselves.
The average AI visibility score across all 24 companies was 40 out of 100.
That number should concern you.
Our audit evaluates five categories: structured data, content formatting, crawler accessibility, citation presence, and technical readiness. Each category contributes to a composite score out of 100. We tested citation presence across ChatGPT, Perplexity, and Google AI Overviews using category-relevant queries.
The score range across all 24 companies: 38 (lowest) to 62 (highest). Nobody cracked 65.
Only 3 of the 24 companies we audited (12.5%) had any JSON-LD schema markup on their site.
Schema markup is how you give AI systems structured, machine-readable information about your content. Without it, you're forcing LLMs to parse your HTML and guess what matters.
Most of these companies have engineering teams that could implement schema in a single sprint. They just haven't prioritized it.
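You can check any page yourself. Here's a minimal sketch in Python (the class and function names are ours, not a standard tool) that counts the JSON-LD script blocks in a page's HTML:

```python
from html.parser import HTMLParser

class JSONLDDetector(HTMLParser):
    """Counts <script type="application/ld+json"> blocks in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; a valueless attribute gives None
        attr_type = dict(attrs).get("type") or ""
        if tag == "script" and attr_type.lower() == "application/ld+json":
            self.count += 1

def has_schema_markup(html: str) -> bool:
    """True if the page contains at least one JSON-LD block."""
    detector = JSONLDDetector()
    detector.feed(html)
    return detector.count > 0
```

Fetch your homepage and a few top blog posts and run their HTML through this. Zero hits means you're in the majority with no structured data at all.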
This was the most common gap across all 24 audits.
Blog posts opened with long anecdotal intros. Key definitions were buried in paragraph four. Answers to obvious questions were scattered across multiple sections without clear headers.
Only 8% had FAQ sections structured for AI extraction. That's 2 out of 24.
LLMs need content formatted in a specific way to extract and cite it reliably. That means answer-first openings, key definitions near the top, descriptive headers that match the questions buyers actually ask, and clearly structured FAQ blocks.
If your blog reads like an essay, AI can't pull a clean answer from it. And if AI can't pull a clean answer, it won't cite you.
This one surprised us.
Not one of the 24 companies had explicitly allowed AI crawlers in its robots.txt file. Most had standard Googlebot directives and nothing else.
Here's the thing: new AI crawlers like GPTBot, ClaudeBot, and PerplexityBot check your robots.txt. If you haven't addressed them, you're leaving your AI visibility to chance.
Worse, one company was actively shooting itself in the foot.
Sybill, an AI-powered conversation intelligence platform, was blocking AI crawlers in its robots.txt while simultaneously trying to build visibility in AI search. It was investing in content that AI literally could not access.
That's like running ads to a landing page that's behind a login wall.
Most blogs we audited had weak internal linking structures. Posts existed as islands. Related content wasn't connected. Topic clusters were incomplete.
This matters for AI visibility because LLMs evaluate topical authority partly through content relationships. If you have 15 blog posts about revenue attribution but none of them link to each other, AI systems have a harder time recognizing you as an authority on that topic.
Dreamdata was an interesting case here. They rank well in traditional search for revenue attribution queries. They're even visible in AI search, showing up as the #2 result for their category in Perplexity. But their technical AEO score was still low because their content structure doesn't help AI systems extract and organize information efficiently.
Being visible today doesn't mean you're optimized. It means you have brand equity that's compensating for technical gaps. That won't last forever.
This was the most ironic finding.
Visto, a company that sells GEO optimization services, had zero schema markup on their own website. No JSON-LD. No structured data of any kind.
Several other companies selling AI-powered marketing and sales tools scored below the average of 40/100. The assumption seems to be that building AI tools and being visible to AI tools are the same skill set. They're not.
Optimizing for AI visibility is a distinct discipline. It requires specific technical implementation and content formatting that most teams haven't learned yet, even teams that work with AI every day.
Google AI Overviews are expanding. ChatGPT search is growing. Perplexity's user base is climbing.
Your buyers are already using these tools to research solutions. When a VP of Sales asks ChatGPT "what's the best conversation intelligence tool for mid-market teams?", your brand either shows up or it doesn't.
The 29% citation rate we found means roughly 7 out of 24 companies appeared in at least one AI citation test. The other 17 were invisible.
Those 17 companies are spending money on content that AI search can't find, can't parse, or can't cite. That's not a future problem. That's a today problem.
Across our scoring rubric, half of the 24 companies scored below 40. These aren't small startups with no marketing budget. These are funded B2B SaaS companies with content teams, SEO strategies, and six-figure annual marketing spend.
The gap isn't resources. It's awareness.
You don't need to overhaul your entire content strategy. Start with these five fixes that take less than a week each.
Start with BlogPosting schema for your highest-traffic articles and Organization schema for your homepage. There are free generators online, or your dev team can implement it from schema.org documentation.
This is the single highest-impact technical change you can make. It takes a few hours and immediately makes your content more parseable for AI systems.
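As a sketch of what that looks like (the headline, date, and URLs below are placeholders for your own), a BlogPosting block goes in the page's head:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "What Is Revenue Attribution?",
  "datePublished": "2025-01-15",
  "author": {
    "@type": "Organization",
    "name": "Your Company",
    "url": "https://yourdomain.com"
  }
}
</script>
```

The Organization schema for your homepage follows the same pattern, with "@type": "Organization" and your company's name, logo, and social profiles.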
Check whether you're blocking GPTBot, ClaudeBot, or PerplexityBot. If you are, decide whether that's intentional. If you want AI visibility, you need to allow these crawlers.
Add explicit allow directives:
User-agent: GPTBot
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: PerplexityBot
Allow: /
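To verify the directives behave the way you intend, Python's standard library can evaluate a robots.txt file against each crawler. A minimal sketch (the function name and example URL are ours):

```python
from urllib.robotparser import RobotFileParser

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def ai_crawler_access(robots_txt: str, url: str = "https://example.com/") -> dict:
    """Map each AI crawler to True/False: can it fetch the given URL
    under this robots.txt?"""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_CRAWLERS}
```

Paste in the contents of yourdomain.com/robots.txt. Any False in the result means that crawler can't index your content. Note that a crawler with no matching rules defaults to allowed, so the explicit Allow directives above are about removing ambiguity, not changing the default.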
Take your five highest-traffic posts and restructure them: move the core answer into the opening paragraph, define key terms near the top instead of in paragraph four, and add descriptive headers that match the questions buyers actually ask.
This doesn't require rewriting. It's restructuring. You're taking the same content and making it extractable.
Your product pages and use case pages should have FAQ sections. Not buried at the bottom in an accordion. Visible, structured, with clear question-and-answer formatting.
Use the questions your sales team hears most often. These are the same questions buyers are asking AI tools.
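To make those sections machine-readable as well, mark them up with FAQPage schema. A minimal sketch (the question and answer text are placeholders for your own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Does your tool integrate with Salesforce?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Your real answer, written as a direct, self-contained response."
      }
    }
  ]
}
</script>
```

Add one Question object per FAQ entry, and keep the visible on-page text identical to what's in the markup.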
Go to ChatGPT, Perplexity, and Google AI Overviews. Search for your core category queries the way a buyer would: "what's the best [your category] tool for [your segment]," "[your category] software comparison," "how does [your product category] work."
If you don't appear in any results, you have a baseline. If you appear in some, note which content is being cited and reverse-engineer why.
The three companies that scored above 55 shared a few traits:
They had at least some structured data. Not perfect implementation, but something. Even basic BlogPosting schema put them ahead of 88% of the group.
Their content led with answers. Instead of "In today's fast-paced business environment..." they opened with "Revenue attribution is the process of identifying which marketing touchpoints drive closed-won deals." Direct. Extractable. Citable.
They had topical depth. Not just one blog post per topic, but clusters of related content with clear internal linking. This signals authority to both traditional and AI search.
None of them were doing anything exotic. They were doing the basics that 88% of their peers weren't.
If you're a B2B SaaS company relying on content marketing for growth, AI visibility isn't optional anymore. It's not a 2027 priority. It's a right-now priority.
The average score of 40/100 tells us that most companies haven't started. That's bad news for them. But it's good news for you, if you start now.
The bar is low. A few weeks of focused work on structured data, content formatting, and crawler accessibility can put you ahead of the majority of your competitive set.
The companies that figure this out in 2025 will own the AI search results in 2026. The companies that wait will wonder where their organic traffic went.
An AI visibility score measures how well your website and content are optimized to appear in AI-powered search tools like ChatGPT, Perplexity, and Google AI Overviews. It evaluates structured data, content formatting, crawler accessibility, citation presence, and technical readiness on a scale of 0-100.
AEO is the practice of optimizing your content to be found, extracted, and cited by AI-powered answer engines. It differs from traditional SEO because it focuses on making content machine-parseable and directly answerable, not just rankable.
Traditional SEO optimizes for ranking in a list of blue links. AEO optimizes for being the extracted answer. This requires structured data (JSON-LD), answer-first content formatting, FAQ sections, and explicit AI crawler permissions. You need both, but most companies are only doing SEO.
JSON-LD is a structured data format that helps machines understand your content. It tells AI systems what type of content you have (article, FAQ, product, organization) and provides key details in a standardized format. Without it, AI has to guess what your content means.
Check your robots.txt file (yourdomain.com/robots.txt) for directives related to GPTBot, ClaudeBot, and PerplexityBot. If these crawlers are blocked or not mentioned, AI tools may not be able to index your content.
Based on our audit of 24 companies, the average is 40/100 and the highest score was 62/100. A score above 55 puts you in the top tier. Most companies have significant room for improvement, which means early movers have a real advantage.
The five fixes outlined in this article can each be completed in under a week. Most companies can meaningfully improve their score within 30 days by implementing structured data, reformatting key content, and updating crawler permissions.
Want to see how your company scores? We run free visibility audits for B2B SaaS companies. Book 20 minutes at supermarketers.ai.

Book a strategy call. We'll audit where you're invisible, show you where competitors are winning, and map out exactly what a visibility system would look like for your business—even if we're not the right fit.