Who Owns AI Visibility? We Analysed 8,571 Citations to Find Out
Research — 8,571 AI citation occurrences · 3,773 URLs · 5 engines · 5 regions · April 2026


We mapped every URL cited by ChatGPT, Grok, Google AI Mode, Google AI Overview, and Perplexity when answering B2B software queries — across 3,773 unique URLs, 1,767 domains, and five global markets. The results upend almost every assumption the industry holds about how AI visibility actually works.

8,571 Total citation occurrences
3,773 Unique URLs analysed
1,767 Distinct domains
5 AI engines tracked

Where this data comes from

AI answer engines — ChatGPT, Grok, Google AI Mode, Google AI Overview, Perplexity — are rapidly becoming the first stop for B2B software research. Buyers ask “what’s the best software for X” and receive curated answers with citations. Those citations function as recommendations.

Understanding which content gets cited — and why — is the new frontier of B2B visibility strategy. We used Rankscale to export citation data across a B2B software category, capturing every URL cited across multiple query themes and five regions.

The result is 8,571 total citation occurrences across 3,773 unique URLs and 1,767 distinct domains. Every number in this analysis is drawn directly from that dataset.
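The headline counts can be derived from a flat citation export in a few lines of Python. The tuple layout below is a hypothetical stand-in for the actual Rankscale export schema, used only to show how occurrences, unique URLs, and distinct domains relate:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical export rows: (engine, region, cited_url).
# Column layout is illustrative, not Rankscale's actual schema.
rows = [
    ("ChatGPT", "GB", "https://vendor-a.com/blog/best-cmms-software"),
    ("Perplexity", "GB", "https://vendor-a.com/blog/best-cmms-software"),
    ("Grok", "GB", "https://www.g2.com/categories/cmms"),
]

total_occurrences = len(rows)                 # every citation event counts once
unique_urls = {url for _, _, url in rows}     # de-duplicated cited URLs
domains = Counter(urlparse(url).netloc for url in unique_urls)

print(total_occurrences, len(unique_urls), len(domains))
```

The same three aggregations over the full export yield the 8,571 / 3,773 / 1,767 figures quoted throughout this analysis.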

Data source breakdown
Citation distribution by AI engine
ChatGPT and Grok together account for 55% of all citations in the dataset
ChatGPT 28% 2,363 cites
xAI Grok 27% 2,277 cites
Google AI Mode 18% 1,513 cites
Google AI Overview 12% 1,037 cites
Perplexity 12% 995 cites

Vendors dominate — overwhelmingly

The single most important finding in the entire dataset: 60.3% of all AI citations point back to a vendor’s own website.

Not a journalist. Not an analyst. Not a review site. A software vendor, publishing content about its own category.

This finding holds across every content type we examined. Even the broad “blog” category, which intuitively feels like independent editorial, splits roughly half-and-half once you separate vendor content marketing from genuinely independent publishing.

60%
of all AI citations in B2B software categories point back to a vendor’s own website. Glossary pages are even more lopsided, at 71% vendor-owned.

The implication is profound. AI visibility is not a PR and media relations challenge. It is a content publishing challenge — and the content you need to publish is primarily on your own domain.

All 8,571 citations · source ownership
Who AI engines actually cite
Broken down by the type of organisation that owns the cited content
Vendor brand (60.3%)
Independent blog (20.0%)
Comparison portal (10.5%)
YouTube (2.5%)
Other (6.7%)

Glossary pages only — 909 citations

Vendor brand (71.2%)
Independent blog (19.8%)
Knowledgebase (7.3%)
Comparison portal (1.0%)

Three content types capture nearly everything

Across 8,571 citations, the content landscape is surprisingly concentrated. Three URL archetypes — listicles, glossary pages, and review portals — account for the overwhelming majority of what AI engines cite.

Everything else — YouTube, Wikipedia, Reddit, LinkedIn, news media, academic research — fills in the remaining margin. Valuable, but not the primary battleground.

Citation occurrences by content archetype
The three URL types that win the citation game
“Best of” listicles
1,530
Product pages
1,400+
Generic blog posts
~1,300
Glossary / “what is” pages
887
Comparison portals
395
YouTube
211
Reddit / community
109
Wikipedia
107

The top domains driving the conversation

Looking at the domains generating the most citations reveals the competitive landscape clearly. Note the mix of vendor-owned and third-party sources — and the relative absence of media or news outlets.

Top cited domains · all citation types
Where AI engines go most often
Breakdown by domain
Top 15 cited domains with type and citation share
Domain · Type · Citations · Citation share

Listicles: the counterintuitive playbook that actually works

1,530 citation occurrences flow through “best X software” listicles. These are the single highest-volume citation type in the dataset.

The counterintuitive part: many of the highest-cited listicles are published by vendors themselves — and they include themselves in the ranking.

Vendors are openly writing “the best software in our category” posts, ranking themselves alongside competitors, and AI engines cite these pages as if they were neutral third-party assessments. The top-cited vendor-written listicle in our dataset is the second most-cited URL in the entire study — across all 3,773 URLs.

Key data point

The highest-cited vendor listicle in the dataset outperforms the vast majority of independent review portal pages on a per-URL citation basis. There is no editorial convention preventing vendors from doing this.

The “independent blog” category accounts for 25.7% of listicle citations — but this figure is likely overstated. Many of those sites are agency SEO content farms or vendor-adjacent publications rather than genuinely impartial sources. The true independent third-party share of listicle citations is closer to the comparison portal slice at around 12%.

This is not a grey area. Publishing a “best software in your category” post, including your own product, and doing it well enough that AI engines treat it as a credible source — this is the dominant playbook. Every week you don’t have that content is a week competitors collect citations you don’t.

“Best of” listicles — 1,826 citations
Who owns the listicle citations
Even in ostensibly impartial “best of” roundups, vendor content dominates
Vendor brand (59.3%)
Independent blog (25.7%)
Comparison portal (12.2%)
Community (1.8%)
News / editorial (1.0%)
Independent third-party share
~12%
The real impartial slice, once SEO content farms are excluded from the “independent blog” count
Vendor + adjacent content
~88%
The true combined share of vendor-controlled or vendor-adjacent listicle content

Glossary pages are a quiet winner hiding in plain sight

887 citations go to definition-style “what is X” content. Of those, 71.2% are published by software vendors themselves.

This is the most lopsided finding in the dataset. When an AI engine answers “what is a CMMS?” or “what does CAFM stand for?”, it overwhelmingly cites vendor-published definition pages — not Wikipedia, not TechTarget, not academic sources.

The winning pages are not sophisticated. They are simple, well-structured definition pages that explain a category term clearly and completely. The leaders in this category — large platform vendors — don’t have anything technically impressive on these pages. They just have one well-structured definition page per acronym, and they published them years before the AI visibility conversation started.

Opportunity

Every category acronym that lacks a definition page on your own domain is a missed citation opportunity. Every time an AI engine answers that question, it cites a competitor instead of you.

The investment required is minimal. The upside — being the cited source every time a buyer asks what your category even means — is disproportionate. This is the clearest example in the dataset of a low-effort, high-return AI visibility play.

Glossary / definition pages — 909 citations
71% of “what is X” answers cite a vendor — not an independent source
71.2%
Vendor-published definition pages
647 of 909 citations
19.8%
Independent blog coverage
180 of 909 citations
7.3%
Independent knowledgebases (TechTarget etc.)
66 of 909 citations

Review portals are AI infrastructure, not just sales tools

Gartner, G2, Capterra, Comparesoft, Software Advice, and TrustRadius appear in the dataset at a frequency that belies their relatively small number of cited URLs.

Gartner alone delivers 134 citations from just 23 URLs — a citation density of 5.8 per URL. That is the highest citation-per-page ratio of any major source category in the entire dataset. A single well-maintained Gartner Peer Insights profile generates more AI citations than almost any other content type on a per-page basis.

5.8
Gartner’s citations per URL: 23 URLs generate 134 citations, roughly 2.6× the dataset-wide average of about 2.3 citations per URL, and more citations per page than any other major source type.
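The per-URL density figures above can be reproduced with a short aggregation over the same flat export. The rows below are illustrative stand-ins, not dataset records; the logic is simply citations divided by unique URLs per domain:

```python
from collections import Counter, defaultdict

# Illustrative (domain, cited_url) citation events. Repeated URLs count
# as separate occurrences, mirroring how density is measured above.
citations = [
    ("gartner.com", "https://gartner.com/reviews/market/cmms/vendor-a"),
    ("gartner.com", "https://gartner.com/reviews/market/cmms/vendor-a"),
    ("gartner.com", "https://gartner.com/reviews/market/cmms/vendor-b"),
    ("vendor-a.com", "https://vendor-a.com/blog/best-cmms"),
]

per_domain_cites = Counter(domain for domain, _ in citations)
per_domain_urls = defaultdict(set)
for domain, url in citations:
    per_domain_urls[domain].add(url)

# Density = citation occurrences / unique cited URLs, per domain.
density = {d: per_domain_cites[d] / len(per_domain_urls[d]) for d in per_domain_cites}
print(density)  # gartner.com: 3 cites / 2 URLs = 1.5
```

Applied to the full dataset, this is the computation behind Gartner’s 134 / 23 ≈ 5.8 citations per URL.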

The implication is significant for how teams prioritise. Review platform maintenance has traditionally sat in the sales or customer success function — managing testimonials, requesting reviews, responding to ratings. The data reframes this completely. These platforms are AI visibility infrastructure.

An unclaimed, outdated, or poorly maintained listing on Gartner, G2, or Capterra is not just a missed sales signal. It is a missed AI citation — potentially thousands of times per month, across five AI engines, in every market where buyers are researching your category.

Strategic reframe

Review platform investment is now a top-of-funnel AI visibility play, not just a mid-funnel sales enablement task. The ROI calculation changes significantly when you factor in citation volume.

Citation density by source category
Review portals punch well above their weight
Citations per URL: Gartner’s 5.8 is roughly 2.6× the dataset average of about 2.3

Reddit matters more than its citation count suggests

Reddit generates 109 citations in the dataset. That is a small number in absolute terms — 1.3% of all occurrences. But the distribution within that 1.3% is striking.

A single Reddit thread on a relevant subreddit ranks among the top 15 “best of” sources in the entire dataset — alongside content from major platforms, comparison portals, and well-resourced vendor blogs.

This is not a coincidence. LLMs treat peer-to-peer buyer conversations as high-trust signals. The reasoning is straightforward: real buyers comparing notes on industry subreddits are not optimising for search rankings or AI citations. The content is therefore treated as more authentic than vendor-published material.

Community signal

One genuine community discussion, organically generated, can outperform dozens of purpose-built vendor listicles in citation quality — even if not in citation volume. The value is disproportionate to the effort.

The right play here is not aggressive community marketing. It is genuine participation — answering real buyer questions in relevant communities, without a promotional agenda. The citations that result are among the most valuable in the dataset because they signal buyer trust to AI systems.

Editorial media is nearly invisible to AI engines

This is the finding most likely to disrupt traditional B2B marketing budgets.

Only ~1% of citations in the dataset came from genuine editorial or news media. Measured by AI citation volume, trade press coverage, journalist features, industry analyst reports, and traditional PR placements are almost entirely absent.

The brands generating the most AI citations are not doing more PR. They are not chasing analyst reports or trade press bylines. They are publishing relentlessly on their own domains.

Important caveat

This does not mean editorial coverage has no value. Brand credibility, backlinks, and direct buyer reach remain important. But editorial coverage is not an AI visibility strategy on its own — and treating it as one will produce disappointing results.

The data suggests a clear reallocation of attention. Content resources that currently flow toward PR campaigns, journalist relationships, and media outreach should be partially redirected toward owned content publishing — specifically the three archetypes that generate the majority of citations: listicles, glossary pages, and optimised review profiles.

Geographic concentration is both a risk and an opportunity

88% of citations in this dataset come from a single region. The international spread — across four other major markets — accounts for only about 8% of all occurrences combined.

This finding cuts both ways. It is a risk for brands that assume AI visibility in one market translates globally. It does not. AI engines are regionalised in meaningful ways, and citation landscapes vary significantly across geographies.

But it is also an opportunity — specifically for brands with genuine international presence. The equivalent “best of” and “what is” content clusters in non-English markets are typically far less developed than their English-language counterparts. Competition for AI citations in Portuguese, Spanish, French, and German markets is dramatically lower than in English.

Opportunity

Native-language glossary and listicle content in non-English markets can achieve AI citation dominance far more quickly than in crowded English-language markets — often with a fraction of the content investment required.

Citation distribution by region
Geographic concentration in the citation dataset
One region accounts for 88% of all citations — international visibility is thin across all brands
88%
GB
7,506 cites
~3%
PT-BR
~260 cites
~2%
ES
~180 cites
~2%
PT-PT
~170 cites
~1%
FR
~80 cites

The AI citation market is overwhelmingly owned media, not earned media. The brands winning are not the ones with the best press coverage — they are the ones publishing most consistently on their own domains.

— Analysis finding, 8,571 citations across 5 AI engines and 5 regions

What to do with this data

The dataset paints a clear picture. AI visibility in B2B software categories is a content publishing challenge, not a PR or media relations challenge. The brands generating the most citations have mastered owned media — not earned media.

Six concrete actions follow from the data, ordered by the citation leverage each delivers.

Six actions ordered by citation leverage
1
Publish “best of” listicles on your own domain for every category term buyers search. Include yourself. The data shows this is standard practice — and it works. The highest-cited vendor listicle in our dataset is the second most-cited URL across all 3,773 analysed.
2
Create a definition page for every acronym in your category. “What is X” content is 71% vendor-published. If you do not have a definition page for a term, a competitor has the citation every time that question is asked — which is thousands of times per month.
3
Treat Gartner, G2, Capterra, and Comparesoft as live AI infrastructure. Maintain them actively. Gartner delivers 5.8 citations per URL — the highest density in the dataset. A strong profile here generates more AI citations per page than almost any other investment.
4
Participate genuinely in community discussions on Reddit and relevant forums. One authentic thread can rank alongside well-resourced vendor content. Do not market — answer real questions. The citation value of peer-to-peer signals is disproportionate to the effort required.
5
Build native-language listicle and glossary content for non-English markets if you operate internationally. AI citation competition in non-English markets is dramatically lower than English. The opportunity to achieve citation dominance is significantly easier and faster right now.
6
Reframe how you evaluate PR and media. Editorial coverage remains valuable for brand credibility and backlinks. But it accounts for ~1% of AI citations — it is not an AI visibility strategy on its own. Reallocate content resources accordingly.
Finding 01
Core finding 60%

Vendors dominate all citations

60.3% of all AI citations point to a vendor’s own website. The figure rises to 71% for definition and glossary content.

Finding 02
Content types 3

Three archetypes capture everything

Listicles, glossary pages, and review portals account for the overwhelming majority of all AI citations in the dataset.

Finding 03
Highest leverage 1,530

Listicles are the citation engine

1,530 occurrences flow through “best of” posts — mostly vendor-written, including the vendor itself in the ranking.

Finding 04
Quick win 887

Glossary pages are underexploited

887 citations flow to definition content. 71% vendor-published. Low effort, high return. One page per acronym is enough.

Finding 05
Infrastructure 5.8

Review portals deliver outsized density

Gartner’s 23 URLs generate 134 citations, or 5.8 per URL, roughly 2.6× the dataset average. The highest citation density of any source type.

Finding 06
Underrated Top 15

Reddit outperforms most editorial

A single community thread ranks among the top 15 “best of” sources. Peer signals carry disproportionate trust weight.

Finding 07
Reframe ~1%

Editorial media is nearly invisible

Real news and editorial coverage accounts for ~1% of all citations. Traditional B2B PR is not an AI visibility strategy.

Finding 08
Opportunity 88%

Geographic concentration creates gaps

88% of citations come from one region. Non-English markets are nearly wide open — far less competitive for AI citations.


Methodology

Citation data was exported from Rankscale across a defined B2B software category. The export covers 3,773 unique URLs and 8,571 total citation occurrences across five AI engines: ChatGPT, xAI Grok, Google AI Mode, Google AI Overview, and Perplexity. Five regions were included: GB (dominant at 88%), PT-BR, ES, PT-PT, and FR.

Vendor vs third-party classification: any domain with at least one product, product blog, or owned content row was treated as a software vendor. The top blog and listicle domains were manually audited and obvious vendor sites appearing only via blog content were reclassified. All remaining domains were treated as third party.
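The classification rule described above can be sketched as a small function. The content-type labels, field names, and override mechanism here are assumptions for illustration; the actual export schema and audit process may differ:

```python
# Content types that mark a domain as vendor-owned. Labels are
# hypothetical stand-ins for whatever the export actually uses.
VENDOR_SIGNALS = {"product", "product_blog", "owned_content"}

def classify_domain(rows_for_domain, manual_overrides=None):
    """Mark a domain 'vendor' if any of its rows is vendor-owned content,
    unless a manual audit override (the reclassification step) says otherwise."""
    overrides = manual_overrides or {}
    domain = rows_for_domain[0]["domain"]
    if domain in overrides:
        return overrides[domain]
    types = {row["content_type"] for row in rows_for_domain}
    return "vendor" if types & VENDOR_SIGNALS else "third_party"

rows = [
    {"domain": "vendor-a.com", "content_type": "blog"},
    {"domain": "vendor-a.com", "content_type": "product"},
]
print(classify_domain(rows))  # vendor: one product row is enough
```

The "at least one vendor-owned row" rule deliberately errs toward the vendor label, which the manual audit of top blog and listicle domains then corrects in both directions.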

The “independent blog” category likely overstates genuinely impartial coverage. Many domains in this category are agency SEO content farms or vendor-adjacent publications. True independent third-party coverage is closer to the comparison portal slice.
