For one company in our portfolio, a single report attracts 40% of all their AI referral traffic. For another, an explainer on carbon emissions in their niche brings in more AI referrals than the rest of their website combined.
We provide marketing support for 20+ organisations developing deeptech and climate adaptation solutions across biotechnology, nature data, construction, and energy. None of them hired us specifically to grow AI referral traffic, but we spotted these patterns in the analytics we already monitor. What started as an accidental observation became a systematic investigation. After reviewing 12 months of data across 17 sites, the picture is consistent: AI tools have strong preferences about what they cite, and most companies are producing the wrong content for them.
That gap is starting to matter. B2B buyers complete roughly 70% of their evaluation before contacting a vendor [3]. They spend 27% of their buying time researching independently online and just 17% in meetings with potential suppliers [4]. What ChatGPT, Perplexity, or Claude tells them about your company - or whether it mentions you at all - depends on what's on your website. One study found that structured optimisation can improve visibility in AI-generated responses by up to 40% [1].
This article breaks down what we learned: which content types attract AI referrals, which don't, where the traffic actually comes from, and what to do about it.

If your website doesn't say it, AI tools won't either.
Several organisations in our portfolio have strong sector reputations built through conferences, peer networks, partnerships, and years of direct relationships. However, the websites they built before we started working together functioned as digital brochures: a homepage, an about page, and some simple news items. When we checked what AI tools say about them, the results were thin, outdated, or inaccurate. Their offline reputation - the trust built through years of work - simply doesn't exist in the dataset AI tools draw from.
AI tools don't know what you said at a conference. They don't know about the partnership you announced in a private newsletter. They don't know your CEO is a recognised expert unless that expertise is published, structured, and findable on a website they can crawl. If your methodology or approach isn't documented anywhere public, AI tools will describe your competitors' approaches instead - because those competitors wrote it down. The only version of your company that exists for ChatGPT or Perplexity is the version that's publicly indexed online.
Your next investor, client, or partner may ask ChatGPT about your sector before they ever visit your website. What it tells them is shaped entirely by what you've published.
The companies in our portfolio that attract the most AI referral traffic aren't necessarily the biggest or best known. They're the ones with the most published, structured, findable content. One site with deep research libraries and regular publications attracts more AI referrals than three others in similar sectors combined - not because its work is better, but because AI tools can actually find and cite it.
Across the 17 sites we analysed, AI referral traffic as a share of total sessions ranged from under 0.5% to 3.2%. Most sites clustered between 0.5% and 1.1%. By any standard measurement, these are small numbers.
But the trajectory tells a different story.
When we compared the first half of 2025 to the second half, the majority of sites showed meaningful growth. Several saw AI referral traffic double between H1 and H2. The sites with the richest content libraries showed the steepest and most sustained growth curves - not a spike and retreat, but a steady climb month on month.
Our own site, adopter.net, saw AI referral traffic grow 760% over the course of 2025. By December, AI tools accounted for 3.2% of our total traffic - roughly 3x the average across the other sites we track. This wasn't something we planned for. The content driving it - data-rich resource articles, structured directories, and interactive content - was created as part of our broader content strategy. But it turned out to be exactly the kind of content AI tools prefer to cite.

If the trajectory looks familiar, it should. The early growth curve of AI referral traffic mirrors what organic search looked like in its first years - small percentages, easy to dismiss, but compounding steadily. Gartner predicts that by 2026, traditional search engine volume will decline by 25%, with a significant share shifting to AI tools and agents [2]. ChatGPT alone went from 100 million weekly active users in November 2023 to 300 million by December 2024 [5].
The practical question isn't whether AI referral traffic matters today - for most sites, it's still a rounding error. The question is whether you want to be visible when it stops being one.
The companies building content that AI tools can cite right now are establishing citation precedence. Research on large language models shows that their accuracy on any given topic correlates directly with how frequently that topic appears in their training data [6]. Content that's already being cited feeds back into the dataset these models learn from. A site that gets cited consistently in 2025 is more likely to keep getting cited in 2026 and beyond - not because of any ranking algorithm, but because the content is already embedded in the training data and retrieval indexes these tools rely on. The compounding effect works in reverse too: companies that aren't being cited now will find it harder to break in later, as AI tools default to sources they already trust.
This is the window. The traffic is small enough that most companies aren't paying attention, but the patterns are established enough that you can see exactly what works.
Not all content attracts AI referrals equally. After categorising every AI-referred page across the 17 sites we tracked, a consistent hierarchy emerged. The same content types appeared at the top across different sectors, different site sizes, and different audiences.
Here's what works, in order of impact.
Data-rich, statistics-led content is the single strongest category. Content built around specific numbers, structured data, and quantitative findings attracts disproportionately more AI referrals than any other type.
A scaleup in the biotech sector published an industry report using proprietary data. That one report attracts 40% of all their AI referral traffic - more than their homepage, product pages, and blog combined. When someone asks an AI tool about specific statistics in that industry, it points straight there.
On adopter.net, our data-rich resource articles collectively account for over a third of our AI referral traffic. When someone asks an AI tool about trends or statistics in our sectors, these are the pages it cites - not our homepage or service descriptions.
An international green finance and data innovator saw AI tools consistently reference their quantitative research across multiple publications. Each dataset or statistical finding created a separate entry point for AI citation.
The pattern is clear: AI tools need numbers to cite. If your content contains specific, current, well-structured data, it becomes a source. If it doesn't, AI tools will cite whoever does have the numbers.
Where statistics content provides the data points, original research provides the frameworks and findings that AI tools use to explain concepts.
An international data innovation think tank regularly publishes original research on climate policy, finance and risk. Their publications each attracted AI referrals independently - not as a cluster around one piece, but as individual pages that AI tools cited when users asked about specific topics. The breadth and depth of their research library made them a default reference source across multiple subject areas.
A global sustainable finance policy initiative saw significant AI engagement with a particularly detailed case study and their flagship policy framework - two very different pieces of content, each addressing a specific question that users ask.
An international organisation operating in the bioeconomy space saw one resource emerge as a standout magnet for AI-driven referrals: a comprehensive, data-intensive sector overview. Its structured design, clear taxonomy, and evidence-backed analysis made it particularly extractable for AI systems responding to user queries.
The common thread: original research that maps a landscape, proposes a framework, or presents findings that don't exist elsewhere. AI tools cite these because they have to - there's no other source for the same information.
Well-structured content that explains a technical concept in accessible language is consistently cited by AI tools answering user questions.
A materials innovation company in the energy space published a blog post explaining Scope 1, 2, and 3 emissions in their context. It attracted more AI referrals than any other page on their site - including product pages and the homepage. When someone asks an AI tool about carbon emissions in their sector, that article gets surfaced.
An alternative protein scaleup saw a similar pattern. Their article on the top technical challenges in their production process was their highest AI referral page. Not a product pitch - a genuine, detailed explanation of a problem their industry faces.
These aren't long or elaborate pieces. They're clear, specific answers to questions that real people ask AI tools. The format is simple: explain the concept, provide context, include specifics. AI tools are essentially looking for the best available answer to a question - and well-structured explainers are exactly that.
Structured, queryable content creates a different kind of AI referral pattern - not a single high-traffic page, but a long tail of referrals spread across many individual entries.
On adopter.net, our structured directories and interactive content generate AI referrals across dozens of individual pages. No single page dominates, but collectively they account for a substantial share of our AI traffic. When someone asks an AI tool about specific organisations or resources in our sectors, these structured pages are what get cited.
A green finance think tank saw their classification framework for their sector attract consistent AI referrals - a structured taxonomy that AI tools reference as a definitional resource. When users ask what categories exist within that space, the taxonomy is the answer.
Databases work for AI because they're structured by design. Each entry has consistent fields, clear categorisation, and specific information - exactly the format AI tools find easiest to parse and cite.
Case studies attract fewer AI referrals than the categories above, but when they work, the reason is specific: they contain concrete details that AI tools can cite as evidence.
An international sustainable finance policy initiative published a case study illustrating the practical application of an innovative financing mechanism. It generated consistent AI-driven referrals because it provided concrete figures, a clearly defined jurisdiction, and measurable outcomes. When users ask how such instruments function in practice, AI systems prioritise real-world examples with sufficient specificity - and this resource met that threshold.
The case studies that don't attract AI referrals tend to be the ones written as marketing narratives - light on specifics, heavy on testimonials, structured around the client relationship rather than the outcome. AI tools can't cite a feeling. They can cite a number, a result, or a concrete process.
The hierarchy above shows what works. The data also shows clear patterns in what doesn't. None of the organisations below were trying to optimise for AI referral traffic - these are patterns in the data, not performance reviews.
Content that describes a company in broad terms without making a specific, citable claim generates almost no AI referrals. Pages that only say things like "we help companies reduce their environmental impact" give an AI tool nothing to work with. No data point to extract, no framework to reference, no question being answered.
Several sites in our portfolio rank well in Google for competitive terms - solid technical SEO, strong backlink profiles, optimised metadata. Their AI referral traffic is minimal.
SEO and GEO reward different things. Google ranks pages based on authority signals and relevance. AI tools cite pages based on whether they contain a specific, extractable answer to a question. A page can rank first in Google and never appear in a ChatGPT response. Don't assume your SEO performance translates one-to-one into AI visibility - there's some overlap, but GEO has to go further.
An alternative protein scaleup saw AI referral traffic spike sharply in mid-2025, driven by a run of thought leadership content. Within a few months, traffic dropped by 95%.
The content was never designed for AI optimisation, so this isn't a failure - it's a pattern worth understanding. A burst of publishing without a supporting body of work doesn't build lasting visibility. The sites that maintained steady AI referral traffic had deep, regularly updated content libraries. Not because they were optimising for AI, but because that's what consistently useful content looks like.
The content hierarchy tells you what to publish. This section covers how to structure it so AI tools can actually find, parse, and reference it.
These six principles showed up consistently across the sites with the strongest AI referral traffic. None of them are complicated. Most of them are things good content should do anyway.
AI tools extract answers from content. Research on how language models process documents found a clear pattern: they pull information from the beginning and end, and largely ignore what's in the middle [7]. If your key finding, definition, or data point is buried in paragraph eight, AI tools may never reach it - or may pull a less useful sentence from higher up the page instead.
Put your most important claim, number, or conclusion in the first two paragraphs. If someone asks an AI tool a question and your page has the answer, make sure the answer is near the top.
Pages with clearly structured question-and-answer sections consistently appeared among the top AI-referred content in our data. The format maps directly to how people use AI tools: they ask a question, the tool looks for a page that answers it.
This doesn't mean bolting a generic FAQ onto every page. It means identifying the specific questions your audience actually asks - and answering them clearly, with one question per heading.
AI tools prefer content they can cite precisely. "The market grew significantly" gives them nothing. "The market grew 23% year-on-year to reach $4.2 billion" gives them a citable fact.
Every page that performs well for AI referrals in our data contains specific figures - percentages, financial values, quantities, dates. The more precise and current your numbers, the more likely AI tools are to treat your content as a source.
AI tools frequently answer comparative questions: "How does X compare to Y?" or "What are the differences between A and B?" Content that directly addresses comparisons - between approaches, technologies, products, or frameworks - gets cited in these responses.
If your sector has competing methodologies, overlapping standards, or commonly confused concepts, a clear comparison piece is one of the highest-value things you can publish for AI visibility.
This is counterintuitive but well supported: content that cites external sources gets cited more by AI tools, not less. The GEO study presented at KDD 2024 found that adding citations and quotations from credible sources was the single most effective optimisation strategy tested, improving visibility in AI-generated responses by up to 40% [1].
When your content references a specific study, links to a dataset, or attributes a claim to a named source, it signals credibility to AI tools in the same way it signals credibility to a human reader. Unsourced claims are harder for AI tools to trust and therefore harder for them to repeat.
AI tools parse content hierarchically. Clear H2 and H3 headings, consistent formatting, and schema markup all make it easier for AI tools to understand what a page covers and extract the relevant section.
This doesn't require technical expertise beyond what a modern CMS provides. It means using proper heading hierarchy rather than bolded text, breaking content into clearly labelled sections rather than long unbroken passages, and adding basic schema markup (Article, FAQ, Dataset) where relevant.
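As a concrete illustration, here is a minimal sketch of FAQ schema markup, built as JSON-LD with Python's standard library. The question and answer are placeholders (borrowed from the emissions explainer example earlier), but the schema.org FAQPage structure itself is standard; the output goes inside a script tag of type application/ld+json on the page, which most modern CMSs and SEO plugins can add for you.

```python
import json

# Minimal schema.org FAQPage markup - the question and answer are placeholders.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What are Scope 1, 2, and 3 emissions?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": (
                    "Scope 1 covers direct emissions, Scope 2 covers purchased energy, "
                    "and Scope 3 covers emissions across the value chain."
                ),
            },
        }
    ],
}

# Paste the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(faq_schema, indent=2))
```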
Most of this is just good content practice. The difference is that AI tools are less forgiving of poor structure. A human reader will scan past a weak introduction to find what they need. An AI tool will stop at the weak introduction and cite something else instead.
If you've read this far, you probably want to know what to do first. Here's a practical breakdown based on what the data shows makes the biggest difference, ordered by effort.
Check what AI tools currently say about you. Open ChatGPT, Perplexity, and Claude. Ask each one what your company does, who your competitors are, and what the key trends in your sector are. Note what's accurate, what's missing, and whether you appear at all. This takes 20 minutes and gives you a baseline - a scripted version of the same check is sketched after these action points.
Review your highest-traffic pages. Look at your top 10 pages by organic traffic. How many contain a specific, citable fact - a number, a finding, a clearly stated methodology? If the answer is fewer than half, that's your starting point.
Publish one data-rich resource. A statistics roundup, an industry benchmark, a data-backed trend analysis. Based on our data, this single content type attracts more AI referrals than any other. It doesn't need to be original research - well-structured, properly sourced aggregation of existing data works.
Add FAQ sections to your key pages. Identify the five to ten questions your audience most commonly asks - on sales calls, in emails, at conferences - and answer them clearly on your site. One question per heading, answer in the first sentence, detail after.
Build a structured content asset. A directory, a glossary, a comparison database, an interactive tool. Something with consistent structure across many entries that AI tools can query at the individual entry level. These take longer to build but they generate the long-tail referral pattern we saw across multiple portfolio sites.
Audit your existing content for citable substance. Go through your blog archive and service pages. For each one, ask: does this page contain a specific claim an AI tool could extract and cite? If not, update it or deprioritise it.
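If you'd rather script the baseline check from the first action point, a minimal sketch using the OpenAI Python SDK is below. The company name, sector, and model name are placeholders, and API answers won't exactly match what the ChatGPT product says (which may browse the web), but they give you a repeatable baseline; the same questions can be put to other providers' APIs.

```python
from openai import OpenAI  # assumes the openai package is installed and OPENAI_API_KEY is set

COMPANY = "Example Ltd"   # placeholder - your company name
SECTOR = "nature data"    # placeholder - your sector

questions = [
    f"What does {COMPANY} do?",
    f"Who are the main competitors of {COMPANY}?",
    f"What are the key trends in the {SECTOR} sector?",
]

client = OpenAI()
for question in questions:
    # Ask each baseline question and print the model's answer.
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name - swap for whichever model you use
        messages=[{"role": "user", "content": question}],
    )
    print(f"Q: {question}\nA: {response.choices[0].message.content}\n")
```

Save the output and rerun the same questions quarterly to see whether the answers improve as you publish.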
If you're thinking about how AI tools represent your company - or whether they mention you at all - we can help.
We offer a GEO content audit: a structured review of your website's AI visibility, the content gaps that matter most, and a prioritised plan for what to publish, update, or restructure first. It's based on the same methodology behind this article, applied specifically to your site.
1. What is Generative Engine Optimisation (GEO)?
GEO is the practice of structuring your content so that AI-powered tools - ChatGPT, Perplexity, Claude, Gemini, and others - can find, understand, and cite it in their responses. It was formally defined in a 2023 research paper and presented at KDD 2024 [1].
2. How is GEO different from SEO?
SEO optimises for ranking position in search engine results pages. GEO optimises for presence and citation in AI-generated answers. They share some foundations - clear structure, quality content, technical accessibility - but GEO places much more emphasis on citable specifics: data points, direct answers, structured formats. A page can rank well in Google and never appear in an AI response.
3. How do you track AI referral traffic?
In Google Analytics 4, AI referral traffic appears as sessions from referral sources including chatgpt.com and chat.openai.com (ChatGPT), perplexity.ai, gemini.google.com, copilot.microsoft.com, and claude.ai. You can create a custom segment or exploration filtering by these sources to isolate AI-referred visits.
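As a rough illustration, the sketch below tags AI referrals in a GA4 traffic-acquisition export with Python. The filename, column names, and domain list are assumptions - adjust them to your own export and the sources you actually see in your data.

```python
import csv
from collections import Counter

# Referrer domains commonly associated with AI tools (non-exhaustive - extend as needed).
AI_SOURCES = {
    "chatgpt.com", "chat.openai.com", "perplexity.ai",
    "gemini.google.com", "copilot.microsoft.com", "claude.ai",
}

def is_ai_referral(source: str) -> bool:
    """Return True if a GA4 session source matches a known AI referrer domain."""
    source = source.lower().strip()
    return any(source == d or source.endswith("." + d) for d in AI_SOURCES)

totals = Counter()
with open("ga4_traffic_acquisition.csv", newline="") as f:  # assumed export filename
    for row in csv.DictReader(f):
        bucket = "ai" if is_ai_referral(row["Session source"]) else "other"  # assumed column name
        totals[bucket] += int(row["Sessions"])                               # assumed column name

share = totals["ai"] / (totals["ai"] + totals["other"]) if totals else 0.0
print(f"AI referral share of sessions: {share:.1%}")
```

The same list of domains also works as a regex filter in a GA4 exploration if you'd rather stay inside the interface.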
4. How much AI referral traffic should I expect?
For most sites today, AI referral traffic represents between 0.5% and 3% of total sessions. The number matters less than the trajectory - across the sites we track, AI referral traffic grew consistently throughout 2025, with several sites seeing it double between the first and second halves of the year.
5. Does AI referral traffic convert?
We're still building data on this, but the early signals are promising. Users arriving via AI tools tend to have high intent - they've already been told about you by a tool they trust, and they're clicking through for a specific reason. That's a qualitatively different visit from someone landing via a generic Google search.
6. Can I optimise for specific AI tools?
In practice, you don't need to. The content that gets cited by ChatGPT also gets cited by Perplexity, Claude, and Gemini. The underlying principles - specific data, clear structure, direct answers - are consistent across all of them. Optimise for one and you optimise for all.
7. How quickly does GEO work?
Faster than SEO in some cases. AI tools with real-time web access (Perplexity, ChatGPT with browsing) can discover and cite new content within days of publication. Longer-term citation - being included in AI model training data - takes months, but the immediate retrieval-based citation can happen quickly.
8. Is GEO relevant for B2B companies?
Especially so. B2B buyers, investors, and potential partners increasingly use AI tools as a first step in research. When an investor asks ChatGPT about companies in your sector before a meeting, when a potential client asks Perplexity to compare approaches, or when a journalist asks Claude for expert sources on a topic - GEO determines whether your company appears in the answer. For B2B companies where deals are high-value and research-intensive, being present in these AI-generated answers matters more, not less.
9. Do I need to rebuild my website for GEO?
No. Most GEO improvements are content-level: adding specific data to existing pages, restructuring content with clear headings and FAQ sections, publishing data-rich resources. You don't need a new site - you need better content on the site you have.
10. Is this just a trend, or is it here to stay?
Gartner predicts traditional search engine volume will decline by 25% by 2026, with traffic shifting to AI tools and agents [2]. The underlying behaviour change - people asking AI tools questions instead of typing keywords into Google - is accelerating, not slowing. The companies building AI-citable content now are establishing citation precedence that will compound over time.
11. What are other names for GEO?
You may see GEO referred to by several names. Generative Engine Optimisation is the most widely used term in industry and academic literature. Answer Engine Optimisation (AEO) is often used interchangeably, though some practitioners use AEO specifically for optimising content for featured snippets and voice search. You might also encounter AI Search Optimisation, AI SEO, LLM Optimisation, or AI Visibility Optimisation. They all describe the same discipline: structuring your content so that AI-powered tools can find, understand, cite, and recommend it.
[1] Aggarwal, P. et al. (2024) GEO: Generative Engine Optimization. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2024).
[2] Gartner (2024) Gartner Predicts Search Engine Volume Will Drop 25% by 2026, Due to AI Chatbots and Other Virtual Agents.
[3] 6sense (2024) B2B Buyer Experience Report 2024.
[4] Gartner (2019) The New B2B Buying Journey.
[5] Field, H. (2024) OpenAI's active user count soars to 300 million people per week. CNBC.
[6] Kandpal, N. et al. (2023) Language Models Struggle to Learn Long-Tail Knowledge. In Proceedings of the 40th International Conference on Machine Learning (ICML 2023).
[7] Liu, N.F. et al. (2024) Lost in the Middle: How Language Models Use Long Contexts. Transactions of the Association for Computational Linguistics, 12, pp. 157-173.