Sources

Sources Overview

Understand how the Sources tab reveals which pages, often third-party ones, shape your AI visibility, and how to read its metrics to prioritize action.

What the Sources tab shows

The Sources tab shows which URLs AI engines actually cite when they answer your tracked prompts, and how often those URLs drive answers in your category. For a deeper definition of how these citations roll up into a core metric, see Citations (Frequency).

Each row is a URL that an AI model retrieved and used while generating answers to your prompts. These pages are often third-party content that sits between your buyers and your brand: reviews, comparisons, directories, and community threads where AI engines go to validate what they say.

If a URL appears here, AI engines are already reading and trusting it as a source for your topic. Whether your brand shows up in AI answers often depends on whether you are mentioned on these pages, not only on what lives on your own site.

AI engines repeatedly cite a small set of pages across many queries. Those high-frequency sources are the highest leverage places to earn, improve, or protect mentions of your brand.

Where to find Sources in AIclicks

Open the Citations page at /citations to access Sources.

The page defaults to a URL-centric view, but you can pivot the same underlying data:

  • URL view – Focus on individual pages that AI cites.

  • Domain view – Group citations up to the domain to see which sites matter most.

  • Brand view – Slice by brand mentions across sources.
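
As a rough mental model, the domain pivot is just URL-level counts grouped by hostname. The sketch below is illustrative only; the variable names and data are hypothetical, not the actual AIclicks export:

```python
from collections import defaultdict
from urllib.parse import urlparse

# URL-level citation counts (hypothetical data, not a real export).
url_frequency = {
    "https://blog.example.com/best-tools": 12,
    "https://blog.example.com/alternatives": 5,
    "https://community.example.org/thread/42": 9,
}

# Roll URL counts up to the domain, as the Domain view does conceptually.
domain_frequency = defaultdict(int)
for url, count in url_frequency.items():
    domain_frequency[urlparse(url).netloc] += count

print(dict(domain_frequency))
# {'blog.example.com': 17, 'community.example.org': 9}
```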

Use the controls above the table to refine the view:

  • Date range (days) – Restrict to recent runs to see what is shaping answers now.

  • Model – Filter to a specific AI model or compare across models.

  • Page type – Filter sources to a specific page type.

  • Mentioned (yes/no) – Filter by whether your brand is mentioned on the cited URL.
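
A minimal sketch of how these filters combine conceptually. The row fields and values are assumptions for illustration, not the product's actual schema:

```python
from datetime import date, timedelta

# Hypothetical citation rows; field names are illustrative only.
rows = [
    {"url": "a.com/x", "date": date(2024, 6, 1), "model": "gpt-4o", "page_type": "Reviews", "mentioned": True},
    {"url": "b.com/y", "date": date(2024, 1, 5), "model": "gpt-4o", "page_type": "Reviews", "mentioned": False},
    {"url": "c.com/z", "date": date(2024, 6, 2), "model": "claude", "page_type": "Forums", "mentioned": False},
]

def filter_rows(rows, days=None, model=None, page_type=None, mentioned=None,
                today=date(2024, 6, 10)):
    """Apply the date-range, model, page-type, and mentioned filters."""
    out = []
    for r in rows:
        if days is not None and r["date"] < today - timedelta(days=days):
            continue  # outside the selected date range
        if model is not None and r["model"] != model:
            continue
        if page_type is not None and r["page_type"] != page_type:
            continue
        if mentioned is not None and r["mentioned"] != mentioned:
            continue
        out.append(r)
    return out

recent_reviews = filter_rows(rows, days=30, model="gpt-4o", page_type="Reviews")
print([r["url"] for r in recent_reviews])  # ['a.com/x']
```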

Why third-party sources matter

AI models rarely rely on a single page when they answer a prompt. They retrieve from multiple URLs at once and blend signals: your own pages, competitor pages, media coverage, community posts, and aggregator sites.

Your own site is one signal in that mix. Third-party pages that review, compare, or recommend you play a different role: they act as independent validation. For many commercial prompts, models give outsized weight to sources that look unbiased or comparative rather than self-promotional.

A comparison article on a well-known publication that lists you among top options can influence AI answers more than a product page on your domain. A negative thread in a popular community can show up repeatedly in model citations and steer sentiment. Sources reveal which of those pages are actually being pulled into answers across your prompts.

How to read source-level metrics

The Sources table exposes several fields that describe how a URL influences AI answers. Each field comes directly from citation data collected across runs and prompts.

Coverage and intensity metrics

Use these metrics together to understand both how widely and how strongly a page shapes your visibility.

  • Frequency – Total number of times the URL was cited across the runs and snapshots in your selected filters. This tells you how often the page appears in the AI retrieval fan-out, regardless of which prompt triggered it. For a full definition and how it behaves across the product, see Citations (Frequency).

  • Prompts – Count of unique tracked prompts that cited the URL at least once. This is your prompt coverage: how many distinct intents or questions this page helps answer for AI models. Learn more about how prompts work and are organized in Prompts Overview.

  • Topics – Topic coverage derived from the prompts that cited the URL. If a page appears across prompts in different topics, it is a cross-cutting source that shapes visibility in multiple parts of your customer journey.

High frequency with low prompt coverage usually points to a page that dominates a narrow slice of queries. High prompt coverage and broad topic coverage often identify foundational sources that sit under many of your key prompts.
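
To make the three metrics concrete, here is a minimal sketch of how they can be derived from raw citation records. The record fields here are hypothetical, not the actual AIclicks schema:

```python
from collections import defaultdict

# One hypothetical record per cited URL per answer (illustrative data only).
citations = [
    {"url": "https://example.com/best-tools", "prompt": "best crm for startups", "topic": "CRM"},
    {"url": "https://example.com/best-tools", "prompt": "top sales platforms",   "topic": "Sales"},
    {"url": "https://example.com/best-tools", "prompt": "best crm for startups", "topic": "CRM"},
    {"url": "https://example.com/review",     "prompt": "best crm for startups", "topic": "CRM"},
]

frequency = defaultdict(int)   # Frequency: total citations per URL
prompts = defaultdict(set)     # Prompts: unique prompts citing the URL
topics = defaultdict(set)      # Topics: topics of those prompts

for c in citations:
    frequency[c["url"]] += 1
    prompts[c["url"]].add(c["prompt"])
    topics[c["url"]].add(c["topic"])

for url in frequency:
    print(url, frequency[url], len(prompts[url]), len(topics[url]))
```

In this toy data, the first URL has Frequency 3 but only 2 unique prompts, illustrating why the two numbers are read together.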

Page, model, and position context

Context fields help you understand why a source matters and how models use it.

  • Page type – Classification for the URL into one of the supported formats. The full set of page types you will see in the Sources table is:

    • YouTube

    • Wikipedia

    • Homepage

    • Forums

    • Reviews

    • Alternatives

    • Comparison

    • How-to

    • Listicle

    • Landing page

    • Generic Article

    • Social

    • Marketplace

    • Other

    Page type can vary across sources and helps you distinguish formats like video walkthroughs, community threads, and review pages.

Using metrics to prioritize sources

Sources do not roll up into a single numeric priority score in the product. Instead, you use a combination of metrics to decide where to act first.

A practical pattern that works across most categories:

  • Start with Frequency to find pages that AI models touch the most within your filters.

  • Within those, sort by Prompts to keep URLs that matter across multiple intents rather than only one edge case.

  • Use Topics to check whether a page aligns with your most valuable topic clusters, not just any traffic.

  • Filter by Mentioned = no to highlight high-leverage pages where AI already depends on the content but does not yet mention your brand.

  • Layer in Page type to prefer formats where you can realistically earn mentions or placements (comparison lists, reviews, directories) over pages you cannot influence, like static legal pages.

This approach turns raw citation data into a short, actionable list of pages that both matter to AI and are reachable through partnerships, PR, content, or community work.
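
The pattern above can be sketched as a simple filter-and-sort. The field names, thresholds, and data are hypothetical, shown only to make the ordering of the steps concrete:

```python
# Hypothetical source rows mirroring the table columns described above.
sources = [
    {"url": "a.com/compare", "frequency": 42, "prompts": 9, "mentioned": False, "page_type": "Comparison"},
    {"url": "b.com/legal",   "frequency": 50, "prompts": 2, "mentioned": False, "page_type": "Other"},
    {"url": "c.com/review",  "frequency": 30, "prompts": 7, "mentioned": True,  "page_type": "Reviews"},
    {"url": "d.com/list",    "frequency": 25, "prompts": 6, "mentioned": False, "page_type": "Listicle"},
]

# Formats where a mention can realistically be earned (an assumption).
ACTIONABLE = {"Comparison", "Reviews", "Listicle", "Alternatives"}

shortlist = sorted(
    (s for s in sources
     if not s["mentioned"]              # Mentioned = no: brand absent today
     and s["page_type"] in ACTIONABLE   # reachable formats only
     and s["prompts"] >= 3),            # matters across multiple intents
    key=lambda s: (s["frequency"], s["prompts"]),  # most-cited first
    reverse=True,
)
print([s["url"] for s in shortlist])  # ['a.com/compare', 'd.com/list']
```

Note how the high-frequency legal page is dropped despite its raw count: it fails both the prompt-coverage and page-type checks.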

What types of sources typically appear

Patterns vary by category, but you will often see a mix of:

  • Review and comparison sites – Software review platforms, product comparison pages, and side-by-side feature breakdowns.

  • Listicles and roundup articles – "Best tools for X" or "Top platforms for Y" content on blogs, media sites, and niche publishers.

  • Community discussions – Reddit threads, Q&A platforms, and forums where practitioners trade recommendations and experiences.

  • Industry publications – Analyst reports, industry newsletters, and thought-leadership pieces for your vertical.

  • Brand pages – Your own site and competitor pages that models use as factual grounding, such as docs, pricing, and feature overviews.

Seeing these side by side helps you understand not just where models go for facts, but which formats and outlets set the narrative that AI repeats back to your buyers.

What comes next

Use the related pages below to turn source insights into concrete action and then zoom out to the domain level.