Recommendations

Get mentioned

Turn AI citations that overlook your brand into a prioritized outreach list, with workflow, status tracking, and impact measurement.

Understand the Get Mentioned tab

Get Mentioned turns AI answers that ignore or underplay your brand into a prioritized outreach queue you can work through and track.

What Get Mentioned shows

Each entry in the Get Mentioned tab gives you the context you need to decide whether and how to act.

  • Source URL and page title — The exact page being cited in AI answers and its human-readable title.

  • Domain — The publisher behind the page, useful for spotting high-value outlets and understanding relationship potential.

  • Topics covered — The topic areas in which this page appears in AI responses (for example, pricing benchmarks, security comparisons, implementation guides).

  • Prompts covered — The user prompts that led AI systems to cite this page, such as "best payroll software for US startups" or "how to implement SSO in a SaaS app".

  • Frequency — How often AI tools cite this specific URL across tracked prompts in the recent window.

  • Notes and status — Your internal workflow fields: add notes and mark each item as pending, done, or dismissed.

Use filters and search to slice by domain, topic, or prompt, then export a CSV when you want to coordinate outreach in a CRM or spreadsheet.
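If you prefer to slice the export programmatically, a minimal sketch looks like the following. The column names (`url`, `domain`, `topic`, `frequency`, `status`) and the sample rows are assumptions for illustration, not the product's documented export schema — adjust them to match your actual CSV header.

```python
import csv
import io

# Hypothetical export sample; column names are assumptions, not a
# documented schema. Replace with open("get_mentioned_export.csv").
EXPORT = """url,domain,topic,frequency,status
https://example.com/best-payroll,example.com,pricing benchmarks,14,pending
https://reviews.example.org/sso-guide,reviews.example.org,implementation guides,9,pending
https://example.com/security,example.com,security comparisons,5,done
"""

def slice_export(csv_text, domain=None, topic=None):
    """Return rows matching an optional domain and/or topic filter."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        r for r in rows
        if (domain is None or r["domain"] == domain)
        and (topic is None or r["topic"] == topic)
    ]

matches = slice_export(EXPORT, domain="example.com")
print([r["url"] for r in matches])
```

The same filter keys (domain, topic) mirror the in-product filters, so a script like this can reproduce any saved view for handoff to a CRM.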

Use the following workflow to move from raw recommendations to measurable improvements in AI-driven visibility.

Scan and prioritize high-impact URLs

Start by sorting Get Mentioned by frequency and scanning for URLs that map to your highest-value prompts. Use topics and prompts to confirm strategic fit, then tag a short list (for example, 10–20 URLs) as your current sprint.

As a success signal, your working list should skew toward high-frequency URLs on prompts that matter for pipeline or revenue.
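The prioritization step above can be sketched against an export as follows. The topic names, column layout, and sprint size are illustrative assumptions — substitute the topics that actually matter for your pipeline.

```python
import csv
import io

# Assumed export columns (url, topic, frequency) — not a documented schema.
EXPORT = """url,topic,frequency
https://a.example/pricing,pricing benchmarks,22
https://b.example/forum,community chatter,18
https://c.example/sso,implementation guides,11
https://d.example/misc,misc,3
"""

# Hypothetical high-value topics; use your own strategic list.
HIGH_VALUE_TOPICS = {"pricing benchmarks", "implementation guides"}
SPRINT_SIZE = 15  # e.g. a 10-20 URL working list

rows = list(csv.DictReader(io.StringIO(EXPORT)))

# Keep only strategically relevant rows, then sort by citation frequency
# so the highest-impact URLs land at the top of the sprint.
sprint = sorted(
    (r for r in rows if r["topic"] in HIGH_VALUE_TOPICS),
    key=lambda r: int(r["frequency"]),
    reverse=True,
)[:SPRINT_SIZE]

print([r["url"] for r in sprint])
```

Note that the high-frequency community URL is excluded: frequency alone is not the signal, frequency on prompts that matter is.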

Classify source type and choose playbook

Open each prioritized URL and classify it as review/directory, listicle/comparison, editorial content, or community post. Once you know the source type, apply the matching outreach play from the previous section so you are not reinventing the strategy for every item.

Update the note with the source type and planned approach so teammates see at a glance how you intend to engage.

Execute outreach and update statuses

Run the outreach or contribution work: claim profiles, send pitches, provide data, or participate in discussions. As you act, add brief notes and mark items as done once your current action is complete, or dismissed if you decide not to pursue them.

You should see the active queue of pending items shrink over time, with a growing history of done items you can reference later when evaluating impact.
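One way to verify the queue is trending the right way is to compare status counts between review cycles. The snapshots below are invented numbers purely to show the shape of the check.

```python
from collections import Counter

# Hypothetical status snapshots from two review cycles (invented data).
week_1 = ["pending"] * 12 + ["done"] * 3 + ["dismissed"] * 1
week_4 = ["pending"] * 5 + ["done"] * 9 + ["dismissed"] * 2

for label, statuses in [("week 1", week_1), ("week 4", week_4)]:
    print(label, dict(Counter(statuses)))

# A healthy queue: pending shrinks while done grows between snapshots.
healthy = (
    Counter(week_4)["pending"] < Counter(week_1)["pending"]
    and Counter(week_4)["done"] > Counter(week_1)["done"]
)
print("queue trending healthy:", healthy)
```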

Monitor AI coverage and citations over time

After 2–4 weeks, review how AI systems handle the prompts tied to the URLs you worked on. Check whether the pages now mention your brand or reflect your updated positioning, and whether AI answers start incorporating that information.

In the product, track changes in Citations (Frequency) and related main metrics like Visibility and Share of Voice for those prompts. Positive movement here is your leading indicator that Get Mentioned work is turning into real exposure.

Measure impact in main metrics

Once your outreach efforts have had time to land and be incorporated into AI systems, confirm impact in your core metrics:

  • In Citations (Frequency), filter to the prompts associated with URLs you marked as done and look for upward trends.

  • In Visibility and Share of Voice, check whether your domain's exposure grows on the same prompts and topics.

  • In Position metrics, watch for shifts in how often you are recommended relative to key competitors in high-intent prompts.
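The frequency check in the first bullet can be scripted as a before/after comparison per prompt. The prompt names come from the examples earlier in this article; the citation counts are invented to illustrate the calculation.

```python
# Hypothetical per-prompt citation counts before and after outreach
# (invented numbers — pull real values from your exported metrics).
before = {
    "best payroll software for US startups": 4,
    "how to implement SSO in a SaaS app": 2,
}
after = {
    "best payroll software for US startups": 9,
    "how to implement SSO in a SaaS app": 5,
}

def pct_change(old, new):
    """Percent change in citation frequency; inf when starting from zero."""
    return (new - old) / old * 100 if old else float("inf")

for prompt in before:
    delta = pct_change(before[prompt], after[prompt])
    print(f"{prompt}: {before[prompt]} -> {after[prompt]} ({delta:+.0f}%)")
```

An upward trend on prompts tied to done items is the signal to look for; flat numbers suggest the outreach has not yet been picked up by AI systems.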

If you see movement, feed those learnings back into your prioritization: invest more in the source types and domains that reliably convert outreach into better AI coverage.

Next steps

Use these related features to deepen your workflow and scale outreach.