Visibility Scores
Visibility scores measure how prominently your brand appears in AI-generated responses. The visibility score is Cleotic's headline metric -- a quick way to see whether AI models know about your brand and how favourably they feature it.
What the score means
Each brand in your project gets a visibility score from 0 to 100, calculated from two components:
- Mention rate -- The percentage of prompts where the brand was mentioned in the AI response. If you have 20 prompts and a brand appears in 15 of the responses, the mention rate is 75%.
- Average position -- How early in the response the brand is mentioned. A brand that consistently appears in the first paragraph scores higher than one buried at the end. Lower position numbers are better.
The overall visibility score is a composite of these two factors. A brand that's mentioned often and mentioned early will score close to 100. A brand that's rarely mentioned or only appears at the tail end of responses will score much lower.
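The exact weighting Cleotic uses is not spelled out here, but combining the two components can be sketched as follows. The 70/30 weighting, the position cap, and the function name are illustrative assumptions, not Cleotic's actual formula:

```python
def visibility_score(positions, total_prompts, max_position=10):
    """Illustrative composite score (NOT Cleotic's actual formula).

    positions: 1-based mention positions, one per response that
    mentioned the brand; unmentioned responses are simply absent.
    """
    mention_rate = len(positions) / total_prompts  # 0.0 to 1.0
    if positions:
        avg_position = sum(positions) / len(positions)
        # Map position onto 0-1 so that earlier (lower) is better.
        position_score = max(0.0, 1 - (avg_position - 1) / (max_position - 1))
    else:
        position_score = 0.0
    # Hypothetical weighting: being mentioned counts more than placement.
    return round(100 * (0.7 * mention_rate + 0.3 * position_score), 1)

# 15 of 20 prompts mention the brand (75% mention rate), mostly early
print(visibility_score([1, 2, 1, 3, 2, 1, 1, 2, 3, 1, 2, 1, 1, 2, 1], 20))
```

A brand mentioned in every prompt at position 1 would score 100 under this sketch, while one never mentioned would score 0, matching the behaviour the text describes.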
Reading the score cards
The visibility score cards on the Analytics tab show a ranked list of all brands in your project:
- Rank -- Brands are ordered from highest to lowest visibility
- Brand name -- With the primary brand highlighted
- Overall score -- The composite 0-100 score, displayed as a progress bar
- Mention rate -- Shown as a percentage
- Average position -- The mean position across all responses
Your primary brand is visually highlighted so you can quickly spot where it stands relative to competitors.
Filtering by date range
All visibility data can be filtered by time period:
- Preset ranges -- Last 7 days, 30 days, or 365 days
- Custom range -- Pick specific start and end dates using the calendar
This lets you compare visibility across different periods. For example, you might look at the last 7 days to see recent performance, then switch to 30 days to check the broader trend.
Per-model visibility
Each brand's visibility score card can also be broken down by individual AI model. This reveals which models mention your brand most and least, helping you understand where your strengths and weaknesses lie across the AI landscape.
For a more detailed model-level analysis, see Model breakdown.
What a good score looks like
There's no universal "good" score -- it depends entirely on your industry and competition. What matters most is:
- Your score relative to competitors. If your primary brand scores 65 and the top competitor scores 40, you're in a strong position even though 65 isn't "perfect".
- Trend direction. A score of 50 that's been climbing is better than a score of 70 that's been falling.
- Consistency across models. A score that's high on one model but low on others suggests an opportunity to improve visibility on specific platforms.
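One rough way to spot the cross-model inconsistency described above is to flag any model whose score trails the cross-model average by a wide margin. The dictionary shape, model names, and 20-point threshold below are illustrative, not Cleotic's data model:

```python
def lagging_models(scores_by_model, gap=20):
    """Return models scoring more than `gap` points below the
    cross-model average (illustrative threshold)."""
    average = sum(scores_by_model.values()) / len(scores_by_model)
    return sorted(m for m, s in scores_by_model.items() if average - s > gap)

# Hypothetical per-model scores for one brand
scores = {"model-a": 78, "model-b": 74, "model-c": 31}
print(lagging_models(scores))  # model-c lags the pack
```

Models returned by a check like this are the "specific platforms" where targeted visibility work is likely to pay off first.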
Improving your visibility score
If your score is lower than you'd like:
- Check your answer gaps -- See Answer gaps to find specific prompts where competitors appear but you don't.
- Review citations -- See Citations to understand which sources AI models rely on. Getting your content cited increases your visibility.
- Create targeted content -- Use Content Studio to generate content aimed at filling gaps identified in your analytics.
- Track your progress -- Set a visibility goal and monitor the trend over time.
Related
- Share of voice -- Your brand's share of total AI mentions
- Visibility trends -- How your score changes over time
- Answer gaps -- Where competitors appear and you don't
- Model breakdown -- Visibility per AI model