
AI Visibility Tracker: Methodology & Metric Reference

What the AI Visibility Tracker Measures

As more people turn to AI assistants for research and recommendations, visibility in AI chats is a meaningful driver of brand recognition, reputation and trust. The AI Visibility Tracker from 3BL monitors how often and how prominently your brand appears in responses generated by top AI platforms.

The AI Visibility Score

The AI Visibility Score is the primary metric in the tracker. It is a single composite index, scored on a 0-100 scale, that summarizes your company’s overall prominence in AI-generated responses.

Score Components

The AI Visibility Score is calculated across three dimensions, all scored out of 100:

Component | What It Measures | Weight

Query Coverage | The percentage of tracked AI responses in which your brand is mentioned. Example: if we run 100 queries and your brand is mentioned in 40 of the responses, Query Coverage is 40%. | 40%

Position Score | How prominently your brand appears within responses; brands that appear earlier in tracked responses receive a higher score. Example: if your brand is mentioned first in a response, your score is 100; if it is mentioned after 3 other brands (4th position) out of 10 total brands, the position score is 60. Position Score is the average across all tracked responses. | 35%

Share of AI Voice | How frequently your brand is mentioned across tracked responses, relative to all of the brands mentioned. Example: if your brand is mentioned twice in a response that includes 10 brands, Share of AI Voice is 20%. Share of AI Voice is the average across all tracked responses. | 25%

Note: These weightings may change over time as the product develops.

Query Coverage

Query Coverage measures the percentage of tracked AI responses in which your brand receives any mention. It is formatted as a percentage and reflects how broadly the brand appears across the full set of queries, regardless of how many times it's mentioned within any single response.
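As a sketch, Query Coverage can be computed from a set of tracked responses. The function and matching logic below are illustrative only (real brand matching is likely more sophisticated than a case-insensitive substring check):

```python
def query_coverage(responses: list[str], brand: str) -> float:
    """Percentage of responses that mention the brand at least once.

    A response counts once no matter how many times the brand
    appears within it.
    """
    if not responses:
        return 0.0
    mentioned = sum(1 for text in responses if brand.lower() in text.lower())
    return 100.0 * mentioned / len(responses)
```

A brand mentioned in 2 of 5 tracked responses would score 40%, even if one of those responses mentions it several times.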

Position Score 

Position Score captures how prominently your brand is mentioned within AI responses across tracked queries on a 100-point scale. Mentions that appear earlier in a response, or that anchor a recommendation, carry more weight than passing references. 

Share of AI Voice

Share of AI Voice measures the rate at which your brand is mentioned across all tracked responses, relative to the total volume of brands mentioned. In the context of the Visibility Score, this metric serves as the Share of Voice component, representing a given brand’s share of the overall conversation related to their industry and topics of interest.
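Based on the worked example above (2 mentions in a response containing 10 brand mentions yields 20%), a per-response share averaged across responses can be sketched as follows. The tuple-based input format is an assumption for illustration, not the tracker's actual data model:

```python
def share_of_ai_voice(per_response_mentions: list[tuple[int, int]]) -> float:
    """Average per-response share of brand mentions, as a percentage.

    Each tuple is (mentions_of_target_brand, total_brand_mentions)
    for one tracked response. Responses with no brand mentions are
    skipped so they do not drag down the average.
    """
    shares = [100.0 * ours / total
              for ours, total in per_response_mentions if total > 0]
    return sum(shares) / len(shares) if shares else 0.0
```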

How the AI Visibility Score Is Calculated

The AI Visibility Score is calculated by multiplying each component by its weight and summing the results:

Visibility Score = (Query Coverage × 0.40) + (Position Score × 0.35) + (Share of AI Voice × 0.25)
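The weighted sum can be sketched as below, using the component weights from the Score Components table; as noted there, the weightings may change as the product develops:

```python
def visibility_score(query_coverage: float, position_score: float,
                     share_of_voice: float) -> float:
    """Weighted composite of the three components, each on a 0-100 scale.

    Weights follow the Score Components table: coverage 40%,
    position 35%, share of AI voice 25%.
    """
    return (query_coverage * 0.40
            + position_score * 0.35
            + share_of_voice * 0.25)
```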

Visibility Leadership Indicator

In addition to the numeric score, the AI Visibility Tracker includes a Leadership Indicator classification that represents where your score falls relative to others. 

Topline Score Range | Classification | What It Means

≥ 50 | Leader | Strong, consistent visibility across tracked queries compared to the average

30-49 | Contender | Meaningful presence; room to expand coverage and position

< 30 | Needs Improvement | Appearing in some responses, but limited reach or prominence
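Applying the published thresholds top-down, the classification can be sketched as:

```python
def leadership_indicator(score: float) -> str:
    """Classify a 0-100 Visibility Score against the published thresholds."""
    if score >= 50:
        return "Leader"
    if score >= 30:
        return "Contender"
    return "Needs Improvement"
```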

Supporting Metrics

Supporting metrics within the AI Visibility Tracker provide more information about what’s driving your score, the sources cited in conversations about your brand, and how you compare with industry peers across platforms and topics of interest. 

Rate Metrics

Rate metrics express mentions and citations as a proportion of total responses. This makes it possible to compare performance fairly across time periods or competitors, even if query volume changes.

Mention Rate

Shown as a percentage, the mention rate tallies how often your brand name appears in tracked AI responses, regardless of whether it is cited as a source. 

  • Company Mention Rate: Rate at which your brand is mentioned across all tracked responses.
  • Competitor Mention Rate: Rate at which other brands in the defined competitor set are mentioned across tracked responses.

Unlike Query Coverage, this metric counts volume. If your brand is mentioned three times in a single response, indicating the response was more significantly focused on you, that contributes three times more than a single mention would.
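The volume-versus-coverage distinction can be illustrated as follows. The matching approach is an assumption for the sketch; the tracker's actual mention detection is not specified here:

```python
import re

def count_mentions(responses: list[str], brand: str) -> int:
    """Total mentions of the brand across all responses (counts volume)."""
    pattern = re.compile(re.escape(brand), re.IGNORECASE)
    return sum(len(pattern.findall(text)) for text in responses)

responses = [
    "Acme leads here. Acme also reports annually. Acme again.",
    "Several brands compete; none stand out.",
]
# Query Coverage sees one mentioning response out of two,
# while the mention count credits all three mentions in that response.
```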

The AI Visibility Tracker also uses raw mention counts to support metrics such as Share of AI Voice.

Citation Rate

Shown as a percentage, the citation rate tracks how frequently a brand is cited as a source within a response. Beyond a simple mention, citations indicate the AI response positioned your brand as a source of information, either with attribution or a link back. 

  • Company Citation Rate: Rate at which your brand is cited as a source across all tracked responses.
  • Competitor Citation Rate: Rate at which other brands in the defined competitor set are cited as a source across tracked responses. 

Note: Broken or inaccurate links may appear due to simulated citations from older LLM models or stale search results.

Sentiment Score

The Sentiment Score reflects the tone of AI-generated language when your brand is mentioned. 

How the Sentiment Score is calculated

The Sentiment Score is measured at the response level and centers around language used in association with the target company, not with competitors, the industry, or problems in general. Sentiment is not measured at the mention level. Every mention of the target company in a single response will be tagged with the same sentiment score. 

The response-level scores are averaged in the tracker to produce a single Sentiment Score that reflects how your brand is discussed most often on a given AI platform or in association with a given topic. 

Importantly, the Sentiment Score does not simply reflect positive or negative keywords. A response can mention problems without reflecting negative sentiment toward the brands addressing those problems. For example, a response that includes the phrase “climate change is devastating, and Apple is leading with net-zero pledges" results in positive sentiment for Apple, not negative sentiment simply because a problem (“climate change”) and a negative keyword (“devastating”) are mentioned. 

3BL’s AI sentiment model analyzes whether the company's actions are favorable, not whether the topic itself is positive.

Positive (Score: 1, or 100%): Your company is portrayed favorably

Example: "Levi Strauss has reduced emissions by 40%."
Example: "Nike has committed to ending child labor in supply chains."

Negative (Score: -1, or -100%): Your company is portrayed unfavorably

Example: "Target has faced multiple data breach lawsuits."
Example: "BP's oil spill in 2010 caused massive environmental damage."

Mixed (Score: 0.5, or 50%): Your company receives both praise and criticism, a common scenario that reflects the real-world nuance brands navigate each day

Example: "Apple leads in accessibility features, but has been criticized for supply chain practices."

Neutral (Score: 0): Your company isn't mentioned at all, or is mentioned in purely factual terms without positive or negative association

Example: "Walmart has 4,700 stores nationwide."
Example: Article discusses ESG trends without specifically praising or criticizing your company

Unknown (Score: 0): The AI couldn't determine sentiment. This is rare and usually indicates a data quality issue.
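The label-to-score mapping and response-level averaging described above can be sketched as follows (the dictionary and function names are illustrative):

```python
SENTIMENT_VALUES = {
    "positive": 1.0,
    "mixed": 0.5,
    "neutral": 0.0,
    "unknown": 0.0,
    "negative": -1.0,
}

def sentiment_score(response_labels: list[str]) -> float:
    """Average response-level sentiment labels into a -1 to 1 score."""
    if not response_labels:
        return 0.0
    values = [SENTIMENT_VALUES[label.lower()] for label in response_labels]
    return sum(values) / len(values)
```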

Sentiment Label | Score Value | Description

Positive | 1 | Favorable language is used in the response when your brand is mentioned

Mixed | 0.5 | Both positive and neutral/negative language is present

Neutral / Unavailable | 0 | No clear sentiment in the response, or the sentiment is not evaluable

Negative | -1 | Unfavorable language is used in the response when your company is mentioned


Key Terms

Column | Description

Prompt | The exact question asked to the AI provider

User Intent | The nature of the question asked to the AI provider, classified into four categories: Recommendations (asking for information to provide a recommendation); Examples (asking for evidence and real-world instances around a given subject); Topic authority (asking about topics of interest, or about companies with respect to a certain topic); Competitive comparison (asking about competitive standing on a given topic based on industry and persona; may use specific company names)

Prompt Topic | Subject matter of the query (e.g., business ethics, employee wellbeing, environmental stewardship)

Company | The company being tracked

Response Text | The full text of the AI response

Response Sentiment | Qualitative sentiment label for the response

Response Model | The AI provider that generated the response

Citations | Source URLs or references included in the response

Target Competitors | Set of competitor companies tracked, if set

Full Metrics Reference

The following table lists all currently defined metrics in the AI Visibility Tracker. 

Metric Name | Format | Description

Visibility Score | 0-100 | Weighted composite index; the primary KPI

Leadership Indicator | Label | Leader / Contender / Needs Improvement classification relative to score thresholds

Query Coverage | % | Percentage of tracked prompts where the company is mentioned

Position Score | 0-100 | Prominence of company mentions within tracked AI responses

Company Mention Rate | % | Rate at which the company is mentioned across all responses

Competitor Mention Rate | % | Rate at which tracked competitors are mentioned across all responses

Company Citation Rate | % | Rate at which the company appears in cited sources

Competitor Citation Rate | % | Rate at which competitors appear in cited sources

Sentiment Score | -1 to 1 | Average sentiment across all evaluated responses, derived from Response Sentiment values

Prompt Responses | Count | Number of AI responses collected

Prompts Asked | Count | Number of unique prompts submitted

The following table lists supporting metrics within the dataset that are used to calculate other metrics but are not displayed directly. 

Metric Name | Format | Description

Company Mentions | Count | Total target company mentions

Competitor Mentions | Count | Total competitor mentions

All Mentions | Count | Total mentions across all tracked companies

Company Citations | Count | Number of times the company is cited as a source

Competitor Citations | Count | Number of times competitors are cited as sources

Total Citations | Count | Total citations across all tracked responses

Important Caveats

This methodology is observational, not causal. The Visibility Score reflects how AI platforms represent a brand in their responses at the time of measurement. It does not predict or guarantee future outcomes.

AI-generated responses are shown as-is, with care taken not to influence them in any way. Generated responses may contain mistakes; 3BL does not verify the accuracy of model responses, it only reports the results.

Metric definitions, score weights, and thresholds are subject to change as the product develops. We will communicate any significant methodology changes through your account team and in product release notes.

Questions about this methodology or your results? Contact your 3BL Client Success Manager.