
A program-level view of comment sentiment — trends, creator rankings, and a searchable stream of every comment across your tracked videos.
Sentiment Radar aggregates comment data across all videos in your project and surfaces how audiences are responding to your program over time. Where the individual video Comments tab shows comments for one video, Sentiment Radar shows the full picture — filterable by campaign and date range.
Find it under Tools → Sentiment Radar in the sidebar.
Two filters, the campaign selector and the date-range picker, control everything on the page at once.
The campaign selector scopes all metrics and the comment stream to a single campaign's videos. When a campaign is selected, the KPI strip, chart, creator leaderboard, and stream all narrow to that campaign's content. Select All campaigns (the default) to see program-wide numbers.
The date-range picker controls which comments are included. The default is the last 30 days. Comments are matched by their original TikTok timestamp, not when they were scraped.
Presets include Last 7 days, Last 30 days, Last 90 days, This month, and All time.
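The presets above can be expressed as simple date arithmetic. This is a minimal sketch, not the picker's actual implementation; the helper name and the exact boundary conventions (end-inclusive, counting today as day one) are assumptions.

```python
from datetime import date, timedelta

def preset_range(preset: str, today: date) -> tuple[date, date]:
    """Return an (start, end) pair for a preset, end-inclusive.
    Hypothetical helper -- the real picker's boundaries may differ."""
    if preset == "Last 7 days":
        return today - timedelta(days=6), today
    if preset == "Last 30 days":
        return today - timedelta(days=29), today
    if preset == "Last 90 days":
        return today - timedelta(days=89), today
    if preset == "This month":
        return today.replace(day=1), today
    if preset == "All time":
        return date.min, today
    raise ValueError(f"unknown preset: {preset}")

start, end = preset_range("Last 7 days", date(2024, 5, 20))
# seven calendar days inclusive: May 14 through May 20
```

Whatever the range, comments are matched by their original TikTok timestamp, so a comment scraped yesterday but posted two months ago falls outside a 30-day window.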
When any video in scope has ≥ 40% negative comments with at least 10 analyzed comments in the selected period, a yellow alert bar appears at the top of the page listing up to three affected videos with their negative share percentage and a link to that video's Comments tab.
The alert can be dismissed. It re-fires if you change the date range (a different range = a new alert key).
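The alert condition and the dismissal behavior described above can be sketched as two small functions. The function names and the key format are assumptions for illustration; only the thresholds (≥ 40% negative, at least 10 analyzed comments) and the range-sensitive dismissal come from the text.

```python
def should_alert(analyzed: int, negative: int) -> bool:
    """True when a video meets the alert condition described above:
    at least 10 analyzed comments and a negative share of 40% or more."""
    if analyzed < 10:
        return False
    return negative / analyzed >= 0.40

def alert_key(campaign_id: str, start: str, end: str) -> str:
    """Hypothetical dismissal key: changing the date range yields a new
    key, which is why a dismissed alert re-fires for a different range."""
    return f"{campaign_id}:{start}:{end}"
```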
Four cards show a summary of comment sentiment in the selected range.
| Card | What it measures |
|---|---|
| Total Comments | All comments scraped for tracked videos in this project. The sub-label shows how many have been analyzed. |
| Positive | Percentage of analyzed comments that scored positive, with the raw count below. |
| Negative | Percentage of analyzed comments that scored negative, with the raw count below. |
| Avg Score | Average AFINN comparative score across all analyzed comments. Scores above zero indicate a positive tone; below zero, a negative one. Typical range is −5 to +5. |
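The four cards derive from one pass over the comments in scope. A sketch of that computation, assuming a minimal comment shape of `{"score": float | None}` where `None` marks an unanalyzed comment; the field names are not the real schema.

```python
def kpi_summary(comments: list[dict]) -> dict:
    """Compute the four KPI cards from comments in the selected range.
    Percentages are of *analyzed* comments, per the table above."""
    analyzed = [c["score"] for c in comments if c["score"] is not None]
    n = len(analyzed)
    pos = sum(1 for s in analyzed if s > 0)
    neg = sum(1 for s in analyzed if s < 0)
    return {
        "total": len(comments),                              # Total Comments
        "analyzed": n,                                       # sub-label
        "positive_pct": round(100 * pos / n, 1) if n else 0.0,
        "negative_pct": round(100 * neg / n, 1) if n else 0.0,
        "avg_score": round(sum(analyzed) / n, 2) if n else 0.0,
    }
```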
A stacked bar chart showing comment volume broken down by sentiment per time bucket.
Only comments with a sentiment label are counted in the chart. Unanalyzed comments appear in the Total Comments KPI but not in the chart.
Use this chart to spot when audience tone shifted — a spike in negative comments on a specific day often correlates with a specific video or creator post.
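The chart's per-bucket counts amount to grouping analyzed comments by day and sentiment label. A sketch under assumed field names (`posted`, `label`); the real bucket size may follow the selected range rather than always being one day.

```python
from collections import defaultdict
from datetime import date

def sentiment_buckets(comments: list[dict]) -> dict:
    """Group comments into per-day stacked-bar counts.
    Unanalyzed comments (label None) are excluded, matching the
    rule that only labeled comments appear in the chart."""
    buckets = defaultdict(lambda: {"positive": 0, "neutral": 0, "negative": 0})
    for c in comments:
        if c["label"] is not None:
            buckets[c["posted"]][c["label"]] += 1
    return dict(buckets)
```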
A leaderboard of up to 10 creators in scope, ranked by average AFINN sentiment score.
Each row shows the creator's handle, comment count, average score, and a mini bar showing the positive / neutral / negative share of their analyzed comments.
Minimum threshold: a creator must have at least 20 analyzed comments in the selected range to appear. This avoids rankings skewed by a single viral comment. Creators below the threshold are omitted — they'll appear once enough comments have been collected for the range.
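The ranking and threshold rules above can be sketched as one sort-and-filter pass. The input shape (`handle`, `scores`) is an assumption; the 20-comment minimum and the 10-row cap come from the text.

```python
def creator_leaderboard(rows: list[dict], min_comments: int = 20,
                        limit: int = 10) -> list[dict]:
    """Rank creators by average AFINN score, best first.
    rows: [{"handle": str, "scores": [float, ...]}], where scores are
    the creator's analyzed comments in the selected range."""
    eligible = [r for r in rows if len(r["scores"]) >= min_comments]
    ranked = sorted(eligible,
                    key=lambda r: sum(r["scores"]) / len(r["scores"]),
                    reverse=True)
    return [{"handle": r["handle"],
             "comments": len(r["scores"]),
             "avg": round(sum(r["scores"]) / len(r["scores"]), 2)}
            for r in ranked[:limit]]
```

Note how a creator with five glowing comments is simply omitted rather than ranked first, which is the skew the threshold is designed to avoid.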
A scrollable, paginated feed of all comments in scope. New pages load automatically as you scroll.
Filter the stream to a specific sentiment bucket:
| Tab | Shows |
|---|---|
| All | Every comment regardless of sentiment |
| Positive | Comments scored as positive |
| Neutral | Comments scored as neutral |
| Negative | Comments scored as negative — useful for triage |
Type in the search box to filter comments by text content. The search is case-insensitive and matches any substring. Results update after a short debounce.
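Combined, the tab and the search box act as two stacked filters over the stream. A sketch of that behavior, assuming a comment shape of `{"label", "text"}`; the debounce is a UI concern and is omitted here.

```python
def filter_stream(comments: list[dict], tab: str = "All",
                  query: str = "") -> list[dict]:
    """Apply the sentiment tab, then the case-insensitive
    substring search, to the comment stream."""
    q = query.lower()
    out = []
    for c in comments:
        if tab != "All" and c["label"] != tab.lower():
            continue
        if q and q not in c["text"].lower():
            continue
        out.append(c)
    return out
```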
Each comment shows the video it came from (thumbnail + caption) and the creator's name, so you can jump directly to the video Comments tab for more context. Click the video thumbnail or caption to open the video page.
Every comment is scored by the scraper using the AFINN word list, a lexicon-based sentiment analysis method. Each word in the comment is matched against the AFINN list and given a score from −5 (most negative) to +5 (most positive). The overall score is the sum of the matched words' scores divided by the total word count (the "comparative" score).
Comments that consist only of emoji, images, or non-ASCII text may not be scored and will show as unanalyzed.
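The scoring method above is simple enough to show end to end. This sketch uses a toy four-word lexicon in place of the real AFINN list (which has roughly 3,300 entries) and naive whitespace tokenization; the real scraper's tokenizer may differ.

```python
# Toy lexicon standing in for the real AFINN word list.
TOY_AFINN = {"love": 3, "great": 3, "bad": -3, "terrible": -3}

def comparative_score(text: str, lexicon: dict = TOY_AFINN):
    """Sum the scores of matched words, divide by total word count.
    Returns None when there are no words to score (e.g. an
    emoji-only comment), which shows up as 'unanalyzed'."""
    words = text.lower().split()
    if not words:
        return None
    total = sum(lexicon.get(w, 0) for w in words)
    return total / len(words)
```

For example, "this is great" scores 3 across 3 words, a comparative score of 1.0, while unmatched words dilute the score toward zero.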
Comments are scraped automatically on a 6-hour cycle. A video's comments are only scraped once — on the first pass after it is discovered. Re-scrapes are not performed automatically (this avoids redundant Apify actor runs).
Because comments are only scraped once, the data here reflects each video's comment section as of its first scrape; comments posted after that pass are not captured automatically. To check when a specific video was last scraped, open the video detail page and look at the scrape freshness badge.