IMDb vs Rotten Tomatoes: Which Rating Should You Actually Trust?
If you've ever looked up the same film on IMDb and Rotten Tomatoes and seen wildly different scores, you've felt the problem. Joker (2019): 8.4 on IMDb, 68% on Rotten Tomatoes. The Last Jedi: 7.0 on IMDb, 91% on RT. Cats (2019): 2.8 on IMDb, 19% on RT. Sometimes both rating systems agree. Often they don't. So which one should you actually trust?
Short answer: neither, individually. Both, together. Here's what each rating actually measures, where they're each strongest, and how to read them in combination.
What IMDb actually measures
The IMDb score is a weighted average of registered IMDb user ratings, on a 1-10 scale. The weighting algorithm is opaque: IMDb describes it only as "Bayesian" and adjusts for things like vote count and rating distribution to suppress brigading. But the raw signal is "what did the audience think?"
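IMDb doesn't disclose its current per-title formula, but the general shape of a Bayesian weighted average is well known. Here's a minimal sketch using the formula IMDb historically published for its Top 250 chart; the parameter values below are invented for illustration:

```python
def bayesian_weighted_rating(mean_rating, num_votes, prior_mean, min_votes):
    """Shrink a film's raw mean toward a site-wide prior when votes are few.

    This is the formula IMDb historically published for its Top 250 chart;
    the site's current per-title algorithm is undisclosed, so treat this as
    an illustration of the idea, not the real thing.
    """
    v, m = num_votes, min_votes
    return (v / (v + m)) * mean_rating + (m / (v + m)) * prior_mean

# A film rated 9.0 by only 50 voters gets pulled hard toward the
# site-wide average (say 6.9); one with 500,000 votes barely moves.
print(round(bayesian_weighted_rating(9.0, 50, 6.9, 25_000), 2))       # → 6.9
print(round(bayesian_weighted_rating(9.0, 500_000, 6.9, 25_000), 2))  # → 8.9
```

The practical effect: a handful of enthusiastic (or hostile) votes can't drag a score far from the site average, which is exactly the anti-brigading property IMDb wants.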
Strengths:
- Huge sample sizes (popular films routinely have hundreds of thousands of votes)
- Updates continuously as a film stays in circulation
- Captures audience appeal across all demographics, not just film critics
Weaknesses:
- Vulnerable to brigading on politically or culturally divisive films (most famously, the post-release vote storms on big franchise releases)
- Skews toward fan-favourite franchises (Marvel, Star Wars, etc.) which often get inflated scores from fan voting
- The displayed (weighted) score doesn't always match the raw arithmetic mean of the votes, and IMDb doesn't explain the gap
- Underrates challenging or arthouse cinema: audiences who don't engage with slow films don't vote on them, and the few who do don't always like them
What Rotten Tomatoes actually measures
The Tomatometer is the percentage of approved film critics who gave a film a positive review. That's it. A 91% Tomatometer means 91 of every 100 critics whose reviews were counted wrote a positive review. It is not a quality score, not an average, not a rating. It's a binary "thumbs up" count.
This is the most misunderstood thing about Rotten Tomatoes. A film with 91% means almost every critic liked it, but it doesn't tell you how much. A film could be 91% Fresh because every critic gave it a B+. Another film could be 73% Fresh because the critics who liked it gave it A+ and the few who hated it gave it F.
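That distinction is easy to see in a few lines of code. The review scores below are made up, and the 6/10 positivity threshold is an illustrative stand-in (in reality each critic's review is individually classified as positive or negative):

```python
# Hypothetical review sets showing why the Tomatometer hides magnitude.
# Each review is a 0-10 score; the Tomatometer only records whether it
# clears the "positive" bar (here, >= 6).
film_a = [7, 7, 7, 7, 7, 7, 7, 7, 7, 5]         # uniformly mild praise
film_b = [10, 10, 10, 10, 10, 10, 10, 2, 1, 1]  # loved or loathed

def tomatometer(reviews, threshold=6):
    fresh = sum(1 for r in reviews if r >= threshold)
    return round(100 * fresh / len(reviews))

print(tomatometer(film_a))  # → 90, though no critic gave it more than 7/10
print(tomatometer(film_b))  # → 70, though most critics called it a masterpiece
```

Film A "beats" film B on the Tomatometer despite film B having far more passionate support. The percentage tells you breadth of approval, never depth.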
RT also has an "Audience Score" (the popcorn icon), which is a separate average of audience ratings. This is closer to IMDb in concept but with a smaller sample.
Strengths:
- Critic consensus is genuinely useful for films: it filters out the worst marketing-driven productions
- Less vulnerable to brigading because critic reviews are vetted
- Rewards films that are well-made even if they don't have mass appeal
Weaknesses:
- Binary scoring loses nuance: a critically divisive masterpiece and an inoffensive crowd-pleaser can get the same Tomatometer
- For TV series, the Tomatometer often only reflects Season 1 and doesn't update as the show evolves
- Genre bias: critics famously underrate horror and comedy
- The audience score is much less useful: small samples, easily brigaded
When they disagree, which is right?
The interesting cases are when IMDb and Rotten Tomatoes diverge. There are four common patterns:
| IMDb | RT | What it usually means |
|---|---|---|
| High | Low | Audience-pleaser that critics dismissed (often franchise films, blockbusters, broad comedies) |
| Low | High | Critically-loved arthouse or challenging film that audiences found difficult |
| High | High | Genuine consensus, both critics and audiences agreed. Most reliable signal. |
| Low | Low | Bad film by both standards. Skip. |
The most useful pattern for picking what to watch: when both scores agree, trust them. When they disagree, ask which side of the disagreement you usually fall on. If you tend to enjoy fan-favourite franchises, IMDb wins ties. If you prefer arthouse or critically-loved cinema, Rotten Tomatoes wins ties.
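The decision rule above fits in a few lines. The 7.0 IMDb and 70% RT thresholds are arbitrary illustrative cut-offs, not anything either site defines:

```python
def read_the_split(imdb, rt, taste="franchise"):
    """Classify a title into the four agreement patterns in the table.

    `taste` breaks ties on disagreements: "franchise" trusts IMDb,
    "critic" trusts Rotten Tomatoes. The 7.0 / 70 thresholds are
    made-up cut-offs for illustration only.
    """
    imdb_high, rt_high = imdb >= 7.0, rt >= 70
    if imdb_high and rt_high:
        return "consensus pick"
    if not imdb_high and not rt_high:
        return "skip"
    if imdb_high:  # high IMDb, low RT: the audience-pleaser pattern
        return "crowd-pleaser" if taste == "franchise" else "be cautious"
    # low IMDb, high RT: the arthouse pattern
    return "arthouse pick" if taste == "critic" else "be cautious"

print(read_the_split(8.7, 93))                   # → consensus pick
print(read_the_split(8.7, 42))                   # → crowd-pleaser
print(read_the_split(5.5, 89, taste="critic"))   # → arthouse pick
print(read_the_split(4.5, 18))                   # → skip
```

Only the two disagreement quadrants depend on your taste; the two agreement quadrants are the same call for everyone.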
The shortcut: weight by content type
If you only want to remember one rule:
- For films, weight Rotten Tomatoes higher. Critic reviews appear quickly (often before release), the Tomatometer reliably cuts out the worst films, and RT's audience scores are less reliable anyway.
- For TV series, weight IMDb higher. RT often only reflects Season 1 critic reviews, while IMDb updates continuously as a show progresses. A 9.0 IMDb on Season 5 of a show is more meaningful than the original 91% Tomatometer from Season 1.
The case for using both at once
The single biggest win is having both numbers visible at the same time. If you're glancing at a Netflix thumbnail and see "8.7 IMDb / 93% RT", you know it's a genuine consensus pick: audiences and critics agree. If you see "8.7 IMDb / 42% RT", you're looking at a fan favourite that critics didn't love (which might still be exactly what you want, but now you know what you're getting). If you see "5.5 IMDb / 89% RT", you've found something arthouse; proceed with intent.
This is why Flix-Rate Pro shows both scores side by side on every Netflix thumbnail. The disagreement between the two scores is often more informative than either score alone.
See both scores on every Netflix title
Flix-Rate Pro overlays IMDb and Rotten Tomatoes ratings side by side on every Netflix thumbnail. £4.99/year, currently £4.99 lifetime on the launch deal.
Add Flix-Rate to Chrome
Worked examples
Parasite (2019): 8.5 IMDb / 99% RT. Genuine consensus, both audiences and critics agreed. Watch it.
Joker (2019): 8.4 IMDb / 68% RT. Audience-pleaser, critics divided. The split is the film's whole identity: it's divisive on purpose. If you like character-driven drama with a controversial protagonist, you'll love it. If you read film criticism, you'll see the structural issues critics flagged.
Tár (2022): 7.4 IMDb / 91% RT. Critically loved, audience split. Long, slow, demanding. If you enjoy that kind of cinema, the RT score is the right signal. If you prefer accessible films, the IMDb score is closer to what your experience will be.
The Room (2003): 3.7 IMDb / 25% RT. Bad by both measures. Famous as so-bad-it's-good cult cinema, which is the only reason to watch it.
FAQ
Are these ratings ever wrong?
Both are aggregations and have biases. IMDb gets brigaded on politically charged films. RT critics underrate horror and comedy. Neither is a substitute for actually thinking about whether you'll enjoy a film. The most useful thing they do is filter out clear duds: a film with 4.5 IMDb and 18% RT is almost never worth your evening.
What about Letterboxd?
Letterboxd has the most thoughtful audience ratings of the three but smaller sample sizes and skews towards cinephiles. If you want a fourth opinion on whether something arthouse is worth your time, Letterboxd is the place. For mainstream films, IMDb and RT are sufficient.
What about Metacritic?
Metacritic does what many people assume Rotten Tomatoes does: it averages numerical critic scores rather than counting binary positive/negative reviews. Arguably more accurate, but it covers fewer films. For our purposes RT is the more practical option, simply because more films have RT scores than Metacritic scores.