With all the controversy surrounding the petition that DC fans are signing to shut down Rotten Tomatoes, as well as my increased tendency to cite the site in my columns, I thought it best to clarify why (and more importantly how) I use Rotten Tomatoes’ scores. If you haven’t already read my first column on how to read the Tomatometer, the short version is this: Rotten Tomatoes tallies all critics’ reviews and turns the share of positive ones (6.0+/10) into a percentage, which makes up the Tomatometer score (e.g. 8 positive critic reviews out of 10 total will make the Tomatometer read 80%). The Tomatometer tells you whether critics liked a movie, while the oft-overlooked Average Critic Score (the rating out of 10 that appears under the Tomatometer on the full site) tells you to what degree they liked it. I’ve found Rotten Tomatoes to be a useful resource for supporting arguments and predictions in my columns.
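For readers who like seeing the arithmetic, here is a minimal Python sketch of how the two numbers differ (the review scores are made up; this is an illustration, not Rotten Tomatoes’ actual code):

```python
# Each review is a score out of 10; 6.0 or higher counts as "Fresh" (positive).
reviews = [8.0, 7.5, 6.0, 5.0, 9.0, 4.5, 6.5, 7.0, 3.0, 8.5]  # hypothetical critics

fresh = [r for r in reviews if r >= 6.0]

# Tomatometer: the percentage of reviews that are positive at all.
tomatometer = round(100 * len(fresh) / len(reviews))

# Average Critic Score: the degree to which critics liked the movie.
average_score = round(sum(reviews) / len(reviews), 1)

print(f"{tomatometer}%")      # 70% -- 7 of the 10 reviews are positive
print(f"{average_score}/10")  # 6.5/10
```

Note that the two can diverge: a film with many mild 6/10 reviews scores a high Tomatometer but a middling average, which is exactly why the Average Critic Score is worth checking.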
Why Not IMDb or Metacritic?
Why do I use Rotten Tomatoes over other movie rating sites like IMDb or another aggregate like Metacritic? The short answer is reputability, sample size, and objectivity.
While IMDb has no shortage of sample size (it actually provides a far larger sample of movie ratings than RT does) and scores are available for even the most obscure films, its reputability can easily be called into question. For one, anybody in the world can create an IMDb account and start rating movies. Further, anybody can create multiple IMDb accounts and rate the same movie multiple times with those accounts. Moreover, any group of people can organize raids, creating accounts en masse to rate movies according to a common agenda. That very situation happened in 2008 with The Dark Knight and The Godfather: Dark Knight fanboys organized a group vote to give The Godfather 1 star out of 10 in order to position The Dark Knight for the #1 spot on the IMDb Top 250. A score that can be manipulated like that is not, in my opinion, a reputable source.
So why not Metacritic? While review aggregator Metacritic has a similar idea to Rotten Tomatoes, it is very different in execution and philosophy.

First, Metacritic collects a much smaller sample than Rotten Tomatoes. At the time of writing, Suicide Squad has 261 reviews on RT, while Metacritic has 53. When gathering data, I prefer as large a sample size as possible so that results aren’t skewed whenever outliers occur. Larger samples also reduce the possibility of cherry-picked data. Given that Metacritic has only 53 reviews total while Rotten Tomatoes has 261, 68 of them positive, it would be easy for Metacritic to selectively choose every positive Suicide Squad review, generate a very high score, and create a false consensus. Obviously it hasn’t done that in this case (Suicide Squad sits at 40/100 at the time of writing), but its sample size is exploitable. Whenever I cite a Tomatometer score with fewer than 100 reviews (the threshold I feel comfortable calling a “consensus”), I note it next to the score.

Secondly, while Rotten Tomatoes uses the Tomatometer to convey how many critics liked a movie, Metacritic uses a 0-100 scale that converts reviewers’ stars/grades/numerical scores. The Metascore is closer to RT’s Average Critic Score than to the Tomatometer. Metacritic’s scale is a bit misleading if you read it as an American school grade (i.e. C is 75, B- is 80, etc.), but the site provides a visual guide to how each grade is mapped (i.e. F is 0, C is 50, B- is 67, A- is 91, etc.). While I don’t have an issue with the scale itself, I do have an issue with how the aggregate is weighted. Under the subheading “Why the term ‘weighted average’ matters”, the site reads:
Metascore is a weighted average in that we assign more importance, or weight, to some critics and publications than others, based on their quality and overall stature.
I believe that film is art and that art is subjective. Criticism of something subjective is, by definition, opinion, and as many people learned in grade school, there are no right or wrong opinions. You wouldn’t call somebody an idiot for liking strawberry ice cream or say somebody was more qualified to be President because their favorite color was blue. In the same way, while film critics may have more experience with movies, what they say is ultimately opinion. Metacritic’s weighted average assigns greater importance to the opinions of certain critics and, in doing so, implies that those critics’ opinions are more right than others’. No one person’s opinion is more right than another’s. I refuse to cite data from a website that asserts that premise (even if unintentionally).
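To make the objection concrete, here is a toy Python comparison of an equal-weight average against a weighted one. The scores and “stature” weights are invented purely for illustration; Metacritic does not publish its actual weights.

```python
scores  = [90, 40, 60]      # three hypothetical critics' 0-100 scores
weights = [3.0, 1.0, 1.0]   # made-up weights favoring the first critic

# Equal-weight average: every opinion counts the same.
equal = sum(scores) / len(scores)

# Weighted average: each score is multiplied by its critic's weight.
weighted = sum(s * w for s, w in zip(scores, weights)) / sum(weights)

print(round(equal, 1))     # 63.3 -- all three opinions count equally
print(round(weighted, 1))  # 74.0 -- the favored critic pulls the score up
```

With identical inputs, tripling one critic’s weight swings the aggregate more than ten points toward that critic’s view, which is exactly the effect being objected to here.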
I use Rotten Tomatoes because it reports the largest sample size of critics and uses a unique aggregate score (the Tomatometer) along with average critic scores. Now, Rotten Tomatoes isn’t perfect and can’t capture the initial consensus of films prior to the site’s launch in mid-1998. However, I regard it as the best tool available for reference to critical consensus and projecting whether or not I’ll like a film. Like all tools, I do believe that the site was designed for a specific purpose and there are ways to use it correctly and incorrectly.
What I DON’T Use Rotten Tomatoes for:
- Forming opinions on movies for me – Going into a movie, I try to keep as open a mind as possible. Often I’ll avoid looking at RT scores entirely for movies I’m about to watch. I’m a big believer in the psychological phenomenon of confirmation bias (recontextualizing new experiences/evidence to fit an existing belief or preconceived notion), so I try not to influence my own thoughts before watching a film. Furthermore, I’ve often disagreed with RT scores (Iron Man 3’s 7/10 on RT vs. my 2/5, Spectre 6.4/10 vs. 1.5/5, Godzilla ’14 6.6/10 Certified Fresh vs. 2/5, Kingsman 6.8/10 vs. 4.5/5, Primer 6.6/10 vs. 4.5/5, Carol 8.6/10 CF vs. 2.5/5, Charlie and the Chocolate Factory 7.2/10 CF vs. 2.5/5, etc.), so I’m not a slave to critical consensus.
- Asserting the objective quality of a movie – Like I said before, all film is subjective. While I do often refer to RT scores, I never use them to validate a personal opinion on the quality of the movie itself. I will never point to a particular film’s RT score and say, “See? Look how high its score is. It must be good.” It’s annoying enough when people do this with Academy Awards (another column for another day), so I don’t do it with reviews.
What I DO Use Rotten Tomatoes for:
- Noting trends in public perception/critical acclaim – This is probably my most common use of RT and will likely continue in future columns. My most recent usage was in my Marvel Cinematic Universe column, noting that critical acclaim allowed Marvel Studios to push forward with more blockbuster projects and set trends that continue into the present.
- Assessing general consensus – While I will not use RT as reference for my own opinion, I can use it to reference the general consensus of the critics or the RT audience. A good example would be claiming that the overwhelming majority of critics and moviegoers hated the 2015 reboot Fant4stic, given its 9% Tomatometer with a 3.4/10 critic average and 19% audience score with a 2.1/5 average. Appealing to the Tomatometer is useful when trying to assess a film or franchise’s broad appeal and goodwill with an audience.
- Making predictions about future aspirations for a particular film – While critical and public perception aren’t useful for validating the inherent quality of a film, they are useful for things like box office predictions, awards nomination hopes, sequel aspirations, and studio reactions. Critical reception does matter when it comes to this stuff. Going back to our Fant4stic example, the overwhelming hatred likely contributed to it bombing at the box office ($168M on a $120M budget), winning Worst Picture at the Golden Raspberry Awards (the Academy Awards/Oscars for poorly received movies), making a Fant4stic sequel an incredibly ill-advised financial risk, and 20th Century FOX cutting ties with director Josh Trank.
- Making predictions about future releases from a particular director, movie studio, etc. – In my Pixar column, I predicted that future Pixar releases wouldn’t be as strong or as consistently appealing to all audiences as they were before 2011. There’s only been one release since that column (Finding Dory), so it will be a while before my prediction is proven right or wrong (and I sincerely hope I’m proven wrong and that Pixar hits another 15-year stride [especially with the Toy Story franchise on the line]). However, going back to our Fant4stic example, I will put down money that a sequel to Josh Trank’s version of the Fantastic Four with all four cast members returning will never happen (contrary to producer Simon Kinberg’s insistence on making it happen). Too much money and audience goodwill has been lost for FOX to put out a sequel.
- Deciding if I’ll spend my hard-earned money on a film that I’m on the fence about – While I don’t always agree with RT, the Tomatometer is a useful gauge for the probability that I’ll like a film. Do you remember those Pepsi/Coke commercials where people did blind taste tests as to which one they preferred? It’s easier to think of critics as a sample of the population “taste testing” each film and the Tomatometer as the results.
So why did I explain the Tomatometer for the 17th time? Because I love writing columns and watching movies, but unfortunately this isn’t my day job. My money is limited, and where I spend it counts as a vote toward what kinds of movies I want from studios in the future. If a majority of people dislike a release I was on the fence about (e.g. Jason Bourne 57% 5.9/10, Chappie 32% 4.9/10, In the Heart of the Sea 43% 5.5/10, Triple 9 55% 5.8/10, etc.), I’ll save my money for a movie that I’m either excited for or think deserves it. As a person who wants movie studios to keep putting out films of the highest quality possible, I’d prefer not to spend my money on films that are mediocre, unappealing, or just plain terrible. There are certain releases I’ll see regardless of whatever RT score they end up with (films from Marvel, Pixar, Disney, Refn, Nolan, Tarantino, Fincher, etc.), but for movies I’m lukewarm on, I’d rather not take the chance in theaters barring incredible word of mouth.
All in all, Rotten Tomatoes is simply a tool. As a review aggregator, it does not invent its own percentage; getting mad at it for calculating scores is like getting mad at a newspaper for reporting the news. Once again, I find Rotten Tomatoes to be a reliable predictor of whether I’ll like a film and a useful basis for predictions about future releases. Further, it provides a large, reputable sample size and weighs every critical opinion equally. As far as I’m concerned, Rotten Tomatoes is one of a kind and currently the best critical film review tool available to the public.