Aggregate Sites Are Rotten

Janani Ravikumar
Staff Writer

For years, sites such as IMDb, Metacritic and Rotten Tomatoes have provided moviegoers with ratings for newly released films by consolidating the reviews of renowned critics around the world. This has allowed people to pick and choose which movies they want to watch in theatres well in advance, weeding out the bad movies and choosing only the good.

This works well in theory: it ensures that people watch only the best movies without wasting their time and money on lower-quality ones. But such a system makes it easy for perfectly decent movies to slip through the cracks when they fail to receive enough recognition from critics to be rated highly on such sites.

Many of these aggregate sites operate by assigning a score to newly released movies based on the reviews of renowned critics around the world. According to its website, Metacritic obtains its ratings by assigning each review from some of the world’s most respected critics a score from 0 to 100, then applying a weighted average to normalize the scores, much like a class’s grading curve. Under this weighted average, some critics and publications carry more weight than others, and users’ votes are not included in the final score the site presents. Scores from 61 to 100 are highlighted green, indicating positive reviews; scores from 40 to 60 are highlighted yellow, indicating mixed or average reviews; and scores from 0 to 39 are highlighted red, indicating negative reviews.
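To make the arithmetic concrete, here is a minimal sketch of how such a weighted average and color banding could be computed. The critic weights and scores below are invented for illustration; Metacritic does not publish the weights it actually assigns.

def metascore(reviews):
    # Weighted average of (score, weight) pairs on a 0-100 scale.
    total_weight = sum(weight for _, weight in reviews)
    return round(sum(score * weight for score, weight in reviews) / total_weight)

def color_band(score):
    # Metacritic-style color bands for a 0-100 score.
    if score >= 61:
        return "green"   # positive reviews
    if score >= 40:
        return "yellow"  # mixed or average reviews
    return "red"         # negative reviews

# Hypothetical reviews: (critic score, critic weight)
reviews = [(85, 1.5), (70, 1.0), (55, 0.75)]
score = metascore(reviews)
print(score, color_band(score))  # 73 green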

Similarly, Rotten Tomatoes operates on a rating scale, the “Tomatometer,” which assigns movies a percentage based on the opinions of hundreds of film and television critics, according to the official site. Movies and TV shows with a Tomatometer score of 60 percent or above are flagged with the image of a healthy-looking red tomato and labeled “fresh.” Those with a score of 59 percent or lower are flagged with an unhealthily green, squashed-looking tomato and labeled “rotten.”

In addition to scores based on critics’ reviews, Rotten Tomatoes includes an audience score in its ratings, indicating the percentage of users who rate a movie or TV show positively. A full, upright popcorn bucket denotes a movie or TV show rated 3.5 stars or higher by users, while a tipped-over popcorn bucket denotes a score lower than 3.5 stars.
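Read as a pair of simple threshold checks, the two labels might look roughly like the sketch below. The 60 percent and 3.5-star cutoffs come from the site’s published rules, while the sample data is made up for illustration.

def tomatometer(critic_reviews):
    # Percentage of critic reviews judged positive (True).
    positive = sum(1 for review in critic_reviews if review)
    return round(100 * positive / len(critic_reviews))

def critics_label(percent):
    # 60 percent or above is "fresh"; 59 percent or lower is "rotten."
    return "fresh" if percent >= 60 else "rotten"

def audience_label(average_stars):
    # 3.5 stars or higher earns the upright popcorn bucket.
    return "upright bucket" if average_stars >= 3.5 else "tipped-over bucket"

# Hypothetical data: 7 of 10 critics positive, audience average of 3.2 stars
print(critics_label(tomatometer([True] * 7 + [False] * 3)))  # fresh
print(audience_label(3.2))                                    # tipped-over bucket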

When people place so much faith in websites like Metacritic and Rotten Tomatoes, they are essentially swearing by a handful of opinions that cannot possibly begin to represent the entire audience accurately. They let a comparatively small group of people speak for everyone (a well-read and knowledgeable group, but one with its own biases and prejudices), and then they treat those opinions as if they were the absolute truth.

This steers people away from more “mediocre” movies, which they may find just as good as, if not better than, what the critics deem “the best of the best,” because they assume by default that movies with average scores are of lower quality.

When it comes to consuming media, people should be free to make their own judgments about what they watch. While it’s certainly nice to have these ratings so readily available when we don’t know whether a certain movie or TV show is worth investing our time and money in, it’s important to remember that only we can truly speak for ourselves. What renowned critics deem “the best of the best” may not necessarily align with our own favorites, and that makes us no less intelligent or cultured.