Is Rotten Tomatoes 'Erratic, Reductive, and Easily Hacked'? (vulture.com)
Rotten Tomatoes celebrated its 25th year of assigning scores to movies based on their aggregated reviews. Now Vulture writes that Rotten Tomatoes "can make or break" movies, "with implications for how films are perceived, released, marketed, and possibly even green-lit". But unfortunately, the site "is also erratic, reductive, and easily hacked."
Vulture tells the story of a movie-publicity company contacting "obscure, often self-published critics" to say the film's teams "feel like it would benefit from more input from different critics" — while making undisclosed payments of $50 or more. A critic asking if it's okay to pan the movie was informed that "super nice" critics move their bad reviews onto sites not included in Rotten Tomatoes scores.
Vulture says after bringing this to the site's attention, Rotten Tomatoes "delisted a number of the company's movies from its website and sent a warning to writers who reviewed them." But is there a larger problem? Filmmaker Paul Schrader even opines that "Audiences are dumber. Normal people don't go through reviews like they used to. Rotten Tomatoes is something the studios can game. So they do...." A third of U.S. adults say they check Rotten Tomatoes before going to the multiplex, and while movie ads used to tout the blurbage of Jeffrey Lyons and Peter Travers, now they're more likely to boast that a film has been "Certified Fresh...."
Another problem — and where the trickery often begins — is that Rotten Tomatoes scores are posted after a movie receives only a handful of reviews, sometimes as few as five, even though those reviews may be an unrepresentative sample. This is sort of like a cable-news network declaring an Election Night winner after a single county reports its results. But studios see it as a feature, since, with a little elbow grease, they can sometimes fool people into believing a movie is better than it is.
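The sampling problem described here can be sketched numerically. A minimal Python simulation — all numbers below are invented for illustration, not real Rotten Tomatoes data — shows how a Tomatometer computed from the first five reviews can land far from the score the full critic pool would produce:

```python
import random

# Hypothetical movie: its "true" fresh rate across all 100 eventual
# critics is 45% (1 = fresh review, 0 = rotten review).
random.seed(1)
all_reviews = [1] * 45 + [0] * 55

def tomatometer(reviews):
    """Percent of reviews that are positive, rounded like the site's score."""
    return round(100 * sum(reviews) / len(reviews))

# Score from only the first five (possibly hand-picked) reviews,
# versus the score once every critic has weighed in.
early_sample = random.sample(all_reviews, 5)
print("early score:", tomatometer(early_sample))
print("final score:", tomatometer(all_reviews))
```

With only five reviews, the early score can only take the values 0, 20, 40, 60, 80, or 100 — so even an honestly drawn sample is coarse, and a sample skewed by friendly-critic invitations can sit dozens of points above the eventual number, as in the Quantumania example below.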
Here's how. When a studio is prepping the release of a new title, it will screen the film for critics in advance. It's a film publicist's job to organize these screenings and invite the writers they think will respond most positively. Then that publicist will set the movie's review embargo in part so that its initial Tomatometer score is as high as possible at the moment when it can have maximal benefits for word of mouth and early ticket sales... [I]n February, the Tomatometer score for Ant-Man and the Wasp: Quantumania debuted at 79 percent based on its first batch of reviews. Days later, after more critics had weighed in, its rating sank into the 40s. But the gambit may have worked. Quantumania had the best opening weekend of any movie in the Ant-Man series, at $106 million. In its second weekend, with its rottenness more firmly established, the film's grosses slid 69 percent, the steepest drop-off in Marvel history.
In studios' defense, Rotten Tomatoes' hastiness in computing its scores has made it practically necessary to cork one's bat. In a strategic blunder in May, Disney held the first screening of Indiana Jones and the Dial of Destiny at Cannes, the world's snootiest film festival, from which the first 12 reviews begot an initial score of 33 percent. "What they should've done," says Publicist No. 1, "was have simultaneous screenings in the States for critics who might've been more friendly." A month and a half later, Dial of Destiny bombed at the box office even though friendly critics eventually lifted its rating to 69 percent. "They had a low Rotten Tomatoes score just sitting out there for six weeks before release, and that was deadly," says a third publicist.
Working as it should (Score:3)
Re: (Score:2)
Re: (Score:3)
Anything that doesn't get at least 85% on both the critic and audience scores isn't worth seeing in the theater
Audience scores are more useful in my experience. The Orville's first season was panned by critics and beloved by audiences. Sometimes the "professionals" just don't get it.
Re: Working as it should (Score:2)
As a rule, for modern releases I'd take a low critic score with a high audience score to indicate something worth watching. Dave Chappelle's stand-up specials illustrate this. Same with Ricky Gervais stand-up.
Critics reviewing content intended for sale to normal people makes as much sense as having Muslims reviewing gay cinema.
Re: Working as it should (Score:1)
Re: (Score:2)
Wherefore Metacritic? (Score:3)
Re: (Score:3)
Re: (Score:2)
I don't think RT had audience scores when it started. When it was just an aggregator for critic reviews, it was a somewhat studio-friendly resource as well as being easy for a viewer to understand (a modern version of the "two thumbs up" metric). Also, in a time before everybody and their mother could be a movie critic, having an easy-to-access reference to the real critics was useful.
Today though I think you are correct, RT is just a rehash of what metacritic is, IMDB ratings are kinda a who cares now and rea
Re: (Score:2)
I'm a little unclear on why Rotten Tomatoes won the meta-ratings contest; Metacritic seems as though it's always been the superior option. Not that it would be immune to the problem in the article.
Simple: It publishes a big, easy number at the top.
Is it gamed? Yes. Is it paid for? Absolutely. But most of the time the score is near enough.
(this is definitely an exception to Betteridge's law)
E.g., how can movies that aren't released until next week be "Top Streaming"?
The trick is to look at both the critics score and the public score. Big discrepancy? Proceed with caution.
Re: (Score:1)
All reviews are inherently troublesome (Score:3)
... and should be read with skepticism
This includes all reviews, for music, movies, products, etc
Leaving out paid fraud, even honest reviews are troublesome
Reviewers use different criteria based on their personal opinions
Some value political correctness or pop trendiness, while others value skill and artistry
A random review from an unknown critic is the worst
The best approach is to calibrate your judgement of a known reviewer. This can work in both directions: sometimes a reviewer likes the same things I do, sometimes a reviewer hates everything I like. As long as they are consistent, their reviews are useful once calibrated to my taste.
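The calibration idea above can be sketched in code: compare your own past ratings with a critic's using Pearson correlation. A minimal Python example — every name and score here is made up for illustration, not drawn from any real critic:

```python
from math import sqrt

my_scores       = [8, 3, 9, 2, 7, 5]   # my ratings of six movies (0-10 scale)
reviewer_scores = [7, 4, 9, 3, 6, 6]   # the same six movies, rated by one critic

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(my_scores, reviewer_scores)

# r near +1: the critic likes what I like.  r near -1: consistently the
# opposite, which is still useful once you invert their verdict (the
# "works in both directions" point above).  r near 0: no signal.
if r > 0.5:
    verdict = "trust their recommendations"
elif r < -0.5:
    verdict = "trust the opposite of their recommendations"
else:
    verdict = "ignore this critic"
print(round(r, 2), verdict)
```

The consistency requirement in the comment is exactly what correlation measures: a critic who hates everything you love is just as calibratable as one who shares your taste.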
A smidgen of restraint.... (Score:2)
I find it both interesting and disappointing that so many people rely on a single site for all the information they'll use to make a purchasing decision. "But what if it's a really good site??" ... It's still a single point of failure, plus it uses only a few opinions to come up with what are apparently extremely influential review scores.
Maybe watch the trailer, and if the company isn't honest, _fuck them_. Simply don't go and see the next movie in theaters. Given that theater numbers are still super importa
Re: (Score:2)
Maybe watch the trailer
No. Never, ever watch movie trailers. You want to ruin a movie? Watch the trailer.
Every crowdsourced rating system is gamed (Score:2)
The idea of crowdsourced rating systems was great, and it worked for the first few years that they were generally available. Then the 10% of the world's population who are grifters and just plain enjoyers of destruction joined forces with the organized pumpers and scammers[1]. As of 2023, one has to assume that every crowdsourced rating system is being gamed by from 2 to n parties for multiple purposes. If you are very careful you can glean a little insight from a carefully chosen few, but generally the
Re:Every crowdsourced rating system is gamed (Score:4, Insightful)
Google Maps reviews are no longer reliable. I've had two reviews for negative experiences (one restaurant and one car repair shop) shadow-banned for no apparent reason. Nothing in my reviews broke the plain English reading of the TOS or walked up to the line of breaking it. Anecdotally, in talking to friends and colleagues, my experiences with these two businesses are not unique, yet they've got aggregate ratings >4.6 stars, and one has to wonder how many other negative reviews have been hidden from public view.
My tinfoil-hat side wonders if Google isn't slyly protecting businesses that purchase AdWords. More likely, though, they've outsourced TOS enforcement to bots and algorithms that suck. If that's what they've done, the algorithms are failing on both ends: it's not too hard to find businesses whose ratings were destroyed for unrelated reasons, like a local elected official whose business was bombed with negative reviews by political partisans, which is a violation of the TOS, and yet hundreds of those reviews remain online as of this writing.
The Internet used to be useful for product and business reviews but between the above, astroturfing, and reviewers that get bought off with early releases and other perks, it's almost impossible to find reviews of value.
Re: (Score:2)
"Street parking was full when I arrived. Food was good. 2 stars"
Re: (Score:2)
I mean, if all you have is street parking, I'm probably not going to bother if I can't even find parking. It's just food after all.
Problem with 'professional' vs ... (Score:2)
The biggest problem is that these movies consistently score high with the so-called professional reviewers, who get to see the movies days or weeks in advance in private screenings etc. They are paid for by the studios, get free tickets and free popcorn, and/or live in some kind of an echo chamber. Hence you see the ratings crater after a week or so: the movies the article mentions are f-in garbage, so the "layman" reviews tank the ratings, from the people paying $80 for a night out at the movies and asking is it worth
They've been gaming the audience review system too (Score:5, Informative)
You can find a number of Reddit posts where people detail how they left a bad review on a certain show and Rotten Tomatoes rendered their accounts inactive, so all their reviews went offline. You have to re-register your account, and after a period of time your reviews eventually come back, but not until the show has had enough time to claim a higher audience score than it really had. I'm pretty sure a popular media site covered this a few years back as well.
I only found out about this because it happened to me so I started googling and BAM!
Re: (Score:2)
Re: (Score:3)
Was it the Lord of the Rings show?
No, for the one show it was Y: The Last Man. I was insta-banned almost immediately after my review went online (0.5 stars, and that's as someone who loved the graphic novel). The other show was either The 4400, She-Hulk, or Deputy. It would be one thing if the review was withheld on claims that people were review-bombing the content, but the message you get is about inactivity, which is the exact opposite of what's going on.
There are a lot of allegations from people who were banned that if y
Re: (Score:2)
Nonsense (Score:2)
This is where Rotten Tomatoes helps you out: you check a few critic reviews and a few audience reviews, skipping the one-liners, and look for one that matches your situation. Sorry Barbie, let's not party: https://www.rottentomatoes.com... [rottentomatoes.com]
Re: (Score:3)
I thought Barbie was A) not worth the hype but B) a fun movie worth seeing (but maybe not worth committing to memory forever) regardless.
To me it seems like a lot of people thought it was going to be a man-hating feminazi lecture in movie form, the ultimate in 'woke' propaganda. It wasn't. It seemed in the second act like they might be leaning that way, but that is not the direction they were taking by the end.
I'm just surprised Mattel found enough of a sense of humour to be OK with it.
Re: (Score:2)
I may watch it on a streaming platform but when my female colleagues tell me to wait for the streaming release, I think I'll take their advice.
Aggregate Studies of Ratings Sites (Score:4, Informative)
I can't find the original article that did a study of the aggregate averages of the movie-rating websites and identified the skew each website tended towards, but I did find newer studies and articles. I used that original data myself to adjust my own scripts' weights and cut-off values for movies and series that I wanted to watch and to download metadata for my collection.
Metacritic (purple curve) has the review ratings that most closely resemble a standard normal distribution [wikipedia.org] curve when compared against the other websites. The image that is most telling is the one below, showing the average and the relative skew that each website has.
The newer articles below are helpful, especially the first one, which shows the relative differences in the average scores for each site, along with a few of the following ones that identify the skew for each genre of movies.
FreeCodeCamp.org - Whose ratings should you trust? IMDB, Rotten Tomatoes, Metacritic, or Fandango? [freecodecamp.org]
Data Analysis Blog - The IMDb Analysis: Genres and Ratings of Movies released between 2008-2018 [wordpress.com]
UC Davis / Medium - Movie Ratings Analysis [medium.com]
The articles below have been repeating the same thing about Rotten Tomatoes, and also Fandango, for many years.
FiveThirtyEight.com - Be Suspicious Of Online Movie Ratings, Especially Fandango’s [fivethirtyeight.com]
Wired.co.uk - You should ignore film ratings on IMDb and Rotten Tomatoes [wired.co.uk]
MakeUseOf.com - IMDb vs. Rotten Tomatoes vs. Metacritic: Which Movie Ratings Site Is Best? [makeuseof.com]
TheRinger.com - Has Rotten Tomatoes Ever Truly Mattered? [theringer.com]
The image below shows the skew increasing after the site was bought out by the movie studios and put under the Fandango umbrella: a very clear indication of shenanigans after the sale, driven by the self-interest of the new owners.
Vulture.com - The Decomposition of Rotten Tomatoes [vulture.com]
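The distribution comparison this comment describes can be sketched with sample skewness, which measures how far a score distribution leans away from a symmetric, normal-ish curve. A minimal Python illustration — the score lists are invented stand-ins, not data scraped from any of the sites linked above:

```python
from statistics import mean, pstdev

def skewness(xs):
    """Fisher-Pearson coefficient of skewness (population form)."""
    m, s = mean(xs), pstdev(xs)
    return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

roughly_normal = [4, 5, 5, 6, 6, 6, 7, 7, 8]      # symmetric around 6
inflated_site  = [6, 8, 9, 9, 9, 10, 10, 10, 10]  # scores piled up at the top

print(round(skewness(roughly_normal), 3))  # 0 for a perfectly symmetric sample
print(round(skewness(inflated_site), 3))   # negative: scores bunched high,
                                           # with a long tail of low scores
```

A site whose ratings cluster near the top of the scale (as the comment alleges happened post-acquisition) shows up as strongly negative skewness, while a Metacritic-like, normal-ish distribution sits near zero.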
Review aggregation is Goodhart's Law in action. (Score:2)
In the late 90s, maybe even extending into the 00s, aggregate reviews could be considered a vaguely-valid overall metric of sorts. But as soon as aggregation became the standard - for movie reviews and elsewhere - the entire conceit collapsed in on itself.
If you're writing a review that you consciously know will be part of an aggregate rating and you rate it outside the extremes, you're effectively saying "please count my opinion less".
Re: (Score:1)
Yes. (Score:2)
This is exactly why it's so damning that it's a more reliable indicator of whether a movie is worth watching than most critics.
The same question for Slashdot (Score:2)
One has to wonder if someone's being paid to post all these dupes [slashdot.org] we've been seeing.
Sound of Freedom (Score:4, Insightful)
Everything was "fine" until a movie about combat operations against professional pedophile rings got 99% Fresh and then Hollywood threw an absolute shitfit.
Wonder why that would be!
Also, its login system sucks. (Score:2)
No more passwords; you have to click an e-mail link to log in. Sometimes it takes so long for their e-mail to arrive that the link is already outdated. Lame!
At The Movies (Score:2)
If we can guarantee movie review integrity... (Score:1)
GIF-based memes are being hacked (Score:2)
This morning, I was attempting to reply to a Facebook post with the "inconceivable" meme from The Princess Bride. Instead, all I could get were GIFs from a 2017 film with that title. It's a film I've never heard of whereas most people get the Princess Bride reference. That got me thinking that maybe studios are using GIF meme services to promote their movies and poisoning the true meme waters.