Thumbs down for Internet film reviews?

Internet film review sites are extending their influence, but should we care what they say? John Horn of the Los Angeles Times reports.

The Disney newspaper advertisement for Up featured the kind of blurbs typically associated with a critical hit, with Leonard Maltin and The Wall Street Journal's Joe Morgenstern among the reviewers quoted.

But a closer look at the full-page ad revealed a more unusual endorsement: a 98% "fresh" rating from the website Rotten Tomatoes.

The studios are always searching for new ways to sell movie tickets, and they are now turning to review aggregators such as Rotten Tomatoes, Metacritic and newcomer Movie Review Intelligence to generate box-office buzz.

However, the sites are attracting questions about their methodologies, and about who exactly qualifies as a "film critic" in the internet age. Movie aggregators generally use professional critics, although Rotten Tomatoes includes a number of citizen reviewers who write on obscure websites.

Movie marketers say they like the sites because they can boost movie admissions.

"Are they the driver? No. Can they help drive business? Yes," says Mike Vollman, MGM's marketing chief.

"People want to know what the consensus is. I am a huge believer that in today's culture, people don't pay as much attention to individual voices as to the aggregate score."

Rotten Tomatoes is by far the most popular of the three aggregator sites, attracting about 1.8 million unique visitors monthly, according to comScore.

As soon as the site deems a movie "certified fresh" - meaning that at least 60% of the roughly 400 critics it surveys give a movie a favourable review - studio executives call Rotten Tomatoes, asking for the small trophies the website dispenses to commemorate the accomplishment.

But as rivals Metacritic and Movie Review Intelligence point out, Rotten Tomatoes can give its coveted "fresh" rating to films that any number of its counted reviewers (hypothetically, all of them) do not really love.

And although all three sites present numerical averages in their ratings, the calculations involve subjective scoring by the aggregators themselves, not just the critics.

Rotten Tomatoes' scores are based on the ratio of favourable to unfavourable reviews.

If a film lands 10 positive reviews and 10 negative reviews, in other words, it is 50% fresh, and if the ratio is 15 good to five bad, it is 75%.

But if all 20 of those critics give that same film the equivalent of a B-minus letter grade, it is 100% fresh, because all of the reviews were positive, even if only barely so.
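The arithmetic described above can be sketched in a few lines of Python. This is a minimal illustration of a Rotten Tomatoes-style "fresh" percentage, assuming each review has already been judged simply positive or negative; the function name is invented for illustration.

```python
# Sketch of a Rotten Tomatoes-style score: the share of positive reviews,
# ignoring how enthusiastic each positive review actually is.

def fresh_percentage(reviews):
    """reviews: list of booleans (True = favourable). Returns 0-100."""
    positive = sum(1 for is_positive in reviews if is_positive)
    return 100 * positive / len(reviews)

# 10 positive and 10 negative reviews -> 50% fresh
print(fresh_percentage([True] * 10 + [False] * 10))  # 50.0

# 15 good to 5 bad -> 75% fresh
print(fresh_percentage([True] * 15 + [False] * 5))   # 75.0

# 20 lukewarm B-minus reviews, all counted as positive -> 100% fresh
print(fresh_percentage([True] * 20))                 # 100.0
```

The last case is the one the article highlights: twenty barely-positive notices score exactly the same as twenty raves.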

"Our goal is the extension of thumbs-up and thumbs-down," says Shannon Ludovissy, Rotten Tomatoes' general manager.

"It's not a measure of the degree of quality."

Metacritic and Movie Review Intelligence try to come up with an average reflecting how much critics actually like a movie, rather than a ratio of raves to pans.

If a movie on those two sites gets a 50% score, it means the consensus of all the reviews the site read was 50% positive.

Like Rotten Tomatoes, Metacritic and Movie Review Intelligence assign every review they read a numerical score, a sometimes tricky endeavour because many film critics do not award letter grades or stars as part of their reviews.

And that's where the subjectivity comes in.

David Gross, a former market research and studio executive at 20th Century Fox who launched Movie Review Intelligence a month ago, says he and his staff read (or watch and listen to) reviews from about 65 top US and Canadian outlets, excluding the little-known internet critics that Rotten Tomatoes includes.

Gross says about three-quarters of the appraisals his site tracks already carry letter grades, and that two analysts from his company assign letter grades to the reviews that don't.

"If the analysts differ in their grades, we have a discussion about it," Gross says.

Those assigned grades are then translated into numerical scores - a B-plus is an 83, a C-minus rates a 42, and so on.

Rather than simply average those scores, Gross applies a weighting system based on a reviewer's circulation, with People magazine receiving the greatest weight.
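The grade-translation and weighting steps above can be sketched as follows. The only grade-to-number mappings the article gives are B-plus = 83 and C-minus = 42, so only those appear here; the outlet weights are invented for illustration, since Movie Review Intelligence does not publish them.

```python
# Sketch of a Movie Review Intelligence-style weighted average:
# letter grades become numbers, then each review is weighted by the
# outlet's circulation. GRADE_SCORES holds only the two values the
# article cites; the weights below are hypothetical.

GRADE_SCORES = {"B+": 83, "C-": 42}

def weighted_average(reviews):
    """reviews: list of (letter_grade, circulation_weight) pairs."""
    total_weight = sum(weight for _, weight in reviews)
    weighted = sum(GRADE_SCORES[grade] * weight for grade, weight in reviews)
    return weighted / total_weight

# A high-circulation B+ outweighs a low-circulation C-, pulling the
# score above the simple midpoint of 83 and 42.
reviews = [("B+", 3.0), ("C-", 1.0)]  # hypothetical outlets and weights
print(round(weighted_average(reviews), 2))  # 72.75
```

With equal weights the same two reviews would average 62.5; the circulation weighting is what moves the final number toward the bigger outlet's verdict.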

"I think the most important thing is to reflect what is going on in the market - in the real world," Gross says.

Metacritic, which was launched in 2001, uses a similar methodology to assess the 43 reviewers it surveys, about half of whom don't use stars or letter grades.

But rather than translate a review into a letter grade, the site's staff scores notices on a 0-100 scale in 10-point steps.

Whereas Movie Review Intelligence weights reviews for audience size, Metacritic tips the scales for "prestige", so high-profile critics are more important than "someone you've never heard of," says Marc Doyle, one of the founders of Metacritic.

Doyle says that when critics are consistently 75% favourable in their reviews of a movie, its Metacritic score is a 75.

But that same movie could be 100% fresh on Rotten Tomatoes.

"That's a fundamental difference," he says.

Or as Gross says: "What difference does it make if some fan boy says thumbs down to Terminator Salvation?" 
