Interesting observation about Rotten Tomatoes and film reviews

One of my favorite sites is Rotten Tomatoes, which did the Nate Silver poll-averaging thing—only for film reviews instead of political polls—long before Nate Silver became "Nate Silver." Forget what friends and family say about a film. The reviewer averages and audience ratings on Rotten Tomatoes are my go-to guide for deciding whether to risk a small fortune seeing a movie in an actual theater before it's available on the cheap via Netflix, Redbox, Hulu, and a billion other services.

That said, I've recently noticed a fascinating trend involving the site's Tomatometer and its Certified Fresh rating. (Quick digression: In case you're living in the Stone Age and think tomatoes are only for eating and pelting at bad actors, the Tomatometer is the review-averaging system the site uses. As Rotten Tomatoes states, they award their "Certified Fresh accolade to theatrical releases reviewed by 40 or more critics (including 5 Top Critics) with a steady score of 75% or higher on the Tomatometer. A film remains Certified Fresh unless its Tomatometer falls below 70%.")
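For the programmatically inclined, here's roughly what that rule looks like as code. This is just my own sketch of the policy as quoted above, in Python; the function and parameter names are mine, not anything Rotten Tomatoes actually uses.

```python
# A rough sketch of the Certified Fresh rule as quoted above. All names here
# are my own invention for illustration, not a Rotten Tomatoes API.
def certified_fresh_status(score, review_count, top_critic_count, currently_certified):
    """Return True if a theatrical release should carry the Certified Fresh seal."""
    if currently_certified:
        # A film keeps the seal unless its Tomatometer falls below 70%.
        return score >= 70
    # Otherwise it needs 40+ reviews, at least 5 from Top Critics, and a score of 75%+.
    return review_count >= 40 and top_critic_count >= 5 and score >= 75
```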

Digression now passed. On with the story.

Anyway, a few days before Man of Steel came out, it had a Tomatometer rating of 70%, just enough to keep its Certified Fresh label. For people like me who check Rotten Tomatoes ahead of a film's release, that rating seemed like a decent reason to brave overly crowded theaters.

However, as the days passed, an interesting thing happened to Man of Steel's Tomatometer rating—it fell, going from 70% shortly before its release to 56% a week later. The film is no longer Fresh and now exists as a simple rotten tomato.

This might not seem like a big deal. After all, averages can change as more diverse reviews are added to the mix. But this is a pattern I've seen repeatedly on Rotten Tomatoes with regard to big Hollywood movies: early reviews of blockbuster films, and hence the early averages, skew more positive than later reviews.

While you can't see the old averages for Man of Steel, you can sort the reviews by date. And sure enough, the earlier reviews appear to be generally more positive. A similar pattern is now playing out with Monsters University: yesterday the film's average was 85%, and today, its opening day, it is 76%.

Of course, not every blockbuster film drops so sharply. World War Z has held a fairly stable rating, going from 71% on June 19 to 68% this morning. But here's the strange part: if this movement were totally random, as you'd expect from a straightforward averaging of film reviews, we should see sharp climbs as well as sharp falls in some blockbuster averages, along with many films like World War Z that stay relatively stable. In my experience, though, blockbuster ratings either fall sharply as more reviews come in or hold steady. They rarely climb sharply.

Again, you expect some statistical noise with averages like this. But the noise should go both ways.
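To make "the noise should go both ways" concrete, here's a quick simulation. It's a toy model of my own, with made-up numbers: take a fixed pool of fresh and rotten reviews, shuffle the order they arrive in, and check whether the early average lands above or below the final score. Under random ordering, overshoots and undershoots come out roughly even, which is not what I'm seeing with blockbusters.

```python
# Toy illustration of the "noise goes both ways" point: if review order were
# random, the early Tomatometer should overshoot the final score about as often
# as it undershoots. All numbers below are made up for demonstration.
import random

def early_vs_final(final_score=0.56, total_reviews=200, early_reviews=40, trials=10_000):
    reviews = [1] * int(final_score * total_reviews)          # "fresh" reviews
    reviews += [0] * (total_reviews - len(reviews))           # "rotten" reviews
    above = below = 0
    for _ in range(trials):
        random.shuffle(reviews)
        early_avg = sum(reviews[:early_reviews]) / early_reviews
        if early_avg > final_score:
            above += 1
        elif early_avg < final_score:
            below += 1
    return above, below

print(early_vs_final())  # roughly equal counts of early overshoots and undershoots
```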

One other observation—I haven't seen this weird pattern with most non-blockbuster films. Once independent films and smaller Hollywood productions reach the threshold of 40 reviews, their ratings are relatively stable.

So what does this suggest? As Hollywood has made clear in recent years, its bottom line depends heavily on the money a film makes in its first few weeks, when studios don't have to share as much revenue with theaters. This is especially true for a blockbuster's critical first week, which helps set the film's trajectory toward success or failure. Studios have also long been known to use fake critics, to pay critics, and to favor critics who seem to all but write PR copy for them.

So I wonder if studios have realized how important an early Rotten Tomatoes ranking can be—duh—and, as a result, make sure that sympathetic critics who publish early are given advance screenings. Or, alternatively, this pattern could simply emerge from studios using sympathetic critics to push out positive reviews in the run-up to a film's release, rather than as a deliberate attempt to skew Rotten Tomatoes. Either way, while such attempts wouldn't change a blockbuster's overall ranking much in the long run, they could skew the ranking in the lead-up to a film's release.

Now again, all of this is based purely on my own observations. But the great thing about Rotten Tomatoes is that the linked reviews, and the dates they were posted, are all there to see. Perhaps someone with a statistical bent could pull the data and run an analysis. It would be fascinating to learn whether my observations hold up across all the Hollywood blockbusters of the last few years.
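If anyone wants to take a crack at it, here's the shape of the analysis I have in mind. This assumes you've already gathered the dated reviews into a CSV with columns I've invented for the sake of the sketch (film, review_date, release_date, fresh); Rotten Tomatoes doesn't publish such a dump, so the collection step is on you.

```python
# Sketch of the analysis: compare each film's pre-release Tomatometer with its
# final score. The CSV layout (film, review_date, release_date, fresh as 1/0)
# is hypothetical; you'd have to assemble it from the site's dated reviews.
import pandas as pd

def prerelease_vs_final(csv_path="reviews.csv"):
    df = pd.read_csv(csv_path, parse_dates=["review_date", "release_date"])
    rows = []
    for film, g in df.groupby("film"):
        pre = g[g["review_date"] < g["release_date"]]
        if len(pre) < 10:  # skip films with too few early reviews to say anything
            continue
        rows.append({
            "film": film,
            "prerelease_score": pre["fresh"].mean() * 100,
            "final_score": g["fresh"].mean() * 100,
        })
    out = pd.DataFrame(rows)
    out["drop"] = out["prerelease_score"] - out["final_score"]
    return out.sort_values("drop", ascending=False)

# If my hunch is right, the "drop" column should be mostly positive for big
# blockbusters and centered near zero for smaller releases.
print(prerelease_vs_final())
```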

If I'm wrong, I'll be the first to admit it. But something does appear to smell rotten in the land of blockbuster Rotten Tomatoes rankings.