Friday, June 22, 2007

Predicting hits in entertainment

About a month ago, Fat Knowledge posted on a study investigating how big hits in entertainment are determined. In a nutshell, the authors conclude that early buzz begets more buzz--the cumulative-advantage theory in action. The study separated participants into different "worlds" without overlap. All songs in each world started with zero downloads, but once downloading began, the popularity of various songs diverged in the different worlds, with the songs that happened to get off to a strong start tending to continuously build on that early advantage.

The results won't come as much of a surprise to marketing majors; in business schools across the country, word-of-mouth is held up as an elusive but critical form of advertising. Target the market mavens, who have disproportionate influence on making the hot stuff hot, and they'll do the legwork for you.

The lesson for aspiring musicians and actors: Put your stuff out there. Promote it on MySpace and YouTube. Take the pro bono gigs offered to you in college towns. You're not building a better mousetrap, you're attempting to entertain. You have to beat your own path.

The authors note that in a parallel universe, Madonna may be nothing while some other bunny makes it big. The entertainment industry is unique in that way, because a huge swath of people have the ability and competence, even the appearance, to do what its upper echelon (the actors and actresses up for consideration at the Oscars) does. In music, especially the Top 40 pop scene, little talent is actually required of the singer--synthesized rhythms are put to words the artist didn't write, the vocals are enhanced in editing, and concerts are lip-synced if need be.

In the realms of engineering, investing, and sports, an objective set of parameters holds the mediocre down, buoying only the best to the top. If we traveled back twenty years, reran the world, and let entropy do its thing, LeBron James would still become the NBA's best, and Warren Buffett would still be one of the world's richest men. But would Lindsay Lohan or Ashton Kutcher be the celebrities they are today? Britney Spears? It's not difficult to argue that they would not be.

While predicting the success of movies and the actors who star in them is a tall order, there is an entire class of putative experts who do quite well in opining about this stuff. Shouldn't they be able to prognosticate with some precision?

Collectively, they're not much help. Two popular rating sites, Rotten Tomatoes and MetaCritic, amalgamate the ratings and reviews of movie critics across the country to construct average rating scores. I looked at domestic box office receipts for 133 movies released in 2005 and compared the revenue they brought in with how the critics evaluated them (data via Swivel here).

MetaCritic scores correlate with receipts at .27. Squaring that, you can predict about 7% of the variance in a movie's performance from what the critics have to say about it. Rotten Tomatoes fared slightly better, at .30 and 9%, respectively. Pretty paltry. And how much of this modest predictive power comes from cumulative advantage? Praise from a professional reviewer should create some self-fulfilling effect on a movie's performance unrelated to the movie's actual merits.
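The percentage figures are just the square of the correlation coefficient. A minimal sketch of the calculation, using made-up stand-in numbers rather than the actual Swivel data:

```python
from scipy.stats import pearsonr

# Stand-in values for illustration only; NOT the actual 2005 Swivel data.
critic_scores = [61, 74, 45, 88, 52, 69, 80, 33]    # hypothetical MetaCritic scores
box_office = [42, 95, 20, 160, 35, 110, 70, 15]     # hypothetical receipts, $ millions

r, p = pearsonr(critic_scores, box_office)
print(f"correlation r = {r:.2f}")
print(f"variance explained r^2 = {r * r:.1%}")  # r = .27 implies ~7%; .30 implies ~9%
```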

Removing three titles with clear leftist agendas (Syriana, Brokeback Mountain, and Munich), mediocre box office performers that unsurprisingly garnered glowing reviews, bumped the correlations up by about .01 for both sites. Although the change is small, a realizable difference from pulling only three of 133 data points suggests that extricating less obvious political or cultural agendas would make critic reviews more useful.

What about predicting the business success of a film? Movie critics are notorious for scorning mindless action thrillers with lots of big explosions and little plot substance or character depth. In fact, the critics are scarcely better than randomly assigned ratings at predicting which movies will have the greatest return on investment, with neither site's reviews coming anywhere near statistical significance.

A surer way of estimating the likelihood of a blockbuster is to look at how much was spent making and promoting it. Budget and receipts for the same movies correlate at .63, meaning that about 40% of the variance in box office receipts (.63 squared) can be predicted by seeing how much money was plowed into the movie to begin with.
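The same squaring applies here, and the return-on-investment ratio used below is just receipts over budget. A sketch with stand-in figures, not the actual 2005 budget data:

```python
import numpy as np

# Stand-in figures in $ millions; NOT the actual 2005 budget/receipts data.
budgets = np.array([150.0, 8.0, 60.0, 110.0, 25.0, 200.0])
receipts = np.array([380.0, 77.0, 45.0, 160.0, 90.0, 420.0])

r = np.corrcoef(budgets, receipts)[0, 1]
print(f"budget-receipts correlation r = {r:.2f}")
print(f"variance explained = {r ** 2:.0%}")  # r = .63 in the real data -> ~40%

roi = receipts / budgets  # return on investment: receipts as a multiple of budget
print(f"highest ROI: film #{roi.argmax()} at {roi.max():.1f}x its budget")
```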

That doesn't do you a whole lot of good if you're a cost accountant for Miramax, but if you're a movie critic, you'll be head-and-shoulders above the rest of the pack if you simply look at how much money is spent on a movie and rate it accordingly, praising the big-dollar projects and criticizing the poor man's productions. Your opinions will be more in line with those of the moviegoing public than Ebert's are. And if you're a member of the rabble, subject to the vagaries and vicissitudes of your lemming-minded lot, you should go to the films that were the most costly, as they're likely to end up being quite popular!

An aside, of interest to Idiocracy-Watch members, is this small victory: March of the Penguins had the highest ROI of all major 2005 releases. But don't you dare get complacent, for breathing down its neck in the number two spot: Saw II.


Steve Sailer said...

Critics tend to have a half-full attitude toward low budget movies, especially documentaries, and a half-empty attitude toward big budget movies.

Critics tend to systematically underrate comedies.

Still, there is a fair amount of agreement between critics and the public on movies within various classes: e.g., that Saving Private Ryan was better than Flags of Our Fathers, or that Gladiator was better than Alexander. Movies tend to "work" or not work, rather like a good band within a musical genre is pretty clearly better than a bad band.

Fat Knowledge said...


I wonder though, how well do the fans' ratings of movies correlate with box office success? Are they much better than the critics'? Just because a movie is a box office smash doesn't mean it is a good movie (most sequels fall into this category).

I like using Yahoo Movies reviews to take a quick look at how the critics rate a movie compared to the fans. With those two rankings I can quickly get a pretty good feel for whether I will like the movie or not.

Also, how well do the fans' rankings and the critics' rankings correlate?

Good stuff.

Audacious Epigone said...

Still, there is a fair amount of agreement between critics and the public on movies within various classes...

That'll be my next foray.


MetaCritic's user reviews were even less predictive of box office success than the critics' were. But many of the movies have only a handful of user reviews. Yahoo is stocked much fuller. I'll look at it.

Just because a movie is a box office smash doesn't mean it is a good movie...

I agree, generally. But the study you cited suggests that for the most part people are like cattle, following the leaders and letting the market mavens decide what they like for them.

Also, a review's value goes beyond its analysis of the actual movie at hand. I enjoy Steve's movie reviews, even though I never go to see any of the films he writes on (the last movie I saw was Spiderman II for my brother's birthday).

Steve Sailer said...

The way to analyze this is to look at the "legs" of a movie -- its total revenue after the first weekend relative to the first weekend. That's a good measure of word of mouth. But you can only look at movies that open wide. The top legs movies of the late 1990s were "Sixth Sense" and "Something About Mary".

That measure is a little unfair to movies that opened huge and turned out to be pretty good -- the Lord of the Rings movies, the first Spiderman, etc. But it does show sleepers -- e.g., The Notebook from a few years ago was a real word-of-mouth hit.
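A minimal sketch of the "legs" measure Steve describes, reading his definition as post-opening revenue divided by opening-weekend revenue; the grosses below are illustrative, not real figures:

```python
def legs(total_gross: float, opening_weekend: float) -> float:
    """Word-of-mouth multiplier: revenue earned after the first weekend,
    relative to the first weekend itself."""
    return (total_gross - opening_weekend) / opening_weekend

# Illustrative grosses in $ millions, not real data:
print(legs(total_gross=290.0, opening_weekend=27.0))   # a sleeper: ~9.7x its opening
print(legs(total_gross=370.0, opening_weekend=150.0))  # a front-loaded opener: ~1.5x
```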