Metacritic.com is an acclaimed Web site that combines thousands of media reviews of entertainment offerings — movies, games, books and albums — into a Metascore, a sort of weighted average of critics' reviews that ranges from zero to 100. Analysis of just a small subset of the site's information shows the power of numbers to confirm — or defy — expectation.
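Metacritic's actual critic weights are proprietary, but the idea of a weighted average of 0-to-100 review scores can be sketched in a few lines. Everything here is illustrative: the function name, the sample scores, and the weights are all invented for the example.

```python
# Hypothetical sketch of a Metascore-style weighted average.
# Metacritic's real weighting scheme is not public; the weights below are invented.
def metascore(reviews):
    """Weighted average of (score, weight) pairs on a 0-100 scale."""
    total_weight = sum(weight for _, weight in reviews)
    return sum(score * weight for score, weight in reviews) / total_weight

# Three hypothetical critics: one weighted more heavily than the others.
reviews = [(90, 1.5), (70, 1.0), (40, 1.0)]
print(round(metascore(reviews)))  # prints 70
```

A heavily weighted rave pulls the result above the unweighted mean of about 67, which is the whole point of weighting by critic.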
The colored horizontal bars on this chart depict the distribution of scores given to movies in which each of the listed actors appears. The numbers inside the bars represent the average of review scores for those movies; the actors listed are the top 50 and the bottom 10, ranked by those averages. Note that the reviews are primarily from the last decade; no consideration is given to the magnitude of the actor's role; and a high average rating could indicate acting skill, the ability to pick good projects (or good trilogies), reviewer bias or just luck. To the extent that the ordering of the actors appears generally reasonable, some unexpected placements may inspire a rethinking of subjective assessments (or, in the case of Viggo Mortensen's rating above Clint Eastwood, a good long laugh).
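The per-actor averages behind the bars amount to a group-by-and-mean over an actor's filmography. A minimal sketch, using invented actors and invented Metascores rather than the chart's real data:

```python
from collections import defaultdict

# Hypothetical (actor, movie Metascore) credits; the real chart uses
# Metacritic's review data, not these made-up numbers.
credits = [
    ("Actor A", 92), ("Actor A", 84),
    ("Actor B", 88), ("Actor B", 72),
]

scores = defaultdict(list)
for actor, score in credits:
    scores[actor].append(score)

# Average score per actor, then a ranking from best to worst average.
averages = {actor: sum(s) / len(s) for actor, s in scores.items()}
ranked = sorted(averages, key=averages.get, reverse=True)
print(ranked)  # prints ['Actor A', 'Actor B']
```

Note that this treats a cameo and a lead role identically, which is exactly the caveat the text raises about the magnitude of the actor's role.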
This scatterplot shows 25 prolific movie critics in terms of the favorability with which they rate films, and the degree to which their reviews tend to agree with those of other critics, scaled to reflect the volume of reviews each has written. If you want to get a sense of the zeitgeist but can only read one review, you might prefer Rene Rodriguez, whose low standard deviation from the mean review score makes him very nearly a living critical average. If you are interested in an alternative perspective, Mick LaSalle's high standard deviation places him further from the critical pack than any of these peers. Reviews from Michael Wilmington and Marc Savlov are so consistently positive and negative, respectively, that they should perhaps be taken with a grain of salt.
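The "agreement" axis described here is a standard deviation of each critic's scores from the consensus score for the same films. One way to compute it, sketched with two invented critics (a conformist and a contrarian) rather than the chart's real data:

```python
# Hypothetical data: each critic's score paired with the mean critic score
# for the same film. The critic names and numbers are invented.
reviews = {
    "Conformist": [(70, 72), (55, 54), (88, 90)],
    "Contrarian": [(95, 60), (30, 75), (80, 50)],
}

def deviation_from_consensus(pairs):
    """Root-mean-square deviation of a critic's scores from the per-film mean."""
    return (sum((critic - mean) ** 2 for critic, mean in pairs) / len(pairs)) ** 0.5

for name, pairs in reviews.items():
    print(name, round(deviation_from_consensus(pairs), 1))
```

A low value marks a critic who tracks the pack (the Rene Rodriguez role in the chart); a high value marks an outlier (the Mick LaSalle role).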
A "smoothed" plot of movie scores over time is depicted, highlighting the expected seasonal peaks in mid-summer and at the end of the year, along with the mid-winter and early autumn doldrums. Also listed are some of the more influential movies of their eras, in terms of number of reviews, along with their mean scores. Might the poorly reviewed summer of 2002 be attributed to releases delayed in the wake of 9/11? Does the relative lack of troughs from 2003 to 2006 reflect a real or imagined streak of high-quality films?