Friday, January 25, 2013

Awards 2012 - Postlude

This year, 1,141 albums met the eligibility criteria: released between August 2011 and July 2012 inclusive, and reviewed by at least 4 of my sources, with at least 3 of those reviews coming from the six main sources (Gramophone, BBC Music, IRR, American Record Guide, Fanfare, and MusicWeb). Classics Today has been downgraded from main source to lesser source since they introduced subscriptions: not all of their reviews are available to those of us who already spend too much money on magazines.

What kind of consensus do we see across the reviews? I allocate scores from 1 to 5, so let's define "consensus" as a difference of no more than 1 point between the best and worst review. In that case we can say there's a consensus more often than not: 56% of the time (635 albums). If we tighten "consensus" to a difference of 0.5 (the gap between "good but with some reservations" and "good", between "good" and "very good", or between "very good" and "outstanding"), then it occurs only 22% of the time (252 albums).
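For the mechanically minded, here's a minimal sketch in Python of how that consensus test works. The albums and scores below are invented purely for illustration; the real figures come from the full 1,141-album tally, not this toy data.

```python
from typing import Dict, List

# Invented example data: album -> review scores on the 1-to-5 scale
# described above (half-point steps allowed). None of these are real albums.
reviews: Dict[str, List[float]] = {
    "Album A": [4.0, 4.5, 4.0, 5.0],  # spread 1.0: consensus by the loose test
    "Album B": [4.5, 5.0, 4.5],       # spread 0.5: consensus by the strict test
    "Album C": [2.5, 4.0, 5.0, 3.0],  # spread 2.5: no consensus at all
}

def spread(scores: List[float]) -> float:
    """Gap between an album's best and worst review."""
    return max(scores) - min(scores)

total = len(reviews)
loose = sum(1 for s in reviews.values() if spread(s) <= 1.0)
strict = sum(1 for s in reviews.values() if spread(s) <= 0.5)

print(f"Within 1 point:   {loose}/{total} ({loose / total:.0%})")
print(f"Within 0.5 point: {strict}/{total} ({strict / total:.0%})")
```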
Some 479 albums (42%) were regarded by all their reviewers as "good or better". 
34 albums (3%) were regarded by all their reviewers as at least "very good".
3 of the Award winners won despite getting a review lower than "good".
Incredibly, only 4—four!—of the longlisted albums were not regarded as "good" by any reviewer, which is to say that any given release that gets a reasonable amount of attention has a 99.6% chance of being liked by someone.
How many albums were regarded as "very good" by at least one reviewer? 990 of them, or 87%. 
But of those 990, a whopping 184 (19% of them) were regarded by at least one other reviewer as "bad".
How many albums received the highest praise from one reviewer and the lowest praise from another? 18 (1.6%).
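All of the figures above fall out of the same best-score/worst-score comparison. Continuing the sketch, and taking "good" as 4.0, "very good" as 4.5, and "outstanding" as 5.0 purely for illustration (the post doesn't pin the labels to exact numbers):

```python
from typing import Dict, List

# Same invented data as before, with one album scoring both the top
# and the bottom mark. The thresholds are illustrative placements of
# the labels on the 1-to-5 scale, not documented values.
reviews: Dict[str, List[float]] = {
    "Album A": [4.0, 4.5, 4.0, 5.0],
    "Album B": [4.5, 5.0, 4.5],
    "Album C": [1.0, 4.0, 5.0, 3.0],
}

GOOD, VERY_GOOD, TOP, BOTTOM = 4.0, 4.5, 5.0, 1.0

all_good      = [a for a, s in reviews.items() if min(s) >= GOOD]
all_very_good = [a for a, s in reviews.items() if min(s) >= VERY_GOOD]
any_very_good = [a for a, s in reviews.items() if max(s) >= VERY_GOOD]
# Highest praise from one reviewer, lowest praise from another:
extremes      = [a for a, s in reviews.items() if max(s) == TOP and min(s) == BOTTOM]

print("'good or better' from everyone:", all_good)
print("'very good' from everyone:     ", all_very_good)
print("'very good' from someone:      ", any_very_good)
print("both top and bottom marks:     ", extremes)
```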

And that is why I read lots of reviews. And that, in turn, is why we have the Nereffid's Guide Awards.
