Pitchfork’s top 200 albums of the decade (2000-2009) have been unveiled, and whatever your opinion of what was included and what was left out, one thing is certain: numbers don’t lie. Pitchfork has been reviewing albums and compiling year-end lists all decade, and the 200 albums featured on this list have been elevated to a place in the popular music canon, at least for a certain segment of the population. So let’s take a closer look at the numbers behind this list and see what we discover.
The Best Year of the Decade
The chart above shows how the top 200, top 50, and top 10 albums break down by year. 2001 comes out ahead in all three breakdowns, and the years 2000-2004 totally dominate the entire list, with 117 of the top 200 albums released in the first 5 years, 31 of the top 50, and 9 of the top 10. It only makes sense that 2009 would have the fewest showings (and none in the bottom 100); not only is the year not yet over, but the impact 2009 has on the musical history of the decade is sure to be minimal compared to the legacies of earlier years. Ditto 2008, although that year made a much stronger showing than 2003, a year which didn’t crack the top 10 and saw only 2 albums make the top 50. Ultimately, the argument over which year was best or worst can’t be settled by this list, but there is plenty of ammunition here for any side to use.
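The year counts above boil down to a simple tally at each rank cutoff. Here is a minimal sketch of that arithmetic, using a handful of invented (rank, album, year) rows rather than the real 200-album dataset:

```python
from collections import Counter

# Hypothetical sample of the P2K list as (rank, album, release_year) rows;
# these five records are invented for illustration, not the actual data.
p2k = [
    (1, "Album A", 2000),
    (2, "Album B", 2004),
    (3, "Album C", 2001),
    (14, "Album D", 2001),
    (51, "Album E", 2002),
]

def year_breakdown(albums, cutoff):
    """Count albums per release year among entries ranked within `cutoff`."""
    return Counter(year for rank, _, year in albums if rank <= cutoff)

top10 = year_breakdown(p2k, 10)
top50 = year_breakdown(p2k, 50)
top200 = year_breakdown(p2k, 200)
```

Running the same tally at each cutoff over the full list would reproduce the chart’s 117 / 31 / 9 figures for 2000-2004.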
Comparing Changes in Pitchfork’s Rankings and Lists
The P2K Top 200 Albums highlights how Pitchfork, as a publication, has changed focus over the past decade. Many albums that originally received average or poor reviews, or were never reviewed at all, appear on this list.
Obviously higher-rated albums still dominate the list, but over 20% of it is devoted to albums with average-to-low reviews. Andrew W.K.’s I Get Wet takes the prize for lowest-rated album to appear on the list; it was originally given a 0.6 in a review by Ryan Schreiber himself. Certainly there have been albums with lower ratings (there have been a few 0.0’s in the past), but going from an abysmally low original rating to a place among the top 200 albums of the decade marks a decided shift. Basement Jaxx’s Rooty takes second place for lowest original score with a 3.8, but it has the good fortune to land in the top 50 of the P2K list (it was also featured at 65 on the ‘Best of 2000-2004’ list, making it not quite as big an anomaly as the Andrew W.K. album). 10 albums on the P2K top 200 carry an original rating of less than 7.0, and none of them were released after 2003. Thus we can assume two things: the Pitchfork of 2003 would hate this list, and the Pitchfork of 2009 is substantially different in taste, tone, and quality assessment than its earlier incarnation.
Since the original reviews were the product of a single writer, and there was probably not much editorial oversight in assigning ratings, the original year-end lists provide a clearer point of comparison for the changes in Pitchfork over the past decade.
Here we see that 28% (55 albums) of the P2K top 200 did not appear on the site’s original year-end list for their respective year. This aligns with what the comparison against the original ratings showed. Just as interesting, though, is the fact that 17 former top 10 albums do not appear on the P2K list at all. 5 albums ranked #10 on their original year-end list are missing altogether, with the most recent being 2006’s The Drift by Scott Walker. The highest-ranked album to be left off the P2K list is The Lemon of Pink by the Books, the number 2 album of 2003.
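These overlap figures reduce to set comparisons between the year-end lists and the P2K list. A minimal sketch of that arithmetic, with invented album titles standing in for the real ones:

```python
# Invented data: titles from the original year-end lists vs. the P2K list.
yearend_picks = {"Album A", "Album B", "Album C", "Album D"}
p2k_albums = {"Album A", "Album C", "Album E", "Album F"}

# Year-end picks that never made the P2K top 200.
dropped = yearend_picks - p2k_albums

# P2K albums absent from their year-end list, and their share of the list.
newcomers = p2k_albums - yearend_picks
newcomer_share = len(newcomers) / len(p2k_albums)
```

With the actual data, `newcomer_share` would come out to the 28% (55 of 200) cited above.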
Pitchfork’s earlier ‘Top 100 Albums of 2000-2004’ list gives another point of comparison for assessing how this list differs from their earlier assessments:
While 44 of the top 50 albums from 2000-2004 appear on the P2K list, 42 albums that weren’t included on that list make the cut here. This makes sense considering 76 of that list’s top 100 albums appear here and the relevant years account for 117 of the 200 total albums. Only 6 top 50 albums from the 2000-2004 list aren’t featured here, which must suck for those 6 artists.
Before we move to the next category, it’s also interesting to see how Pitchfork’s ‘Best New Music’ selections ranked on this P2K list.
The ‘Best New Music’ designation was instituted in 2003 and was awarded to most, but not all, albums rated 8.3 or higher (and some 8.0-8.2 albums). 31 albums reviewed by Pitchfork from ’03-’09 appear on the P2K list despite never receiving the Best New Music designation. Even if we assume every album released before 2003 with a rating of 8.3 or higher would have received ‘Best New Music’ status, that still leaves 70 albums on the list that would not qualify. Again, some of this makes sense: Pitchfork didn’t review as many rap, pop, or dance/techno/electronica albums before 2002, and in the early years it was especially critical of otherwise-popular acts like Bright Eyes, Elliott Smith, Andrew W.K., and Daft Punk. Still, this stands as evidence that ‘Best New Music’ is not necessarily the same thing as ‘Music with the Best Staying Power’ or ‘Most Important Music.’
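The eligibility check implied above is a simple threshold filter. A sketch, assuming ‘Best New Music’ roughly tracks an 8.3 rating cutoff (the albums and scores below are invented):

```python
BNM_THRESHOLD = 8.3  # approximate cutoff described above, not an official rule

# Invented (album, original_rating) records standing in for the P2K list.
p2k_ratings = [
    ("Album A", 9.1),
    ("Album B", 7.4),
    ("Album C", 8.0),
    ("Album D", 0.6),
]

# Albums on the list that would not clear the Best New Music threshold.
non_bnm = [album for album, rating in p2k_ratings if rating < BNM_THRESHOLD]
```

Applied to the real list, a filter like this is what yields the 70 albums that wouldn’t qualify.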
Comparing Pitchfork to Other Critics
A common complaint I hear about Pitchfork is that it runs contrary to both popular and critical opinion; that its reviews, scores, and lists set the site radically apart from other critical sources. When we compare Pitchfork’s P2K top 200 albums with the critical consensus compiled on Metacritic, though, this argument starts to break down a little:
Here we see that the majority (110 albums) of the P2K top 200 were rated comparably by other critics. Only 40 albums on the P2K list differ from their Metacritic score by more than 10 points. Perhaps just as telling is the fact that, of the 200 P2K top albums of the decade, 76 appear on Metacritic’s top 200 of the decade list.
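The comparison behind these figures can be sketched as follows, assuming Pitchfork’s 0-10 scores are scaled by ten to sit on Metacritic’s 0-100 scale (the score pairs below are made up):

```python
# Invented (pitchfork_score, metacritic_score) pairs. Pitchfork rates on a
# 0-10 scale and Metacritic on 0-100, so Pitchfork scores are scaled by 10
# before comparing.
scores = [
    (8.5, 84),  # a 1-point gap: comparable
    (9.0, 75),  # a 15-point gap
    (6.4, 80),  # a 16-point gap
]

def rating_gap(pitchfork, metacritic):
    """Absolute difference expressed on Metacritic's 100-point scale."""
    return abs(pitchfork * 10 - metacritic)

divergent = [pair for pair in scores if rating_gap(*pair) > 10]
```

Counting `divergent` pairs over the full list is what produces the 40-album figure above.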
So while Pitchfork might maintain some “outsider status” by reviewing albums that slip past the mainstream critics Metacritic compiles (46 albums on the P2K list were reviewed by Pitchfork but not covered on Metacritic), only 18 of these ‘outsider’ reviews are for albums from 2004-2009. Either Pitchfork has become more mainstream or Metacritic has become more ‘outsider’; probably both are true, since indie music has grown increasingly popular over the decade and Pitchfork has increasingly given positive reviews to more mainstream music.
Conclusions
This list of the top 200 albums features some glaring omissions and some surprising inclusions, especially considering the source. More importantly, it documents a clear change in the kind of music review coverage Pitchfork has provided over the past 10 years: several albums that went unreviewed early in the site’s life place favorably in this top 200, albums that originally received negative or average reviews (or were left off their respective ‘best of the year’ lists) are included, and albums that were lauded at the time have seen steep declines. The most obvious thing we can take away from this list is that the Pitchfork of 2009 has a very different voice than the Pitchfork of earlier in the decade.
For more data, see my database of the Pitchfork Top 200 Albums of the Decade