Monday, May 31, 2010
Top Ten Reasons Why "Best of" Lists Suck
The "Best Of" List Industry continues to grind out the sausages. DownBeat magazine's recent listing of the 25 Favorite Big Band Albums is not the worst example, but it did snap this camel's back. Call it vain, call it ironic, but in an attempt to undermine this international conspiracy, I offer my own top ten list:
1. Usually there's no commentary at all to validate a listing. If there is, it doesn't.
2. Hyper-selectivity is anathema to discovery. Stumbling through a thicket of sound leads to real discovery, not following a road map.
3. Listing is something you do when you walk back to your cabin after drinking to forget that the ship is about to capsize.
4. Inclusiveness doesn't work. The more people you ask to help produce a list, the more the juice is sucked out.
5. The Net's about self-aggrandizement; no one argues with that. Can't ya be a little more subtle about it?
6. It may be possible that someone could go to a friend's house for a listening session and say "Play me your top 10 Zoot Sims records." OK, but there are so many more interesting ways to go from one side to the next that if you actually did spend the night with these 10 albums, I'd recommend seeking treatment for OCD.
7. By reifying the "experts," lists decrease, they don't increase, the flow of actual communication. Trust your friend's musical advice, not a stranger's online.
8. At least keep the list short. The larger the list, the harder it falls.
9. Getting on such lists only misleads musicians into thinking their gigs will improve.
10. You've probably stopped reading this list by now, which only goes to prove my point.