
Oakes Weekly - September 2, 2005

The Top 50 and Beer Rating
Written by Oakes

Vancouver, CANADA -

So much is made of the Top 50. It seems to function as both a magnet for newcomers – a representation of what beer can be – and a lightning rod for critics. But what is it? How does it work? Why does one page on a website with a hundred thousand pages garner so much attention?

Well, first, it’s a compilation of members’ ratings. This incorporates a lot of biases. The membership reflects a certain demographic – there is a definite skew in terms of gender, age and geography that does not necessarily correspond with the larger set of beer drinkers, or even of beer lovers. This isn’t Ratebeer’s fault. We’re the one website that actually recognizes the existence of the entire world, and we’ve built our site to reflect craft beer as a global phenomenon rather than a regional one. We’ve also done a lot to attract and retain female members. Race? We honestly don’t know what race you are if you don’t post your real picture. But some groups are more wired than others, and some are more predisposed to the specific brand of geekiness that we foster. This results in specific membership trends. Great for marketers, by the way, but biases are the natural result. Because the Top 50 is pure democracy – unadulterated mathematics – the tastes of our major demographic groups will ultimately loom large in it. So inherently, the Top 50 isn’t the “Best Beers in the World,” period, that many make it out to be. It is merely the Top 50 Favourite Beers of our particular community. These are our choices for the best beers in the world, not some definitive Best Beers in the World list. Given that such a list is literally impossible to construct, it’s actually quite flattering that so many would see our little list as the next best thing. Wholly inaccurate, but flattering nonetheless.

The inherent bias shows itself as the “big beer bias”, which in turn leads to the “American microbrew bias”, American microbrewers being the leading purveyors of big beer. The big beer bias is a key focal point for the critics. They see it, then magically construct their own little reality and swear that because the list has such a bias it cannot be a definitive list. Of course, nobody ever said it was, but critics often need to create a reality that gives them something to criticize. Taking something totally subjective like beer ratings and ascribing absolutes like “best” to it is fiction. I know you need numbers to qualify something as “best”, but not all numbers are created equal. Statistics in the sports page are absolutes. The Quad-Counties Do-Rags really did win 29 games last year. Beer ratings aren’t scientific. They can’t be. So let’s not get confused here. Just because there are numbers doesn’t make the ratings any less subjective.

This leads to the question of competency. Another fun one for critics living in their own little dream worlds. Beer rating is not scientific. You can approach it methodically, yes, but unless you rate your beers on the basis of controlled laboratory testing, it isn’t scientific. So it is always subjective. As such, no amount of standardized training, experience, or sensory sharpness will ever eliminate the impact that you, the taster, have on the rating. It’s yours. Only you saw, smelled, and tasted that beer that way. No two ratings are alike. So if you are well-trained, methodical, and have been tasting beers for years, maybe that makes your experiences more grounded, more consistent from rating to rating, and more valuable to others, but it doesn’t make them inherently better. No one’s rating is more “valid” than anyone else’s, because they are all nothing more than a recording of a personal experience.

Moreover, the entire question of rater validity is moot because Ratebeer, like every other beer rating website, doesn’t pretend to be comprised of professionals. I’ve said it before, but to some this is still a newsflash, so I’ll say it again. Professionals don’t pay the bills. Beer drinkers do. And every time a beer drinker takes a sip, they form a judgment. Joe Blow walks into a pub and orders a pint of some new micro. Even if only subconsciously, he makes notes about whether or not he likes it. Can he articulate why? Maybe, but probably not. Or at least not very well. Does that mean he won’t tell his friends about his experience? The two are simply not related. People will pass judgment and relay that judgment within their peer group regardless of whether or not they meet some industry figure’s arbitrary definition of “qualified”. This process already happens; all we do is formalize it. Yes, the ratings reach a larger audience than the musings Joe Blow makes to his friends, but the tradeoff is that Joe Blow is going to learn about beer here. He will soon be able to articulate what he likes and does not like.

So beer lovers who hang out on rating websites like big beers. That’s all the Top 50 and big beer bias mean. It looks ridiculous at first glance if you’re not in the “big flavours wow me” camp. But even the staunchest critic has to admit that it’s the big beers, the whacked-out experimental beers, the over-the-top beers that generate the hype at festivals and beer bars. Say what you want about the big beer bias, but that is a bias shared by a large portion of the craft beer community, not just Ratebeer.

If you wish to remove some of these biases, it’s actually not that hard. We have user groups who love certain types of beers and have “Favourite” lists comprised simply of ratings by members of that group. We have a customizable Top 50 to allow you to create one that looks a little more equitable to your taste. There are “best of” lists available for every style, state, province and country. You can break them down to individual brewers and raters as well. There are thousands of Top 50 lists you can create. Is it fair to suggest that a few extra mouse clicks render them less visible, and thus less vital, than the Top 50 linked from the Ratings page? Sure. But this should only matter if you’re a critic looking for something to gripe about. If you’re looking for information, we’ve got that informative list you want. The main Top 50 gets its pride of place for being the one Top 50 list with the fewest criteria. It’s the one that’s pure democracy. If that’s for better, cool. If that’s for worse, move on to a more specialized list, or tinker with the parameters of your own Top 50. You’ve got nobody to blame but yourself if you put special onus on any Top 50 list.

And remember, beer is supposed to be fun. These lists are for fun. Enjoy the content but try not to take them as some sort of ultimate truth. They’re not. They’re just the opinions of regular beer lovin’ people. The kind who like to spend cold, hard cash on beer and share their experiences. Sitting at the bar, this is a good thing. So pull up a stool, crack something tasty, and enjoy Ratebeer, Top 50 and all.


