For the average golfer, the thought of playing golf’s most celebrated courses is fun to daydream about. To that end, some golf-related publications produce “top course” lists for us to ponder while stuffed in our cubicles or reclining on our porcelain thrones.
But beyond the daydream factor, do these lists merely serve as a badge of honor for the courses they contain? And how accurate can they be? Golf Magazine and Golf Digest probably produce the most popular “top course” lists, and a look inside how they compile their rankings sheds light on some flaws in their methodology.
Let’s first look at Golf Magazine’s method. Golf Magazine uses just 100 panelists to compile the top courses for the entire world. How can this be? There are more than 30,000 golf courses in the world; how can 100 people play them all? Answer: they can’t, and they don’t. You see, Golf Magazine has already narrowed those 30,000 down to around 400. How? They’re not saying. Nevertheless, we now have 400 courses for these 100 people to play and rank. But wait – not all 100 panelists have played these 400 courses. That’s right: each panelist can only vote for courses he or she has actually played. Also, course architects and course owners on the panel (why are these people on the panel in the first place?) can’t vote for courses they’re affiliated with (is it me, or is this starting to get a little hairy?).
Meanwhile, Golf Digest’s method is even vaguer, which is a little alarming given that they produce more lists (including one for top public courses and one for best new courses). At least they have a broader panel (more than 900 men and women) and grade courses on seven criteria – but that’s about all we know. Who are these 900 people, and how are they chosen? Which courses do they play, and how is that determined?
Perhaps most intriguingly, neither magazine considers price when calculating its rankings. While it makes sense to compile a list strictly based on course quality, it would also be nice to see a “Top 100 Under $100” list, or at least one that takes greens fees into consideration (for its part, Golf Digest does sort its top 100 public tracks by price, but only 20 of them are under $100).
The reality is that most of the courses on these lists are private and/or pricey facilities that do little more than mock us – relegating our daydreams to the realm of the improbable. The likelihood of an Average Joe playing more than one or two top courses in his or her lifetime is slim to none (and if you do, you either write a book about it or spend nearly a quarter million dollars to play them all). Alas, most of us must walk Amen Corner in our minds and then return to our everyday lives.
So, considering the sketchy methodologies and the general uselessness of these lists, does any of it really matter? Sure, I guess it’s nice to daydream, or to boast that you’ve played one of the “top 100.” But for practical purposes, give me a list of courses I could start booking tee times at tomorrow.
diane says
I agree with you about the course rankings. I look at the Golf Digest rankings just to see what’s what, with no expectation of ever seeing one of those courses, much less playing it.
However, they also have an annual list of the best municipal courses by state. My regular haunt was last year’s best Muni in Ohio. In 2010 it will be some other track for no known reason.