How did the idea for GolfBlog100 come about?
Will you ever post a section that lists the participants?
What is the general ranking system based on (how is it weighted)?
Will the public section ever be completed?
The GolfBlog100 came about for two reasons. First, I wanted to put together useful information on the top courses in one place. There are quite a few bloggers who write about their travels or course reviews, and a lot of it is good stuff. But sometimes it gets lost in the shuffle as newer posts replace older posts. If I were heading to play a new course, I'd probably google the course and try to get as much information as possible. This puts all of that in one place. I've had fun researching courses, digging up scorecards and especially plotting out the routing from the aerials.
Secondly, I had a theory about how to fix the rankings and wanted some data to test it. Many moons ago, I created a ratings methodology for sorting out college basketball teams. I tried to shed some light on the flawed RPI that the NCAA selection committee was using to seed and select teams. That is the same engine I used for the GolfBlog100 data. Instead of caring about the score you give a particular course, per se, it's only concerned with how strong you consider a course relative to others you've played. Combining all of those individual comparisons, you can come up with a very good relative ranking.
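To make that concrete, here's a rough sketch of the relative-ranking idea in Python. It is not my actual engine; it just treats each ordered ballot as a set of head-to-head comparisons and fits course strengths with a standard Bradley-Terry update, so a course's position depends only on how panelists slot it against other courses they have actually played. The ballots below are made-up examples.

```python
from collections import defaultdict
from itertools import combinations

def relative_ranking(ballots, iterations=200):
    """Rank courses from ordered ballots using only pairwise comparisons.

    Each ballot is a list of course names, best first. Every pair of
    courses on the same ballot becomes one head-to-head "win" for the
    course listed higher; absolute scores never enter the picture.
    Strengths are fit with the standard Bradley-Terry iterative update.
    """
    wins = defaultdict(int)    # total pairwise wins per course
    games = defaultdict(int)   # number of comparisons per pair of courses

    for ballot in ballots:
        for better, worse in combinations(ballot, 2):
            wins[better] += 1
            games[frozenset((better, worse))] += 1

    courses = set().union(*games)
    strength = {c: 1.0 for c in courses}

    for _ in range(iterations):
        new = {}
        for c in courses:
            denom = 0.0
            for pair, n in games.items():
                if c in pair:
                    other = next(o for o in pair if o != c)
                    denom += n / (strength[c] + strength[other])
            new[c] = wins[c] / denom if denom else strength[c]
        total = sum(new.values())
        strength = {c: s / total for c, s in new.items()}  # normalize

    return sorted(courses, key=strength.get, reverse=True)

# Made-up ballots from three panelists with overlapping experience.
ballots = [
    ["Crystal Downs", "Kingsley", "Arcadia"],
    ["Kingsley", "Crystal Downs", "Arcadia"],
    ["Arcadia", "Kingsley"],
]
print(relative_ranking(ballots))   # best course first
```

Note that no absolute scores appear anywhere. A panelist who gives Arcadia a 10 and a panelist who gives it a 7 contribute exactly the same information if they order their courses the same way.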
I tried getting hold of the powers that be at the publications to talk about their ranking methodologies. I have a standing offer to run my methodology on their data free of charge so they can compare the results to their current practice, take it or leave it. I think it would come up with a ranking that is a much truer reflection of the collective view of their panelists, and I would hope that the publications have that as a basic objective.
In my day job, I have to spend a lot of time worrying about underlying assumptions, like "house prices have never declined across the country, therefore they never will." A lot of the flaws in the ranking systems can be traced back to flawed assumptions. We like to bash Golf Digest's rankings every time they come out. I don't think it's because the raters don't have a clue; I think it's because Golf Digest assumes they know the exact formula for what makes a golf course great. They intentionally don't ask for an overall ranking because they assume they have it right. The numbers cruncher in me would love to have all those categories rated WITH an overall ranking. With that, you could easily tell which categories matter and what the optimal weights would be. You could even segment golfers based on which categories they like best, and recommend courses in different areas by segment.
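As an illustration of what that would look like: with both category scores and an overall score for the same courses from the same panelist, a plain least-squares fit recovers the weights. The numbers and category names below are purely hypothetical, not data from any actual survey.

```python
import numpy as np

# Hypothetical panelist data: each row is one course evaluation.
# Category names are illustrative, not an official criteria list.
categories = ["Shot Values", "Design Variety", "Memorability", "Conditioning"]
X = np.array([
    [9.0, 8.5, 9.5, 7.0],
    [7.5, 8.0, 6.5, 9.0],
    [8.0, 7.0, 8.5, 8.0],
    [6.0, 6.5, 5.5, 9.5],
    [9.5, 9.0, 9.0, 7.5],
    [7.0, 7.5, 7.0, 8.5],
])
# The panelist's own overall score for the same six courses -- the piece
# of information the current survey never collects.
overall = np.array([9.3, 7.4, 8.1, 6.2, 9.4, 7.2])

# Least-squares fit: how much does each category actually drive the
# overall opinion?  Large weights matter, near-zero weights don't.
weights, *_ = np.linalg.lstsq(
    np.column_stack([X, np.ones(len(X))]),  # add an intercept column
    overall,
    rcond=None,
)
for name, w in zip(categories, weights):
    print(f"{name:15s} weight = {w:+.2f}")
```

Weights near zero would tell you a category isn't driving anyone's overall opinion, and running the same fit on subsets of golfers is the segmentation idea.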
I also think that relying on relative rankings would help Golf Digest as well. We like to point out Arcadia vs. Kingsley in the Michigan state rankings as a prime example. I'm willing to bet that the numbers are skewed by golfers who have played Arcadia but not Kingsley. I know many golfers who think Arcadia is the best course they've ever played. If they were a panelist, they'd certainly give it a 10 and every other course would go from there. Give me somebody who's played Arcadia, Kingsley and Crystal Downs, and if Arcadia is still a 10, then that is much more valuable information.
Golf Magazine seems to have a better set of rankings because they have a better system: rely on an expert panel and on their rankings, without set criteria, allowing each panelist to define what they think separates a good course from a great course. Some may inherently value one category over another, but in the end you are balancing it all out to get the collective view. My only potential issue is with some simplifying assumptions they use in scoring each panelist's submission.
"The points break down as follows: Each course placing in the top three on a panelist's ballot earns 100 points; spots 4-10 earn 85 points, followed by 11-25 (70 points), 26-50 (60 points), 51-75 (50 points), 76-100 (40 points), 101-150 (30 points), 151-200 (20 points), 201-250 (10 points), 251 + (0 points). Any course that received a "remove from ballot" vote has 10 points deducted."
I'm not sure how they determined that the difference between somebody's 3rd and 4th or 10th and 11th ranked courses is important, but the difference between their 4th and 10th or 11th and 25th isn't. You could easily argue that the exact opposite is true. In my methodology, every one of those data points would be viewed as equally valuable. Again, I think it's just a simplifying assumption to convert the rankings into an overall score, but there's a much better way of doing it.
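For reference, here is the quoted breakdown written out as code, which makes the cliffs easy to see. It only encodes the published point bands, nothing about how the magazine actually processes ballots.

```python
def gm_points(position, remove_vote=False):
    """Convert a panelist's ballot position into points per the quoted breakdown."""
    bands = [          # (highest position in band, points awarded)
        (3, 100), (10, 85), (25, 70), (50, 60), (75, 50),
        (100, 40), (150, 30), (200, 20), (250, 10),
    ]
    points = 0                      # 251+ earns nothing
    for upper, value in bands:
        if position <= upper:
            points = value
            break
    return points - (10 if remove_vote else 0)

# The cliffs the banding creates: 3rd vs. 4th is a 15-point swing,
# while 4th vs. 10th is worth nothing at all.
print(gm_points(3), gm_points(4))    # 100 85
print(gm_points(4), gm_points(10))   # 85 85
```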
I think Golfweek is pretty sound in that they ask for categories and a separate overall rating, but again it could be skewed based on which courses a given panelist has played. They ask the panelist to score by slotting the course into a certain range (consistent with the top 100 modern or classic, for example), whereas a relative ranking wouldn't worry about that.
In general, one thing I've discovered is that it's important to have a wide array of opinions. In the end, those various opinions will balance out and the courses that consistently score well will rise to the top. The 'best' ranking might actually be one that combines all three of the major publications, assuming you could iron out some of the methodological flaws first. Of course, people are going to complain that a course is too high or too low, but all that is really saying is that the individual has a different set of opinions than the average. We tend to get bent out of shape when somebody doesn't see a course the same way we see it, as if we have the one and only right opinion. One GB100 panelist got all riled up because a publication-consensus top 100 course made the final GB100 list while he had rated it a '2'. He obviously thought he had the right answer on that course while the other 99% of golfers were wrong.
The initial release listed the bloggers who participated in the original rankings. I have a few other contributors as well. I am always looking for more data points, so if you're well-traveled (200+ courses) and aren't a publication rater but like to keep a list, I could use you. Based on my 'Whip it Out' thread, a lot of us like to rank or categorize the courses we play.
I had hoped to publish the public list this spring, but I think I will hold off until the next update of the Top 100 U.S. in the fall and just do them all at once (international as well). With Old Mac opening in a couple of weeks, I'd like to make sure it's properly represented in the 2010 rankings.