[Disclaimer: The subsequent ranking is not an official GCA.com ranking, but all ratings were provided by members of the Discussion Group.]
Ian Linford has kindly shared the data that he collected for the construction of GCA's top 100 courses. I have re-analyzed the data using an alternative method, and I will leave it to you to decide which method you prefer. If you remember, 177 GCAers rated 412 different courses on the 0-10 Doak Scale. Ian then produced two different top 100 lists based on the average rating that each course received. The more refined method removed outlier ratings and weighted raters by the number of courses they had played. Ian's methods are very similar (as far as I can tell) to those used by the major magazines.
There are several problems with averaging ratings from different raters. First, not every rater has played every course. If some raters tend to give higher ratings than others, this can bias the rankings. Second, raters have no incentive to reveal their true preferences: they might give a course an unfairly low score because they believe that course to be overrated. In my view, the better way to construct a ranking of golf courses is through head-to-head match-ups. We can only determine that Course A is better than Course B if people who have played both courses tend to rate Course A higher than Course B. The fact that some raters think Course A is an 8 while others think Course B is a 7 tells us little about their relative quality if very few people have played both courses.
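To see how averaging can mislead, here is a toy example (the raters and scores are invented for illustration, not drawn from the actual data): a generous rater plays both courses and prefers B, while a stingy rater plays only B and drags its average down.

```python
# Hypothetical data: averaging goes wrong when raters use different scales
# and have played different courses.
ratings = {
    "Rater1": {"A": 8, "B": 9},   # generous rater, played both
    "Rater2": {"B": 6},           # stingy rater, played only B
}

def average(course):
    scores = [r[course] for r in ratings.values() if course in r]
    return sum(scores) / len(scores)

print(average("A"))  # 8.0
print(average("B"))  # 7.5 -- B averages lower than A...
# ...even though the only rater who played both preferred B (9 > 8).
```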
Therefore, I tried to construct a new ranking from the same data by looking only at head-to-head match-ups. I took all courses in the top 200 that were rated by 10 or more GCAers and computed every pairwise comparison. For each match-up, I looked at raters who had played both courses and counted how many preferred Course A over Course B and vice versa. From there I could compute the number of wins, losses, and ties that each course had against every other course on the list. The results are similar to Ian's, with some notable differences.
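For those curious how the tally works mechanically, here is a minimal sketch. It assumes the ratings are stored as a mapping from rater to {course: Doak score} (my own representation, not Ian's), and it skips the 10-rater filter for brevity.

```python
from collections import defaultdict
from itertools import combinations

def head_to_head(ratings, courses):
    """Tally each course's win/loss/tie record across all pairwise match-ups.
    `ratings` maps rater -> {course: Doak score}."""
    record = defaultdict(lambda: {"W": 0, "L": 0, "T": 0})
    for a, b in combinations(courses, 2):
        a_wins = b_wins = 0
        for scores in ratings.values():
            if a in scores and b in scores:   # only raters who played both
                if scores[a] > scores[b]:
                    a_wins += 1
                elif scores[b] > scores[a]:
                    b_wins += 1
        # The match-up goes to whichever course more shared raters preferred;
        # equal counts (including no shared raters) are recorded as a tie.
        if a_wins > b_wins:
            record[a]["W"] += 1; record[b]["L"] += 1
        elif b_wins > a_wins:
            record[b]["W"] += 1; record[a]["L"] += 1
        else:
            record[a]["T"] += 1; record[b]["T"] += 1
    return record
```

One design choice worth flagging: pairs with no shared raters land in the tie bucket here, whereas a stricter version might exclude such pairs entirely.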
Pine Valley is the clear #1, beating every other course in a head-to-head match-up, and Cypress Point is the clear #2, beating every course except Pine Valley. From there, things are a bit more complicated. Because most raters have played only a fraction of the courses, you can get cycles where Shinnecock beats Merion, Merion beats NGLA, NGLA beats Sand Hills, and Sand Hills beats Shinnecock (this actually happened). Therefore, I decided to give every course 1 point for each win and half a point for each tie. I think you could argue for other ways to break these cycles, but this seems reasonable as a first cut. Below are some of the results:
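The scoring step is then just a sort. The win/loss/tie counts below are made up for illustration (the real ones come from the pairwise tally over all courses on the list):

```python
# Hypothetical records; 1 point per win, half a point per tie.
records = {
    "Pine Valley": {"W": 19, "L": 0, "T": 0},
    "Cypress Point": {"W": 18, "L": 1, "T": 0},
    "Shinnecock Hills": {"W": 15, "L": 3, "T": 1},
}

def points(rec):
    return rec["W"] + 0.5 * rec["T"]

ranking = sorted(records, key=lambda c: points(records[c]), reverse=True)
```

Because points collapse everything to a single number, a cycle like the Shinnecock/Merion/NGLA/Sand Hills one above simply resolves to whichever course in the cycle collected the most wins overall.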
Top 20
Pine Valley Golf Club
Cypress Point Club
National Golf Links of America
Royal Melbourne (West)
Sand Hills Golf Club
Shinnecock Hills Golf Club
Merion Golf Club (East)
Royal County Down
St. Andrews (Old)
Royal Dornoch
Pacific Dunes
Oakmont Country Club
Royal Portrush
Crystal Downs Country Club
Muirfield
Ballybunion
Riviera Country Club
Augusta National Golf Club
Pebble Beach Golf Links
Prairie Dunes Country Club
Overrated (GCA Ranking / head-to-head ranking)
Royal County Down (3 / 8)
Paraparaumu Beach (43 / 92)
Ballyneal (15 / 27)
Kingsley Club (51 / 79)
Rock Creek Cattle Company (47 / 71)
Underrated
National Golf Links of America (8 / 3)
Royal Porthcawl (125 / 78)
The Golf Club (75 / 48)
Casa de Campo (Teeth of the Dog) (56 / 36)
Riviera Country Club (26 / 17)
TPC at Sawgrass (Players Stadium) (74 / 49)
Oakland Hills Country Club (South) (104 / 74)
Much to my surprise and dismay, courses like the Kingsley Club and Ballyneal come out overrated while TPC Sawgrass is underrated. This arises from the fact that GCAers who have played Kingsley and Ballyneal tend to give higher ratings to all courses, inflating the performance of these courses in the GCA Rankings. Even though these courses have high average scores, they don't win as many head-to-head matches as the GCA rankings would lead you to expect. I wouldn't read too much into NGLA and Royal County Down, since the difference between 3 and 8 in these rankings is minuscule.
Although I don't have the data, I would assume that the Golf Digest, Golf Magazine, and Golfweek rankings suffer from the same problems: the courses with the highest average ratings may not win the most head-to-head match-ups against other courses. What do you think of this method of ranking compared to the typical approach taken by others? Do you buy the argument that the best way to compare courses is to focus only on raters who have played both? How do you think the rankings in the major magazines might be biased as a result of this phenomenon?