Golf Club Atlas
GolfClubAtlas.com => Golf Course Architecture Discussion Group => Topic started by: Anthony Fowler on December 27, 2010, 10:09:21 PM
-
[Disclaimer: The subsequent ranking is not an official GCA.com ranking, but all ratings were provided by members of the Discussion Group.]
Ian Linford has kindly shared the data that he collected for the construction of GCA's top 100 courses list. I have re-analyzed the data using an alternative method, and I will leave it to you to decide which method you prefer. If you remember, 177 GCAers rated 412 different courses on the 0-10 Doak Scale. Ian then provided two different top 100 lists ranked by the average rating that each course received; the more refined of the two removed outlier ratings and weighted raters by the number of courses they had played. Ian's methods are very similar (as far as I can tell) to those used by the major magazines.
There are several problems with averaging ratings from different raters. First, each rater has not played every course. If some raters tend to give higher ratings than others, this could bias the rankings. Second, raters have no incentive to report their true preferences. They might give a course an unfairly low score because they believe that course to be overrated. In my view, the better way to construct a ranking of golf courses is through head-to-head match-ups. We can only determine that Course A is better than Course B if people who have played both courses tend to rate Course A higher than Course B. The fact that some raters think that Course A is an 8 while others think that Course B is a 7 tells us little about their relative quality if very few people have played both courses.
Therefore, I tried to construct a new ranking from the same data by looking only at head-to-head match-ups. I looked at all courses in the top 200 that were rated by 10 or more GCAers and computed every head-to-head comparison. For each match-up, I looked at the raters who had played both courses and determined the number who preferred Course A over Course B and vice versa. From there I could compute the number of wins, losses, and ties that each course has against every other course on the list. The results are similar to Ian's, with some notable differences.
Pine Valley is the clear #1, beating every other course in a head-to-head matchup, and Cypress Point is the clear #2, beating every course except Pine Valley. From there, things are a bit more complicated. Because most raters have only played a fraction of the courses, it is possible to get complicated cycles where Shinnecock beats Merion, Merion beats NGLA, NGLA beats Sand Hills, and Sand Hills beats Shinnecock (this actually happened). Therefore, I decided to give every course 1 point for each win and 1/2 a point for each tie. I think you could argue for other ways to break these cycles, but this seems reasonable as a first cut. Below are some of the results:
Top 20
Pine Valley Golf Club
Cypress Point Club
National Golf Links of America
Royal Melbourne (West)
Sand Hills Golf Club
Shinnecock Hills Golf Club
Merion Golf Club (East)
Royal County Down
St. Andrews (Old)
Royal Dornoch
Pacific Dunes
Oakmont Country Club
Royal Portrush
Crystal Downs Country Club
Muirfield
Ballybunion
Riviera Country Club
Augusta National Golf Club
Pebble Beach Golf Links
Prairie Dunes Country Club
Overrated (GCA Ranking / head-to-head ranking)
Royal County Down (3 / 8 )
Paraparaumu Beach (43 / 92)
Ballyneal (15 / 27)
Kingsley Club (51 / 79)
Rock Creek Cattle Company (47 / 71)
Underrated
National Golf Links of America (8 / 3)
Royal Porthcawl (125 / 78)
The Golf Club (75 / 48)
Casa de Campo (Teeth of the Dog) (56 / 36)
Riviera Country Club (26 / 17)
TPC at Sawgrass (Players Stadium) (74 / 49)
Oakland Hills Country Club (South) (104 / 74)
Much to my surprise and dismay, courses like the Kingsley Club and Ballyneal are overrated while TPC Sawgrass is underrated. This arises from the fact that GCAers who have played Kingsley and Ballyneal tend to give higher ratings to all courses, biasing the performance of these courses in the GCA Rankings. Even though these courses have high scores, they don't win as many head-to-head matches as you would expect by looking at the GCA rankings. I wouldn't read too much into NGLA and Royal County Down, since the difference between 3 and 8 in these rankings is minuscule.
Although I don't have the data, I would assume that the Golf Digest, Golf Magazine, and Golfweek rankings suffer from the same problems. The courses with the highest average ratings may not win the most head-to-head match-ups with other courses. What do you think about this method of ranking compared to the typical approach taken by others? Do you buy the argument that the best way to compare two courses is to focus only on those raters who have played both? How do you think the rankings in the major magazines might be biased as a result of this phenomenon?
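For anyone who wants to see mechanically how the scoring described above works (1 point per head-to-head win, half a point per tie, counted only among raters who played both courses), here is a minimal sketch in Python. The data structure, the function name, and the treatment of pairs with no common raters (counted as a tie) are illustrative assumptions, and the top-200 / minimum-ratings filter is omitted; this is not the code actually used for the list.

```python
from collections import defaultdict
from itertools import combinations

def copeland_points(ratings, courses):
    """Hypothetical helper: 1 point per head-to-head win, 1/2 per tie.

    ratings: {rater: {course: doak_score}} for the courses each rater played.
    courses: the list of courses being ranked.
    """
    points = defaultdict(float)
    for a, b in combinations(courses, 2):
        a_votes = b_votes = 0
        for scores in ratings.values():
            if a in scores and b in scores:      # only raters who played both
                if scores[a] > scores[b]:
                    a_votes += 1
                elif scores[b] > scores[a]:
                    b_votes += 1                 # equal scores count as abstentions
        if a_votes > b_votes:
            points[a] += 1.0
        elif b_votes > a_votes:
            points[b] += 1.0
        else:                                    # tied vote (or no common raters)
            points[a] += 0.5
            points[b] += 0.5
    return sorted(points.items(), key=lambda kv: -kv[1])
```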
-
Anthony:
Your description of your methodology has me a bit confused. Do you mean your rankings are based on a winning percentage (wins/losses/ties) from each course's record against the rest of the top 200?
I have never seen a poll done that way, and am trying to visualize the results. I am thinking that the change in results for those newer courses is that one or two relatively poor votes from the most well-traveled panelists would put a lot of losses on their ledger, and drag them abruptly down the list. Perhaps that's not at all unfair -- indeed, the rankings of the courses you named are STILL much higher in the GCA poll than in any other -- but it does give the most-traveled panelists a lot of influence, pro or con.
-
Anthony,
I think the head-to-head information is where the value lies in this type of rating exercise, and that's the logic behind my GolfBlog100 ratings collaboration. I do wonder how you compile the results to reach your final conclusions, however. Perhaps I should get the same data from Ian and run it through my engine.
Jim
-
but it does give the most-traveled panelists a lot of influence, pro or con.
Not to mention the petty vindictiveness that has reared its ugly head from members of this forum.
-
Tom and Jim: I should have been clearer about the methodology. For each pair of courses, I determined which one would win a head-to-head vote among all raters who rated both courses. Therefore, one individual can only influence the ranking if their vote was pivotal, meaning that a course was previously tied with or within 1 vote of another course. Since there aren't many cases of this, 1 voter is unlikely to sway the results dramatically. As you've pointed out, though, this method implicitly gives more weight to the more traveled raters because they effectively cast more head-to-head votes. I think averaging is actually much more sensitive to the leanings of one particular voter. For example, only 26 of the 177 GCAers rated Royal Melbourne (West). If one person switched their rating from a 10 to a 0, the course's average would fall from 9.8 to 9.4, dropping it from 3rd to 12th. What's worse, that rater may have never played any other top 25 course and therefore has no ability to determine whether Royal Melbourne belongs in that category.
Just for fun, I went back to the data and imagined that one of the most traveled raters (277 of the 412 courses) had switched their rating of Ballyneal from a 9 to a 0. Under the averaging system, Ballyneal would have fallen from 15th to 21st. Under my head-to-head system, it would have fallen from 27th to 38th. These seem like fairly comparable drops. However, if an inexperienced rater had given Ballyneal a 0, it would have the same effect on the average rankings but no effect whatsoever on the head-to-head ranking. I think it's reasonable to give more weight to the more experienced raters, and I think the head-to-head system has better safeguards against one rater sabotaging a course's chances.
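As a quick back-of-the-envelope check of the Royal Melbourne arithmetic above (a sketch using only the numbers quoted in the post):

```python
# 26 raters averaging 9.8, with one rater's 10 changed to a 0
# (illustrative numbers taken from the post above).
n, avg = 26, 9.8
new_avg = (n * avg - 10 + 0) / n
print(round(new_avg, 1))  # 9.4 - one flipped vote moves the average a long way

# Under the head-to-head system, the same flip only matters in match-ups where
# that rater's vote is pivotal, i.e. the two courses were tied or within one
# vote of each other among the raters who had played both.
```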
-
Hi Anthony,
Did you only use your method of assigning 1 pt per win and 1/2 pt per tie when courses violated the transitive property (A > B > C > A)? Otherwise it seems like head-to-head wins against weak courses would count the same as wins against great courses.
-
Hi Ian,
I used this point method throughout. However, if there is a Condorcet winner, a course that wins its head-to-head match-up against every other course, then it will necessarily have the most points. More generally, when the head-to-head results are fully transitive (no cycles), the point totals reproduce the true head-to-head ordering. Unfortunately, due to the large number of courses and the small number of ratings per voter, most courses are in some sort of cycle.
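To make the Condorcet-winner and cycle language concrete, here is a small illustrative sketch, not the code actually used: it assumes the pairwise results have already been reduced to a set of (winner, loser) pairs.

```python
def condorcet_winner(wins, courses):
    """Return the course that beats every other course head-to-head, if any.

    wins: set of (a, b) pairs meaning course a won the head-to-head vote over
    course b (a hypothetical pre-computed structure, not the raw ratings).
    """
    for c in courses:
        if all((c, other) in wins for other in courses if other != c):
            return c
    return None

def has_cycle(wins, courses):
    """Detect an intransitive cycle (A beats B beats ... beats A) via DFS."""
    beats = {c: [b for (a, b) in wins if a == c] for c in courses}
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {c: WHITE for c in courses}

    def dfs(c):
        color[c] = GRAY
        for nxt in beats[c]:
            if color[nxt] == GRAY:               # back edge: cycle found
                return True
            if color[nxt] == WHITE and dfs(nxt):
                return True
        color[c] = BLACK
        return False

    return any(dfs(c) for c in courses if color[c] == WHITE)
```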
For those interested, here is the complete top 100:
1 Pine Valley Golf Club
2 Cypress Point Club
3 National Golf Links of America
4 Royal Melbourne (West)
T5 Sand Hills Golf Club
T5 Shinnecock Hills Golf Club
T7 Merion Golf Club (East)
T7 Royal County Down
T7 St. Andrews (Old)
10 Royal Dornoch
11 Pacific Dunes
12 Oakmont Country Club
13 Royal Portrush
14 Crystal Downs Country Club
15 Muirfield
16 Ballybunion
T17 Riviera Country Club
T17 Augusta National Golf Club
T19 Pebble Beach Golf Links
T19 Prairie Dunes Country Club
21 San Francisco Golf Club
22 Kingston Heath
23 Seminole Golf Club
24 Winged Foot Golf Club (West)
T25 Royal St. George's
T25 Pinehurst No. 2
27 Ballyneal
28 Fishers Island Golf Club
T29 Friar’s Head
T29 Barnbougle Dunes
31 Sunningdale (Old)
32 Rye
33 Turnberry (Ailsa)
34 Cape Kidnappers, N.Z.
35 Highlands Links
T36 Casa de Campo (Teeth of the Dog)
T36 Chicago Golf Club
38 The Country Club (Composite)
39 New South Wales GC
40 Los Angeles Country Club (North)
T41 Swinley Forest
T41 Woodhall Spa (Hotchkin)
43 Bethpage State Park (Black)
44 Sebonack Golf Club
45 Garden City Golf Club
T46 North Berwick
T46 Lahinch
48 The Golf Club
49 TPC at Sawgrass (Players Stadium)
50 Royal Birkdale
51 St. George's Hill
52 Yale University Golf Course
53 Ocean Course at Kiawah Island
T54 Carnoustie (Championship)
T54 St. Enodoc
T54 St. George's (Canada)
57 Olympic Club (Lake)
58 Cruden Bay
59 Prestwick
60 Myopia Hunt Club
61 Royal Lytham & St. Annes
62 Ganton
63 Portmarnock (Old)
64 Holston Hills Country Club
65 Royal Liverpool
66 Sunningdale (New)
67 Walton Heath (Old)
68 Pasatiempo Golf Club
T69 Machrihanish
T69 Royal Melbourne (East)
T69 Rock Creek Cattle Company
T72 Mid Ocean
T72 Valley Club of Montecito
T74 Oakland Hills Country Club (South)
T74 Wannamoisett Country Club
76 Pete Dye Golf Club
T77 Bandon Trails
T78 Royal Porthcawl
79 Kingsley Club
80 Plainfield Country Club
81 Royal Troon
T82 Bandon Dunes
T82 Camargo Club
84 Quaker Ridge Golf Club
85 Shoreacres Golf Club
86 Yeamans Hall Club
87 Monterey Peninsula Country Club (Shore)
T88 Maidstone Club
T88 Royal West Norfolk
90 Southern Hills Country Club
91 Banff Springs
92 Paraparaumu Beach
T93 Western Gailes
T93 Jasper Park
T95 Gleneagles King's
T95 Somerset Hills Country Club
T97 Notts (Hollinwell)
T97 Pennard
99 Whistling Straits (Straits)
100 Desert Forest Golf Club
-
I haven't done the count, but this list appears to have many more non-US courses than Ian's list. Anthony, is that the case? If so, any reason why?
-
so...who pisses the farthest again?
-
Hi George,
You are correct, thank you for pointing this out. Non-US courses do slightly better in my list than in the previous lists. On average, non-US courses are almost a full ranking point higher on my list than Ian's. This arises from the fact that raters who have played mostly American courses (Americans presumably) give more generous ratings. However, these US courses do worse than previously expected in head-to-head match-ups against international courses among those who have played both. I would expect this US bias to be even greater for Golf Digest, Golf Mag, and Golfweek than it is for GCAers.
-
Anthony:
I don't think the U.S. bias is worse for GOLF Magazine's list than for this one, based on past results. GOLF Magazine listed most of the international courses shown above at least once before you came around. As for the ones it didn't list, it might not be "bias" but instead common sense to keep them off.
However, overall, I think that your method is an improvement on the magazines' method, because you are only doing what the panelists themselves do -- comparing the courses head to head, and listing them in the order you think appropriate. That's what rankings really are -- they are comparative, not absolute. Some methods declare that they are ranking the "greatest" courses, but there is no magic formula for greatness upon which everyone agrees, so the results are often skewed from what the panelists themselves actually think.
In the end, though, a ranking is only as good as the judgment of the people who vote on it.
-
Anthony,
Very interesting work. It certainly makes me think more. Good job
-
A couple of guys have made interesting comments about how a well-traveled rater who's played many of the courses on the list can skew the win-loss record of a particular course, particularly if they have a bias against it. This is the nature of the beast, though, when dealing with any sort of subjective judging. I tend to give courses a second chance when they are highly regarded and I don't necessarily get it. In some cases, the second visit changes my view, but often it only reinforces it.
I remind everyone that the golf course ratings/rankings game has a unique dynamic that the last sentence emphasizes. Most course ratings, and hence their rankings, are based on a one-shot visit by the rater. This tends to favor an eye-catching, dazzling course over a more strategic course (National Golf Links is the ultimate example) where you cannot possibly catch all of the features in a single visit or even several visits. IMHO, the truly great courses get better with each successive play, whereas the lesser courses tend to diminish.
Personally, I think the ratings process (particularly Digest's) would be better if the panel were smaller and better traveled, having seen a breadth of courses so that they can compare a Black Rock to Seminole, a LACC North to Yeamans Hall, a Shinnecock to a Eugene CC, etc. I played with a fellow GCAer at a Top 100 course with a magazine rater, and it was amazing to us that he had hardly seen anything outside his home state. As they say on ESPN's Monday NFL Countdown, "C'mon Man!!!"
The other point of note is that each rating system I've come across has its own inherent bias, just based on how the courses are rated. If you want to do an honest assessment, give each course you've played a 1-10 score like Doak does, then rate like Golf Magazine does, which wants you to place a course in its relative standing in the world, then use the individual component scoring like Digest, then do a hybrid 1-10 like Golfweek. If you score honestly, your rankings will be different using each system, particularly Digest's.
-
Personally, I think the ratings process (particularly Digest's) would be better if the panel were smaller and better traveled, having seen a breadth of courses so that they can compare a Black Rock to Seminole, a LACC North to Yeamans Hall, a Shinnecock to a Eugene CC, etc.
How you choose your panel would probably influence the ratings more than any other single factor.
All ratings/rankings are flawed. Pick the one whose flaws suit you best! :)
-
Adam
I guess this idea of head-to-head doesn't sound good to me because a lot of the time there is nothing to choose between courses - it's a draw. Then I am forced to vote for my preference if I don't like the concept of courses being very different but essentially equal. This is how a solid course like Machrihanish can climb well up the table - the top of the solids, if you like - even though it's not really a #69 course in the world as this recent table suggests. I look at Pennard and Brora (which I don't see anywhere) and think these three should be standing shoulder to shoulder in any ranking - yet there seems to be quite a disparity, and I would suggest it comes down to preference rather than quality of golf.
Ciao
-
A few years ago, one of the big golf mags, I think Golf, used to give a short bio of the testers in its annual equipment issue and include remarks and observations from a few of them on each club. You could look for the guy whose game most resembled yours and then go by his comments/ratings.
That's about the only way to do rankings where the results are meaningful, to me anyway. On here, I look for certain guys whose tastes match mine and choose accordingly. It does not meet any requirements for well-rounded panels or anything silly like that, but it works perfectly for me.
Utilizing this system, I will never be accused of having done the heavy lifting necessary to contribute to worthless threads or rankings, but I will have a lot of fun!
-
The concept of having fewer people do the ratings would be helpful from a quality-control standpoint, but it still leaves unsaid how such people are determined and just how many sites / courses they can get to on an annual basis. Frankly, too many raters are simply homers for their own "neck of the woods."
I do take issue with the idea that ratings are meaningless -- they can be very useful but as Tom D mentioned it all comes down to the judgement of the people doing the reviewing.
-
Sean: Pennard is T97 in the list above, definitely in the same range as Machrihanish. Also, there is nothing wrong with raters saying that two courses are about equal. If the same rater gave Pennard and Machrihanish the same score on a 0 to 10 scale, they're saying that the two are roughly tied in their mind, which is fine with me.
Charlie: I do not believe that this data set has updated ratings, but it would be great to have a way for people to continually update. For one, we could keep the ratings current. Also, we could do something that George would like: find out which raters are similar and show each participant the courses that are highly rated by others with similar preferences.
George: See my other thread on the factors that separate us. The whole goal of that analysis is to try to develop a personalized list of courses based on the ratings of others who are similar to you. If you participated in the rating, I would be happy to help you figure out where you stand and which courses you personally would prefer more than the average GCA rater does.
-
Anthony--
That's an interesting idea, comparing your personal scores used for the ranking versus the average of the group.
George P--
No question that how raters are selected is important, but if you take it as a given that raters' opinions are honest and not skewed by outside factors, then I still think the magazine's rating system matters more. Take the opportunity to score courses using Golf Digest's method and you'll see what I mean. This is not a criticism of how Digest does it, just saying that it plays to certain strengths. Doak makes note of this in the Confidential Guide.
Sean--
When two courses have a similar score in my mind, then you break the tie via personal preferences. That's the spice of life. You should base your ranking of golf courses on what you like, and that's great. Golf course architecture is a large world, and there's room for many different types of courses.
-
Adam
I disagree. Great architecture imo has little to do with what I like. Sure, there is some crossover, but my preferences and what is great are two separate concepts. If I am talking only about the merits of a design, then I try to keep my preferences out of it because they are quite defined, but not necessarily limiting. Additionally, I am on much softer ground when talking about great architecture because I don't have any experience with the subject except from a user PoV - which is quite limiting.
Anthony
That was my point. What is wrong with saying two or ten courses are about equal and leaving it at that? To me, the actual number assignment is misleading and unnecessary.
Ciao
-
Anthony, I'm not sure I understand your methodology. You looked at situations where raters had played two courses on the list. You compared the Doak score of each rater, for both courses. If more raters gave higher Doak scores to one course, that course got a 1. If it was a tie -- i.e. an equal number of raters gave a higher Doak score to each course -- both courses got a 1/2.
Then you added up the total number of points for each course. You rank the courses in order of which ones have the most points, with the highest points being first, 2nd highest being second, and so forth. Is that right, or am I missing something?
-
Jim: I think you have it right. For every possible pair of courses, I looked only at the raters who had played both. Then I allocated 1 point between the two courses just as you described.
There were 177 courses that were in the top 200 and had more than 10 ratings. Therefore, each course could earn as many as 176 points (which Pine Valley did) or as few as 0. To say that Pine Valley beat every other course in a head-to-head match-up is much more meaningful than saying that it had the highest average rating among a list of raters who each played a different subset of courses.
-
One immediate point is that the raters didn't really do head-to-head comparisons. They gave Doak points, which you use as a proxy for head-to-head. If they had actually done head-to-head rankings, I wonder if the results might have differed a bit?
-
It would be interesting to do a head to head bracket for say the top 256 courses. Not sure what it all means, but it'd be a fun exercise nevertheless...
-
Anthony, can you post some of the actual scores. e.g. how did the PV/CPC "matchup" turn out?
-
Jim: I think you have it right. For every possible pair of courses, I looked only at the raters who had played both. Then I allocated 1 point between the two courses just as you described.
There were 177 courses that were in the top 200 and had more than 10 ratings. Therefore, each course could earn as many as 176 points (which Pine Valley did) or as few as 0. To say that Pine Valley beat every other course in a head-to-head match-up is much more meaningful than saying that it had the highest average rating among a list of raters who each played a different subset of courses.
Anthony:
Now I'm confused again as to how you did this.
I presumed what you had done was to compare individuals' votes on the top 200. For example, on my own ballot, Pine Valley would have beaten nearly all of the courses, but tied with a few others [including Cypress Point] ... hence it would not have a "perfect" score. And are you really saying that NO ONE rated any course higher than Pine Valley? That's a level of group-think that I have never seen in any ranking process.
-
Hi Tom,
I'm sorry about the confusion. It's not the case that every rater prefers Pine Valley to every other course. Rather, if we took a vote between Pine Valley and Course X for all raters who had played both Pine Valley and X, Pine Valley would win for any of the 176 other courses in my analysis. We're counting a vote for X over Y as giving X a higher score on the Doak Scale than Y, and we're calling it a tie (abstention vote due to indifference) if a rater gave them the same score. I don't think this is a problem, because I don't think raters should worry about splitting hairs between two 10s.
Jim: Your point is correct that raters did not cast head-to-head votes. I would be really interested to know whether the results would be different had this been the case. I pushed for this when Ian first proposed his ranking project, but the idea was not adopted. Nonetheless, it would be rather strange if someone gave Pinehurst a 9 and Wannamoisett an 8, but then said that they prefer Wannamoisett over Pinehurst head-to-head. It's certainly possible, but I wouldn't put much stock in that person's ratings in that scenario.
-
Anthony, can you post some of the actual scores. e.g. how did the PV/CPC "matchup" turn out?
Of course. The Pine Valley/Cypress Point matchup turns out to be fairly uninteresting, since only 28 people rated both courses and none of them gave either course less than a 9. Remember that you should not read too much into the difference between any two courses that are close to each other on the list. 23 raters gave them the same score, 4 preferred PV, and 1 preferred CP.
Here are some of the popular public matchups
Pacific Dunes vs. Pebble Beach: 22 indifferent, 32 for Pacific, 14 for Pebble
Pacific Dunes vs. St. Andrews: 30 indifferent, 19 for St. Andrews, 8 for Pacific
Pacific Dunes vs. Pinehurst No. 2: 20 indifferent, 33 for Pacific, 5 for Pinehurst
Pebble Beach vs. St. Andrews: 20 indifferent, 40 for St. Andrews, 7 for Pebble
Pebble Beach vs. Pinehurst: 30 indifferent, 14 for Pinehurst, 22 for Pebble
St. Andrews vs. Pinehurst: 10 indifferent, 13 for St. Andrews, 4 for Pinehurst
What about some regional matchups?
Pacific Dunes vs. Bandon Dunes: 10 indifferent, 77 for Pacific, 1 for Bandon
Ballyneal vs. Sand Hills: 10 indifferent, 4 for Ballyneal, 19 for Sand Hills
Whistling Straits vs. Blackwolf Run: 11 indifferent, 6 for Blackwolf, 19 for Whistling
Shinnecock vs. NGLA: 23 indifferent, 9 for NGLA, 8 for Shinny
Cypress vs. Pebble: 13 indifferent, 2 for Pebble, 39 for Cypress
Pebble vs. Spyglass: 23 indifferent, 15 for Spyglass, 37 for Pebble
Crystal Downs vs. Kingsley Club: 9 indifferent, 7 for Kingsley, 14 for Crystal Downs
Riviera vs. LACC: 10 indifferent, 3 for LACC, 14 for Riviera
Olympic vs. SFGC: 11 indifferent, 2 for Olympic, 25 for SFGC
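For what it's worth, tallies like the ones above are straightforward to reproduce from the raw scores. A sketch, assuming the same hypothetical {rater: {course: score}} structure as in the earlier example:

```python
def matchup(ratings, course_a, course_b):
    """Tally one head-to-head match-up among raters who played both courses."""
    indifferent = for_a = for_b = 0
    for scores in ratings.values():
        if course_a in scores and course_b in scores:
            if scores[course_a] > scores[course_b]:
                for_a += 1
            elif scores[course_b] > scores[course_a]:
                for_b += 1
            else:
                indifferent += 1
    return indifferent, for_a, for_b

# e.g. matchup(ratings, "Pacific Dunes", "Pebble Beach Golf Links") would
# return something like (22, 32, 14) with data of the kind described above.
```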
-
Tom Doak, I think PV wins over CPC because more people gave it higher Doak scores. Majority rules, in other words.
Or "sort-of" majority. One problem with the match-ups is the small numbers. e.g. PV/CPC came down to five people. While I'm not a statistician, my reaction is that was more a tie than a clear-cut win for PV. Almost everyone who played both rated them a 10. Same with Shinnie/NGLA. IMO we should count that as a tie, not a win for NGLA.
Anthony, the main reason I see head-to-head matches might yield different results is that the Doak scale is not too precise. Ten = ten, even if we prefer one course. But say we could give scores of 9.9, 9.8, and so on. Lots of ties would get unlocked. A true head-to-head matchup might overcome this problem and show people's real preferences.
Btw, very cool idea and analysis, with lots of interesting results. I'm still not sure it's more accurate than simply averaging the raw scores, though. Want to give this a bit more thought.
IMO the statistically-massaged version is far worse than either yours or the raw arithmetic mean.
-
I love the head-to-head idea - the problem is that you need both a statistically significant sample of people who've played both courses, at least 20 and more likely 30, and a meaningful win margin of, say, 2 or 3 guys (i.e. not 15-14, etc.). The problem with rating courses on specific criteria (i.e. the walk-in-the-park factor, resistance to scoring, etc.) is that nobody, even here, can agree on exactly what the criteria should be...
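One way to formalize the point about 15-14 not being a meaningful win is a two-sided sign test on the non-indifferent voters. This is only a sketch of one possible check, not anything the rankings above actually used; the 32-14 figures come from the Pacific Dunes/Pebble Beach match-up posted earlier.

```python
from math import comb

def sign_test_p(wins_a, wins_b):
    """Two-sided sign test: chance of a split at least this lopsided if the
    two courses were truly equal (ties/abstentions are excluded)."""
    n = wins_a + wins_b
    k = max(wins_a, wins_b)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n  # P(X >= k), X ~ Bin(n, 1/2)
    return min(1.0, 2 * tail)

print(sign_test_p(15, 14))  # 1.0 - a 15-14 split is indistinguishable from a tie
print(sign_test_p(32, 14))  # ~0.01 - the 32-14 margin looks like a real difference
```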
-
Anthony,
I ran my approach looking at all of the individual head-to-head match-ups (there were around 800,000 of them). My results are generally the same as yours, although I have Cypress Point #1 and Pine Valley #2. I also ran one that looked at the score differential between the two and the results were largely the same (Merion jumped to #4 in that one).
I used 5 ratings as my minimum cut-off. Morfontaine (36) and Hamilton (52) come in with only 6 and 8 ratings, respectively, but 3 of those were from panelists with over 275 courses rated, so you are relying on their expert judgment for the most part.
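Jim doesn't spell out how the score-differential run was computed, so the sketch below is only one plausible interpretation, reusing the hypothetical {rater: {course: score}} structure from the earlier examples: each pair of courses is settled by the average Doak-score margin among shared raters rather than a flat win/tie/loss.

```python
from collections import defaultdict
from itertools import combinations

def margin_points(ratings, courses):
    """One guess at a 'score differential' variant: each pair of courses
    exchanges points equal to the average Doak-score margin among the
    raters who played both, instead of a flat 1 / 0.5 / 0 per match-up."""
    totals = defaultdict(float)
    for a, b in combinations(courses, 2):
        diffs = [s[a] - s[b] for s in ratings.values() if a in s and b in s]
        if diffs:
            margin = sum(diffs) / len(diffs)
            totals[a] += margin
            totals[b] -= margin
    return sorted(totals.items(), key=lambda kv: -kv[1])
```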
Anthony's head-to-head rating in ()
1 Cypress Point Club (2)
2 Pine Valley Golf Club (1)
3 Royal Melbourne (West) (4)
4 Sand Hills Golf Club (T5)
5 Shinnecock Hills Golf Club (T5)
6 National Golf Links of America (3)
7 Royal County Down (T7)
8 Merion Golf Club (East) (T7)
9 St. Andrews (Old) (T7)
10 Royal Dornoch (10)
11 Oakmont Country Club (12)
12 Pacific Dunes (11)
13 Crystal Downs Country Club (14)
14 Royal Portrush (13)
15 Muirfield (15)
16 Kingston Heath (22)
17 Ballybunion (16)
18 Augusta National Golf Club (T17)
19 Riviera Country Club (T17)
20 Seminole Golf Club (23)
21 Prairie Dunes Country Club (T19)
22 Pebble Beach Golf Links (T19)
23 San Francisco Golf Club (21)
24 Fishers Island Golf Club (28)
25 Winged Foot Golf Club (West) (24)
26 Ballyneal (27)
27 Royal St. George's (T25)
28 Friar’s Head (T29)
29 Pinehurst No. 2 (T25)
30 Barnbougle Dunes (T29)
31 Sunningdale (Old) (31)
32 Highlands Links (35)
33 Cape Kidnappers, N.Z. (34)
34 Rye (32)
35 Turnberry (Ailsa) (33)
36 Morfontaine (NR)
37 New South Wales GC (39)
38 Chicago Golf Club (T36)
39 The Country Club (Composite) (38)
40 Sebonack Golf Club (44)
41 St. George's Hill (51)
42 Swinley Forest (T41)
43 Los Angeles Country Club (North) (40)
44 Royal Birkdale (50)
45 Lahinch (T46)
46 Casa de Campo (Teeth of the Dog) (T36)
47 St. George's (Canada) (T54)
48 Bethpage State Park (Black) (43)
49 Garden City Golf Club (45)
50 St. Enodoc (T54)
51 Woodhall Spa (Hotchkin) (T41)
52 Hamilton (NR)
53 North Berwick (T46)
54 The Golf Club (48)
55 TPC at Sawgrass (Players Stadium) (49)
56 Ocean Course at Kiawah Island (53)
57 Yale University Golf Course (52)
58 Cruden Bay (58)
59 Holston Hills Country Club (64)
60 Carnoustie (Championship) (T54)
61 Myopia Hunt Club (60)
62 Valley Club of Montecito (T72)
63 Portmarnock (Old) (63)
64 Rock Creek Cattle Company (T69)
65 Mid Ocean (T72)
66 Olympic Club (Lake) (57)
67 Ganton (62)
68 Wannamoisett Country Club (T74)
69 Royal Melbourne (East) (T69)
70 Pete Dye Golf Club (76)
71 Pasatiempo Golf Club (68)
72 Royal Lytham & St. Annes (61)
73 Prestwick (59)
74 Machrihanish (T69)
75 Plainfield Country Club (80)
76 Royal Liverpool (65)
77 Sunningdale (New) (66)
78 Boston Golf Club (NR)
79 Bandon Trails (T77)
80 Walton Heath (Old) (67)
81 Royal Porthcawl (T78)
82 Kingsley Club (79)
83 Bandon Dunes (T82)
84 Paraparaumu Beach (92)
85 Monterey Peninsula Country Club (Shore) (87)
86 Shoreacres Golf Club (85)
87 Oakland Hills Country Club (South) (T74)
88 Southern Hills Country Club (90)
89 Camargo Club (T82)
90 Royal Troon (81)
91 Royal West Norfolk (T88)
92 Maidstone Club (T88)
93 Yeamans Hall Club (86)
94 Quaker Ridge Golf Club (84)
95 Western Gailes (T93)
96 Pronghorn (Fazio) (NR)
97 Whistling Straits (Straits) (99)
98 Pennard (T97)
99 Banff Springs (91)
100 Somerset Hills Country Club (T95)
-
Jim: Thank you for all the work. Check your messages, and we'll see if we can resolve some of the discrepancies.
-
I love the head-to-head idea - the problem is that you need both a statistically significant sample of people who've played both courses, at least 20 and more likely 30, and a meaningful win margin of, say, 2 or 3 guys (i.e. not 15-14, etc.). The problem with rating courses on specific criteria (i.e. the walk-in-the-park factor, resistance to scoring, etc.) is that nobody, even here, can agree on exactly what the criteria should be...
Jud,
I agree about the risks around getting a statistically significant sample size. Yes, you may need more raters to get the number of head-to-heads you want (vs. the traditional "aggregate score" methods), but I think it is worth the effort because the methodology is superior.
In "aggregate ranking" listings, the risk is whether ALL the raters can maintain a constancy relative to the proscribed scale (which is practically impossible - my 8 may be your 6). However, what I really like about the "head-to-head" methodology is that it makes the individual rater the Constant, which is going to get a more honest result for comparative ranking.
I really like the thought behind this methodology - it's not a perfect system, but I think it's an advancement vs. traditional methods. Great Job Anthony!!
Ultimately, all quantitative rankings have to be taken with an acknowledgement that it's difficult to quantify something that is inherently qualitative. Numerical rankings will never replace descriptive critiques like The Confidential Guide. But even in that book, you almost have to treat the numbers as an afterthought. There are times when I've read two different reviews by Doak and been shocked to see that they have the same "number," at which point I value the comments more.