Anthony Fowler

  • Karma: +0/-0
Re-analysis of GCA's Top 100
« on: December 27, 2010, 10:09:21 PM »
[Disclaimer: The subsequent ranking is not an official GCA.com ranking, but all ratings were provided by members of the Discussion Group.]

Ian Linford has kindly shared the data that he collected for the construction of GCA's top 100 courses.  I have re-analyzed the data using an alternative method, and I will leave it to you to decide which method you prefer.  If you remember, 177 GCAers rated 412 different courses on the 0-10 Doak Scale.  Ian then provided two different top 100 lists by the average rating that each course received.  The more refined method removed outlier ratings and weighted raters based on the number of courses played.  Ian's methods are very similar (as far as I can tell) to those used by the major magazines.

There are several problems with averaging ratings from different raters.  First, each rater has not played every course.  If some raters tend to give higher ratings than others, this can bias the rankings.  Second, raters have no incentive to report their true preferences.  They might give a course an unfairly low score because they believe it to be overrated.  In my view, the better way to construct a ranking of golf courses is through head-to-head match-ups.  We can only determine that Course A is better than Course B if people who have played both courses tend to rate Course A higher than Course B.  The fact that some raters think Course A is an 8 while others think Course B is a 7 tells us little about their relative quality if very few people have played both.

Therefore, I tried to construct a new ranking from the same data by looking only at head-to-head match-ups.  I took all courses in the top 200 that were rated by 10 or more GCAers and computed every pairwise comparison.  For each match-up, I looked at the raters who had played both courses and counted how many preferred Course A over Course B and vice versa.  From there I could compute the number of wins, losses, and ties that each course has against every other course on the list.  The results are similar to Ian's, with some notable differences.

Pine Valley is the clear #1, beating every other course in a head-to-head matchup, and Cypress Point is the clear #2, beating every course except Pine Valley.  From there, things are a bit more complicated.  Because most raters have only played a fraction of the courses, it is possible to get complicated cycles where Shinnecock beats Merion, Merion beats NGLA, NGLA beats Sand Hills, and Sand Hills beats Shinnecock (this actually happened).  Therefore, I decided to give every course 1 point for each win and 1/2 a point for each tie.  I think you could argue for other ways to break these cycles, but this seems reasonable as a first cut.  Below are some of the results:
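For anyone who wants to experiment, the scoring scheme described above (1 point per head-to-head win, half a point per tie) can be sketched in a few lines of Python.  The ratings dictionary below is toy data for illustration only; the real dataset is Ian's and is not reproduced here.

```python
from itertools import combinations

# Toy data: rater -> {course: Doak score}.  Names and scores are made up.
ratings = {
    "rater1": {"Pine Valley": 10, "Cypress Point": 10, "Merion": 9},
    "rater2": {"Pine Valley": 10, "Cypress Point": 9},
    "rater3": {"Cypress Point": 9, "Merion": 8, "Pine Valley": 9},
}

courses = sorted({c for scores in ratings.values() for c in scores})
points = {c: 0.0 for c in courses}

for a, b in combinations(courses, 2):
    # Only raters who have rated both courses get a say in this match-up.
    a_wins = b_wins = 0
    for scores in ratings.values():
        if a in scores and b in scores:
            if scores[a] > scores[b]:
                a_wins += 1
            elif scores[b] > scores[a]:
                b_wins += 1
    if a_wins > b_wins:
        points[a] += 1.0      # head-to-head win: 1 point
    elif b_wins > a_wins:
        points[b] += 1.0
    else:
        points[a] += 0.5      # tie: half a point each
        points[b] += 0.5

ranking = sorted(courses, key=points.get, reverse=True)
```

On this toy data the ranking comes out Pine Valley, Cypress Point, Merion, since Pine Valley wins both of its match-ups among the raters who played both courses.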

Top 20
Pine Valley Golf Club
Cypress Point Club
National Golf Links of America
Royal Melbourne (West)
Sand Hills Golf Club
Shinnecock Hills Golf Club
Merion Golf Club (East)
Royal County Down
St. Andrews (Old)
Royal Dornoch
Pacific Dunes
Oakmont Country Club
Royal Portrush
Crystal Downs Country Club
Muirfield
Ballybunion
Riviera Country Club
Augusta National Golf Club
Pebble Beach Golf Links
Prairie Dunes Country Club

Overrated (GCA ranking / head-to-head ranking)
Royal County Down (3 / 8)
Paraparaumu Beach (43 / 92)
Ballyneal (15 / 27)
Kingsley Club (51 / 79)
Rock Creek Cattle Company (47 / 71)

Underrated (GCA ranking / head-to-head ranking)
National Golf Links of America (8 / 3)
Royal Porthcawl (125 / 78)
The Golf Club (75 / 48)
Casa de Campo (Teeth of the Dog) (56 / 36)
Riviera Country Club (26 / 17)
TPC at Sawgrass (Players Stadium) (74 / 49)
Oakland Hills Country Club (South) (104 / 74)

Much to my surprise and dismay, courses like the Kingsley Club and Ballyneal are overrated while TPC Sawgrass is underrated.  This arises from the fact that GCAers who have played Kingsley and Ballyneal tend to give higher ratings to all courses, biasing the performance of these courses in the GCA Rankings.  Even though these courses have high scores, they don't win as many head-to-head matches as you would expect by looking at the GCA rankings.  I wouldn't read too much into NGLA and Royal County Down, since the difference between 3 and 8 in these rankings is minuscule.

Although I don't have the data, I would assume that the Golf Digest, GOLF Magazine, and Golfweek rankings suffer from the same problems.  The courses with the highest average ratings may not win the most head-to-head match-ups against other courses.  What do you think about this method of ranking compared to the typical approach taken by others?  Do you buy the argument that the best way to compare courses is to focus only on those raters who have played both courses?  How do you think the rankings in the major magazines might be biased as a result of this phenomenon?
« Last Edit: December 28, 2010, 09:12:42 AM by Anthony Fowler »

Tom_Doak

  • Karma: +1/-1
Re: Re-analysis of GCA's Top 100
« Reply #1 on: December 27, 2010, 10:29:49 PM »
Anthony:

Your description of your methodology has me a bit confused.  Do you mean your rankings are based on a winning percentage (wins/losses/ties) from their records against the rest of the top 200?

I have never seen a poll done that way, and am trying to visualize the results.  I am thinking that the change in results for those newer courses is that one or two relatively poor votes from the most well-traveled panelists would put a lot of losses on their ledger, and drag them abruptly down the list.  Perhaps that's not at all unfair -- indeed, the rankings of the courses you named are STILL much higher in the GCA poll than in any other -- but it does give the most-traveled panelists a lot of influence, pro or con.


Jim Colton

Re: Re-analysis of GCA's Top 100
« Reply #2 on: December 27, 2010, 10:47:23 PM »
Anthony,

  I think the head-to-head information is where the value lies in this type of rating exercise, and that's the logic behind my GolfBlog100 ratings collaboration.  I do wonder how you compiled the results to reach your final conclusions, however.  Perhaps I should get the same data from Ian and run it through my engine.

  Jim

Adam Clayman

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #3 on: December 27, 2010, 11:24:24 PM »
but it does give the most-traveled panelists a lot of influence, pro or con.



Not to mention the petty vindictiveness that has reared its ugly head from members of this forum.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Anthony Fowler

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #4 on: December 27, 2010, 11:54:08 PM »
Tom and Jim: I should have been clearer about the methodology.  For each pair of courses, I determined which one would win a head-to-head vote among all raters who rated both.  Therefore, one individual can only influence the ranking if their vote is pivotal, meaning the two courses were previously tied or within one vote of each other.  Since there aren't many such cases, one voter is unlikely to sway the results dramatically.  That said, as you've pointed out, this method does implicitly give more weight to the more traveled raters, because they effectively cast more head-to-head votes.  I think averaging is actually much more sensitive to the leanings of one particular voter.  For example, only 26 of the 177 GCAers rated Royal Melbourne (West).  If one person switched their rating from a 10 to a 0, the course's average would fall from 9.8 to 9.4, dropping it from 3rd to 12th.  What's worse, that rater may never have played any other top-25 course and therefore has no basis for judging whether Royal Melbourne belongs in that category.
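The arithmetic behind that Royal Melbourne example is easy to check: among n raters, one rating switched from 10 to 0 moves the mean by 10/n.

```python
# One rater switching a 10 to a 0 among n raters lowers the mean by 10/n.
n = 26                      # raters who rated Royal Melbourne (West)
old_mean = 9.8              # the course's average in the example
new_mean = old_mean - 10 / n
print(round(new_mean, 1))   # 9.4
```

So a single vote among 26 swings the average by almost four tenths of a point, enough to move a course nine spots in an average-based table.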

Just for fun, I went back to the data and imagined that one of the most traveled raters (who had played 277 of the 412 courses) switched their rating of Ballyneal from a 9 to a 0.  Under the averaging system, Ballyneal would have fallen from 15th to 21st.  Under my head-to-head system, it would have fallen from 27th to 38th.  These seem like fairly comparable drops.  However, if an inexperienced rater had given Ballyneal a 0, it would have the same effect on the average rankings but no effect whatsoever on the head-to-head ranking.  I think it's reasonable to give more weight to the more experienced raters, and I think the head-to-head system has better safeguards against one rater sabotaging a course's chances.

Ian_L

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #5 on: December 28, 2010, 01:43:36 AM »
Hi Anthony,

Did you only use your method of assigning 1 pt per win and 1/2 pt per tie when courses violated the transitive property (A > B > C > A)? Otherwise it seems like head-to-head wins against weak courses would count the same as wins against great courses.

Anthony Fowler

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #6 on: December 28, 2010, 09:10:28 AM »
Hi Ian,

I used this point method throughout.  However, if there is a Condorcet winner, a course that wins its head-to-head match-up against every other course, it will necessarily have the most points.  More generally, when the results are transitive (no cycles), this point system recovers the true head-to-head order.  Unfortunately, due to the large number of courses and the small number of ratings per voter, most courses are in some sort of cycle.
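The Condorcet check described above can be sketched as follows.  The `beats` and `cycle` tables are hypothetical stand-ins for the real pairwise results: `beats[a][b]` is True when more of the common raters preferred a to b.

```python
def condorcet_winner(beats):
    """Return the course that beats every other course outright, or None."""
    for course, results in beats.items():
        if all(results.values()):   # wins every head-to-head match-up
            return course
    return None                     # a cycle (or ties) leaves no clear winner

# Hypothetical pairwise results with a clear winner:
beats = {
    "Pine Valley": {"Cypress Point": True, "Merion": True},
    "Cypress Point": {"Pine Valley": False, "Merion": True},
    "Merion": {"Pine Valley": False, "Cypress Point": False},
}
print(condorcet_winner(beats))   # Pine Valley

# A rock-paper-scissors cycle like Shinnecock/Merion/NGLA/Sand Hills
# has no Condorcet winner:
cycle = {
    "A": {"B": True, "C": False},
    "B": {"C": True, "A": False},
    "C": {"A": True, "B": False},
}
print(condorcet_winner(cycle))   # None
```

When `condorcet_winner` returns None, some tie-breaking rule is needed, which is exactly what the 1/0.5 point system provides.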

For those interested, here is the complete top 100:
1   Pine Valley Golf Club
2   Cypress Point Club
3   National Golf Links of America
4   Royal Melbourne (West)
T5   Sand Hills Golf Club
T5   Shinnecock Hills Golf Club
T7   Merion Golf Club (East)
T7   Royal County Down
T7   St. Andrews (Old)
10   Royal Dornoch
11   Pacific Dunes
12   Oakmont Country Club
13   Royal Portrush
14   Crystal Downs Country Club
15   Muirfield
16   Ballybunion
T17   Riviera Country Club
T17   Augusta National Golf Club
T19   Pebble Beach Golf Links
T19   Prairie Dunes Country Club
21   San Francisco Golf Club
22   Kingston Heath
23   Seminole Golf Club
24   Winged Foot Golf Club (West)
T25   Royal St. George's
T25   Pinehurst No. 2
27   Ballyneal
28   Fishers Island Golf Club
T29   Friar’s Head 
T29   Barnbougle Dunes
31   Sunningdale (Old)
32   Rye
33   Turnberry (Ailsa)
34   Cape Kidnappers, N.Z.
35   Highlands Links
T36   Casa de Campo (Teeth of the Dog)
T36   Chicago Golf Club
38   The Country Club (Composite)
39   New South Wales GC
40   Los Angeles Country Club (North)
T41   Swinley Forest
T41   Woodhall Spa (Hotchkin)
43   Bethpage State Park (Black)
44   Sebonack Golf Club
45   Garden City Golf Club
T46   North Berwick
T46   Lahinch
48   The Golf Club
49   TPC at Sawgrass (Players Stadium)
50   Royal Birkdale
51   St. George's Hill
52   Yale University Golf Course
53   Ocean Course at Kiawah Island
T54   Carnoustie (Championship)
T54   St. Enodoc
T54   St. George's (Canada)
57   Olympic Club (Lake)
58   Cruden Bay
59   Prestwick
60   Myopia Hunt Club
61   Royal Lytham & St. Annes
62   Ganton
63   Portmarnock (Old)
64   Holston Hills Country Club
65   Royal Liverpool
66   Sunningdale (New)
67   Walton Heath (Old)
68   Pasatiempo Golf Club
T69   Machrihanish
T69   Royal Melbourne (East)
T69   Rock Creek Cattle Company
T72   Mid Ocean
T72   Valley Club of Montecito
T74   Oakland Hills Country Club (South)
T74   Wannamoisett Country Club
76   Pete Dye Golf Club
77   Bandon Trails
78   Royal Porthcawl
79   Kingsley Club
80   Plainfield Country Club
81   Royal Troon
T82   Bandon Dunes
T82   Camargo Club
84   Quaker Ridge Golf Club
85   Shoreacres Golf Club
86   Yeamans Hall Club
87   Monterey Peninsula Country Club (Shore)
T88   Maidstone Club
T88   Royal West Norfolk
90   Southern Hills Country Club
91   Banff Springs
92   Paraparaumu Beach
T93   Western Gailes
T93   Jasper Park
T95   Gleneagles King's
T95   Somerset Hills Country Club
T97   Notts (Hollinwell)
T97   Pennard
99   Whistling Straits (Straits)
100   Desert Forest Golf Club

George Freeman

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #7 on: December 28, 2010, 09:14:39 AM »
I haven't done the count, but this list appears to have many more non-US courses than Ian's list.  Anthony, is that the case?  If so, any reason why?
Mayhugh is my hero!!

"I love creating great golf courses.  I love shaping earth...it's a canvas." - Donald J. Trump

Ronald Montesano

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #8 on: December 28, 2010, 09:18:41 AM »
so...who pisses the farthest again?
Coming in 2024
~Elmira Country Club
~Soaring Eagles
~Bonavista
~Indian Hills
~Maybe some more!!

Anthony Fowler

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #9 on: December 28, 2010, 09:38:45 AM »
Hi George,

You are correct; thank you for pointing this out.  Non-US courses do slightly better on my list than on the previous lists.  On average, a non-US course sits almost a full position higher on my list than on Ian's.  This arises because raters who have played mostly American courses (presumably Americans) give more generous ratings.  However, those US courses do worse than expected in head-to-head match-ups against international courses among raters who have played both.  I would expect this US bias to be even greater for Golf Digest, GOLF Magazine, and Golfweek than it is for GCAers.

Tom_Doak

  • Karma: +1/-1
Re: Re-analysis of GCA's Top 100
« Reply #10 on: December 28, 2010, 10:17:34 AM »
Anthony:

I don't think the U.S. bias is worse for GOLF Magazine's list than for this one, based on past results.  GOLF Magazine listed most of the international courses shown above at least once before you came around.  As for the ones it didn't list, it might not be "bias" but instead common sense to keep them off.

However, overall, I think that your method is an improvement on the magazines' method, because you are only doing what the panelists themselves do -- comparing the courses head to head, and listing them in the order you think appropriate.  That's what rankings really are -- they are comparative, not absolute.  Some methods declare that they are ranking the "greatest" courses, but there is no magic formula for greatness upon which everyone agrees, so the results are often skewed from what the panelists themselves actually think.

In the end, though, a ranking is only as good as the judgment of the people who vote on it.

Jim Eder

Re: Re-analysis of GCA's Top 100
« Reply #11 on: December 28, 2010, 10:19:40 AM »
Anthony,

Very interesting work. It certainly makes me think more. Good job

Adam_Messix

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #12 on: December 28, 2010, 12:26:36 PM »
A couple of guys have made interesting comments about how a well-traveled rater who's played many of the courses on the list can skew the win-loss ratio of a particular course, especially if they have a bias against it.  This is the nature of the beast, though, when dealing with any sort of subjective judging.  I tend to give courses a second chance when they are highly regarded and I don't necessarily get it.  In some cases, the second visit changes my view, but often it only reinforces my first impression.

I remind everyone that the golf course ratings/rankings game has a unique dynamic that the last sentence highlights.  Most course ratings, and hence the rankings, are based on a one-shot visit by the rater.  This tends to favor an eye-catching, dazzling course over a more strategic course (National Golf Links is the ultimate example) whose features you cannot possibly catch in a single visit, or even in several.  IMHO, the truly great courses get better with each successive play, whereas the lesser courses tend to diminish.

Personally, I think the ratings process (particularly Digest's) would be better if the panel were smaller and better traveled, having seen a breadth of courses so they can compare a Black Rock to a Seminole, a LACC North to a Yeamans Hall, a Shinnecock to a Eugene CC, etc.  I played a Top 100 course with a fellow GCAer and a magazine rater, and it was amazing to us that he had hardly seen anything outside his home state.  As they say on ESPN's Monday NFL Countdown, "C'mon Man!!!"

The other point of note is that each rating system I've come across has its own inherent bias, built into how the courses are scored.  If you want an honest assessment, give each course you've played a 1-10 score like Doak does, then rate like GOLF Magazine does, which asks you to place a course relative to the rest of the world, then use the individual component scoring like Digest, then do a hybrid 1-10 like Golfweek.  If you score honestly, your rankings will be different under each system, particularly Digest's.

George Pazin

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #13 on: December 28, 2010, 12:58:26 PM »
Personally, I think the ratings process (particularly Digest's) would be better if the panel were smaller and better traveled having seen the breadth of courses to which they can compare a Black Rock to Seminole, a LACC North to Yeamans Hall, a Shinnecock to a Eugene CC; etc.

How you chose your panel would probably influence the ratings more than any other single factor.

All ratings/rankings are flawed. Pick the one whose flaws suit you best! :)
Big drivers and hot balls are the product of golf course design that rewards the hit one far then hit one high strategy.  Shinny showed everyone how to take care of this whole technology dilemma. - Pat Brockwell, 6/24/04

Sean_A

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #14 on: December 28, 2010, 01:07:25 PM »
Adam

I guess this idea of head-to-head doesn't sound good to me because a lot of the time there is nothing to choose between courses; it's a draw.  Then I am forced to vote for my preference even if I don't like the concept of courses being very different but essentially equal.  This is how a solid course like Machrihanish can climb well up the table, to the top of the solids if you like, even though it's not really the #69 course in the world as this recent table suggests.  I look at Pennard and Brora (which I don't see anywhere) and think these three should be standing shoulder to shoulder in any ranking, yet there seems to be quite a disparity, and I would suggest it comes down to preference rather than quality of golf.


Ciao  
New plays planned for 2024:Winterfield, Alnmouth, Camden, Palmetto Bluff Crossroads Course, Colleton River Dye Course  & Old Barnwell

George Pazin

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #15 on: December 28, 2010, 01:23:04 PM »
A few years ago, one of the big golf mags (I think Golf) ran its annual equipment issue with a short bio of each tester and included remarks and observations from a few of them on each club.  You could look for the guy whose game most resembled yours and then follow his comments/ratings.

That's about the only way to do rankings where the results are meaningful, to me anyway. On here, I look for certain guys whose tastes match mine and choose accordingly. It does not meet any requirements for well-rounded panels or anything silly like that, but it works perfectly for me.

Utilizing this system, I will never be accused of having done the heavy lifting necessary to contribute to worthless threads or rankings, but I will have a lot of fun!
Big drivers and hot balls are the product of golf course design that rewards the hit one far then hit one high strategy.  Shinny showed everyone how to take care of this whole technology dilemma. - Pat Brockwell, 6/24/04

Matt_Ward

Re: Re-analysis of GCA's Top 100
« Reply #16 on: December 28, 2010, 01:33:15 PM »
The concept of having fewer people do the ratings would help from a quality-control standpoint, but it still leaves unsaid how such people are chosen and how many sites/courses they can get to in a year.  Frankly, too many raters are simply homers for their own "neck of the woods."

I do take issue with the idea that ratings are meaningless -- they can be very useful but as Tom D mentioned it all comes down to the judgement of the people doing the reviewing.

Anthony Fowler

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #17 on: December 28, 2010, 01:39:43 PM »
Sean: Pennard is T97 in the list above, definitely in the same range as Machrihanish.  Also, there is nothing wrong with raters saying that two courses are about equal.  If the same rater gave Pennard and Machrihanish the same score on the 0-to-10 scale, they're saying the two are roughly tied in their mind, which is fine with me.

Charlie: I do not believe this dataset includes updated ratings, but it would be great to have a way for people to update continually.  For one, the rankings would stay current.  We could also do something George would like: find out which raters are similar and show each participant the courses rated highly by others with similar preferences.

George: See my other thread on the factors that separate us.  The whole goal of that analysis is to develop a personalized list of courses based on the ratings of others who are similar to you.  If you participated in the rating, I would be happy to help you figure out where you stand and which courses you personally would prefer more than the average GCA rater would.

Adam_Messix

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #18 on: December 28, 2010, 03:46:58 PM »
Anthony--

That's an interesting idea, comparing your personal scores used for the ranking versus the average of the group. 

George P--

No question that how raters are selected is important, but if you take it as fact that raters' opinions are honest and not skewed by outside factors, then I still think the magazine's rating system matters more.  Take the opportunity to score courses using Golf Digest's method and you'll see what I mean.  This is not a criticism of how Digest does it, just to say that it plays to certain strengths.  Doak makes note of this in The Confidential Guide.

Sean--

When two courses have a similar score in my mind, you break the tie via personal preference.  That's the spice of life.  You should base your ranking of golf courses on what you like, and that's great.  Golf course architecture is a large world, and there's room for many different types of courses.

Sean_A

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #19 on: December 28, 2010, 05:38:03 PM »
Adam

I disagree.  Great architecture, imo, has little to do with what I like.  Sure, there is some crossover, but my preferences and what is great are two separate concepts.  If I am talking only about the merits of a design, then I try to keep my preferences out of it, because they are quite defined, but not necessarily limiting.  Additionally, I am on much softer ground when talking about great architecture, because I have no experience with the subject except from a user's PoV, which is quite limiting.

Anthony

That was my point.  What is wrong with saying two or ten courses are about equal and leaving it at that?  To me, the actual number assignment is misleading and unnecessary.

Ciao
New plays planned for 2024:Winterfield, Alnmouth, Camden, Palmetto Bluff Crossroads Course, Colleton River Dye Course  & Old Barnwell

Jim Nugent

Re: Re-analysis of GCA's Top 100
« Reply #20 on: December 29, 2010, 11:14:47 AM »
Anthony, I'm not sure I understand your methodology.  You looked at situations where raters had played two courses on the list.  You compared the Doak score of each rater, for both courses.  If more raters gave higher Doak scores to one course, that course got a 1.  If it was a tie -- i.e. an equal number of raters gave a higher Doak score to each course -- both courses got a 1/2. 

Then you added up the total number of points for each course.  You rank the courses in order of which ones have the most points, with the highest points being first, 2nd highest being second, and so forth.  Is that right, or am I missing something? 


Anthony Fowler

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #21 on: December 29, 2010, 11:22:14 AM »
Jim: I think you have it right.  For every possible pair of courses, I looked only at the raters who had played both.  Then I allocated 1 point between the two courses just as you described.

There were 177 courses that were both in the top 200 and had more than 10 ratings.  Each course could therefore earn as many as 176 points (which Pine Valley did) or as few as 0.  Saying that Pine Valley beat every other course in a head-to-head match-up is much more meaningful than saying it had the highest average rating among raters who had each played a different subset of courses.
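The bookkeeping here is simple and worth spelling out: with 177 qualifying courses, each course plays 176 match-ups, and its score is its wins plus half its ties (a Copeland-style score).  The 150-20-6 record below is hypothetical, just to show the arithmetic.

```python
# With 177 qualifying courses, each faces 176 head-to-head match-ups.
n_courses = 177
max_points = n_courses - 1
print(max_points)            # 176 (a clean sweep, as Pine Valley achieved)

# A course's score is wins + ties/2; e.g. a hypothetical 150-20-6 record:
wins, losses, ties = 150, 20, 6
assert wins + losses + ties == max_points
score = wins + ties / 2
print(score)                 # 153.0
```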

Jim Nugent

Re: Re-analysis of GCA's Top 100
« Reply #22 on: December 29, 2010, 11:48:14 AM »
One immediate point is that the raters didn't really do head-to-head.  They gave Doak points, which you use as a proxy for head-to-head.  If they had actually done head-to-head rankings, I wonder if the results might have differed a bit?

Jud_T

  • Karma: +0/-0
Re: Re-analysis of GCA's Top 100
« Reply #23 on: December 29, 2010, 11:53:58 AM »
It would be interesting to do a head to head bracket for say the top 256 courses.  Not sure what it all means, but it'd be a fun exercise nevertheless...
Golf is a game. We play it. Somewhere along the way we took the fun out of it and charged a premium to be punished.- - Ron Sirak

Jim Nugent

Re: Re-analysis of GCA's Top 100
« Reply #24 on: December 29, 2010, 01:06:01 PM »
Anthony, can you post some of the actual scores?  E.g., how did the PV/CPC "match-up" turn out?
