
Mark Bourgeois

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #150 on: December 30, 2012, 09:11:18 AM »
Thanks, Kevin, I hadn't seen that. Yes, it appears Jim C did his best to apply the method to a survey that wasn't head to head, and he correctly noted that the approach doesn't solve transitivity violations. The survey instrument obviously matters a great deal; it influences and even determines outcomes, so an actual head-to-head survey would be best.

In the end, everything depends on what we're measuring. No one other than Ron Whitten should be under any illusions that we're doing anything other than measuring the subjective preferences of a defined population. As with the notion of the "ideal woman," tastes are subjective and vary, often greatly, over time.

I'd like to see the statistical experts like Anthony Fowler, Jim Colton, JC Cummings, et al. discuss the merits and problems of the following two exercises.

I. Scaled comparison course ranking:

1. Head to head comparisons.
2. Dump the Doak Scale (I don't see how it's remotely possible to apply in a scaled-comparison exercise) and 1-to-10 types of scoring systems, and instead force the reviewer to allocate 10 rounds between the two courses, instructing the reviewer to neglect distance, money, and access as considerations. (We could even make it head-to-head-to-head, or head-to-head-to-head-to-head.) The outcome of this step is a measure, or weight, for each course as given by the individual rater.
3. Collate raters' course scores into a weight for each course.
4. Do NOT just list the courses for a ranking! Instead, graphically show the relation of the weights to each other. I think this would show that only a handful of courses matter; the rest is just noise / random walking. (A rough sketch of steps 2 and 3 in code follows below.)
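To make steps 2 and 3 concrete, here is a minimal Python sketch; the ballot format and the share-of-rounds weighting are illustrative assumptions, not a spec.

[code]
from collections import defaultdict

def collate_round_allocations(ballots):
    # Each ballot splits exactly 10 hypothetical rounds between the two
    # courses in a matchup, e.g. {"Course A": 7, "Course B": 3} (step 2).
    # Returns each course's share of the rounds at stake in its matchups (step 3).
    rounds_won = defaultdict(int)       # rounds allocated to each course
    rounds_at_stake = defaultdict(int)  # 10 per matchup the course appears in
    for ballot in ballots:
        assert sum(ballot.values()) == 10, "rater must allocate exactly 10 rounds"
        for course, rounds in ballot.items():
            rounds_won[course] += rounds
            rounds_at_stake[course] += 10
    return {c: rounds_won[c] / rounds_at_stake[c] for c in rounds_won}

# Hypothetical ballots from three raters:
weights = collate_round_allocations([
    {"Course A": 7, "Course B": 3},
    {"Course A": 6, "Course C": 4},
    {"Course B": 5, "Course C": 5},
])
# Step 4 would then plot these weights rather than just listing a rank order.
[/code]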

This approach is criteria-free, which is fine, because in my heart of hearts I believe that people, unless given the narrowest, least interesting, totally straitjacketing criteria, deep down just pick things they like for whatever reason. And if you ask them the reason, they may not know, or they may be lying to us and / or to themselves. They / we probably don't even realize it.

So...

II. Ranking criteria -- not sure I have the order right but here goes:

1. Prior to the scaled comparison exercise above, have each reviewer assign 100 pennies total across a list of architectural and non-architectural attributes.
2. Time gap / break.
3. Reviewer completes scaled comparison course-ranking exercise.
4. For a random selection of courses, have reviewer tick boxes next to the list of attributes he feels the course possesses.
5. Derive weightings of attributes from the reviewer's scaled-comparison scores (e.g., by regression; a sketch follows below).
6. Normalize attribute weightings across all reviewers.
7. Compare to a priori 100 pennies exercise.

This would allow us to infer (fancy guess) the criteria raters actually use. It would allow us to explicitly weed out non-architectural criteria and then go back and reweight reviewers' original course rankings. And it would let us see how closely reviewers' stated criteria (like a magazine's rating criteria) match what they actually value in course attributes.
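And a minimal sketch of step 5, assuming the step-4 tick boxes are coded 0/1 and the course weights come from the scaled comparisons above; the attribute names and all numbers are hypothetical:

[code]
import numpy as np

attributes = ["width", "green contours", "conditioning", "scenery"]
X = np.array([      # rows = courses, columns = attributes ticked in step 4
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 0],
])
y = np.array([0.70, 0.55, 0.40, 0.62])  # course weights from steps 2-3

# Least-squares fit: the coefficients act as the reviewer's inferred
# attribute weightings (step 5), ready for normalization in step 6
# and comparison against the 100-pennies priors in step 7.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(attributes, coef.round(3))))
[/code]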

Mark
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

Jud_T

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #151 on: December 30, 2012, 09:18:45 AM »
Mark,

I like it.  Let's do it! You're in charge of Son of Unofficial GCA Rankings.  I nominate you, anyone second?  8)
Golf is a game. We play it. Somewhere along the way we took the fun out of it and charged a premium to be punished.- - Ron Sirak

Mac Plumart

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #152 on: December 30, 2012, 09:22:54 AM »
Quote from: Jud_T on December 30, 2012, 09:18:45 AM
Mark,

I like it.  Let's do it! You're in charge of Son of Unofficial GCA Rankings.  I nominate you, anyone second?  8)

2nd!!!!
Sportsman/Adventure loving golfer.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #153 on: December 30, 2012, 09:25:43 AM »
Guys,
Part of the issue I think you have with the Digest results is just a difference of opinion as to what makes a golf course great in the first place. Many of you over time have voiced complaints with the categories.  The list rates golf courses, not architecture. I actually think this new list does a pretty good job of ranking courses that are proficient in the categories Golf Digest uses to define greatness. You can argue all you want whether the categories should be changed or whether the categories or results are good for golf, etc., but it's a subjective exercise at the end of the day that's designed to elicit discussion (and perhaps some arguments). There's always more discussion of the Digest list than of any other non-internal list that comes out, so perhaps that's a good thing for growing interest in the topic. I'd argue the 100 Greatest gets more people to think about golf courses (and architecture as a subset of that) than just about any other medium, so that's probably a good thing even for those of you who don't love the list itself.

Tom_Doak

  • Karma: +2/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #154 on: December 30, 2012, 09:33:34 AM »

The "head-to-head" concept you mentioned reminded me of a methodology that Anthony Fowler proposed (and Jim Colton ran with) from about 2 years ago.

http://www.golfclubatlas.com/forum/index.php/topic,46888.0.html

I thought it was an improvement over the traditional methods.  I'd be interested to hear your thoughts on their method.

Isn't this how LINKS Magazine is now compiling its own version of the top 100?  Head to head matchups?  [Hopefully they are using Jim Colton's approach to counting up the results instead of Anthony's.]
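For anyone who doesn't click through: one standard way to turn head-to-head matchups into a ranking is a Bradley-Terry model. The sketch below uses the classic iterative update; it is a generic illustration, not necessarily the Fowler or Colton tallying method from the linked thread.

[code]
from collections import defaultdict

def bradley_terry(matchups, iters=200):
    # matchups: list of (winner, loser) pairs from head-to-head comparisons.
    # Returns normalized strengths; p(i beats j) = s_i / (s_i + s_j).
    courses = {c for pair in matchups for c in pair}
    wins = defaultdict(int)
    games = defaultdict(int)   # comparisons per unordered pair
    for winner, loser in matchups:
        wins[winner] += 1
        games[frozenset((winner, loser))] += 1
    s = {c: 1.0 for c in courses}
    for _ in range(iters):
        new = {}
        for i in courses:
            denom = sum(games[frozenset((i, j))] / (s[i] + s[j])
                        for j in courses if j != i)
            new[i] = wins[i] / denom if denom else s[i]
        total = sum(new.values()) or 1.0
        s = {c: v / total for c, v in new.items()}  # normalize each pass
    return s

# e.g. bradley_terry([("Pine Valley", "Augusta"), ("Augusta", "Oakmont")])
[/code]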

Tom_Doak

  • Karma: +2/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #155 on: December 30, 2012, 09:38:05 AM »
Quote from: Andy Troeger on December 30, 2012, 09:25:43 AM
Guys,
Part of the issue I think you have with the Digest results is just a difference of opinion as to what makes a golf course great in the first place. Many of you over time have voiced complaints with the categories.  The list rates golf courses, not architecture. I actually think this new list does a pretty good job of ranking courses that are proficient in the categories Golf Digest uses to define greatness. You can argue all you want whether the categories should be changed or whether the categories or results are good for golf, etc., but it's a subjective exercise at the end of the day that's designed to elicit discussion (and perhaps some arguments). There's always more discussion of the Digest list than of any other non-internal list that comes out, so perhaps that's a good thing for growing interest in the topic. I'd argue the 100 Greatest gets more people to think about golf courses (and architecture as a subset of that) than just about any other medium, so that's probably a good thing even for those of you who don't love the list itself.

Andy:

The GOLF DIGEST list always "elicits more discussion" than any of the other lists because their formula is flawed but they just keep on using it in spite of the results it produces.  That does not make it a better ranking.  It makes it more influential ... in a bad way ... and it has done harm to the practice of golf course design because of its influence.

None of the other rankings really influence what architects are trying to do ... we want to build the "best" course but none of the other rankings give us any suggestion of how to succeed.  GOLF DIGEST tells us that if we build tees at 7700 yards and spend a lot on aesthetics and conditioning, we will do better, and if we tailor our definition of "shot values" to that of the average GOLF DIGEST panelist, who can't even describe it well, then we'll do best.

Mac Plumart

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #156 on: December 30, 2012, 09:39:47 AM »
Quote from: Andy Troeger on December 30, 2012, 09:25:43 AM
...the issue I think you have with the Digest results is just a difference of opinion as to what makes a golf course great in the first place. Many of you over time have voiced complaints with the categories.

Agreed 100%.  That is the entire deal in this, and frankly in every, ranking list: the criteria used.  We should be focusing on that, and on the weightings of the criteria and the process used for collecting and calculating the data relative to those criteria, before we can discuss the results.
Sportsman/Adventure loving golfer.

Mark Bourgeois

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #157 on: December 30, 2012, 09:48:37 AM »
Andy,

First and most importantly, I appreciate your standing up and not hiding.

I agree with some of your statement but I believe:

1. The purpose of GD's list, just like other magazines', is not to generate discussion but to promote Golf Digest. I don't have a problem with this; people gotta eat. But what I do decry is the way they festoon these things with deceptive, pseudo-scientific, bogus authoritative baubles like "greatest" and "scores" that laughably are carried out to the hundredths as though they were measuring the Higgs boson. GMAFB.

Well, it would be laughable if...

2. Rankings generate discussion here but in the real world they are used as a substitute for thought, with predictable results. Which is why I believe they should come with black-box warnings.
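To put a number on the hundredths complaint in point 1: with, say, 20 ballots per course and a rater-to-rater spread of about half a point (both figures invented for illustration), the uncertainty in a course's average is on the order of a tenth of a point, so published hundredths are pure noise:

[code]
from statistics import mean, stdev

# 20 hypothetical ballots for one course on a 10-point scale
scores = [8.0, 7.5, 8.5, 7.0, 8.0, 9.0, 7.5, 8.0, 8.5, 7.5,
          8.0, 7.0, 9.0, 8.5, 7.5, 8.0, 8.0, 7.5, 8.5, 8.0]
se = stdev(scores) / len(scores) ** 0.5   # standard error of the mean
print(f"{mean(scores):.2f} +/- {se:.2f}")  # roughly 7.97 +/- 0.13
[/code]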

But I see Jud and Mac have nominated themselves to start a new ranking system. I wish them the best.
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

Jim Colton

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #158 on: December 30, 2012, 09:53:36 AM »
With respect to the head-to-head match-ups, the article accompanying the rankings said that 22 panelists had played both Pine Valley and Augusta, and 16 of the 22 scored Pine Valley higher. That was probably true over the last four years, too, when Augusta was #1.
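For what it's worth, 16 of 22 is hard to dismiss as a coin flip. A quick one-sided sign test (my arithmetic, not the magazine's):

[code]
from math import comb

n, k = 22, 16   # 16 of 22 panelists scored Pine Valley over Augusta
p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"P(>= {k} of {n} under a 50/50 null) = {p:.3f}")   # about 0.026
[/code]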

Tom_Doak

  • Karma: +2/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #159 on: December 30, 2012, 09:56:59 AM »
Quote from: Jim Colton on December 30, 2012, 09:53:36 AM
With respect to the head-to-head match-ups, the article accompanying the rankings said that 22 panelists had played both Pine Valley and Augusta, and 16 of the 22 scored Pine Valley higher. That was probably true over the last four years, too, when Augusta was #1.

That's pretty crazy to me.  GOLF Magazine always gets banged on for having a much smaller panel than DIGEST, but there must be at least 50 people on their panel of 100 who have played both Augusta National and Pine Valley and can compare the two.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #160 on: December 30, 2012, 09:57:54 AM »
Quote from: Tom_Doak
But, the first problem is the very idea that there is a "truly accurate rating" of such a subjective venture.  What makes the best courses the best is a truly subjective thing.  Everyone's got their own opinion, and the only reason one person's opinion is more valuable than another's is if you personally agree with their viewpoint.

Tom,
Do you really believe the above given your "flawed" comment (and many others like it)?

There are no "better" rankings IMO--they are all computed averages of select populations. Some are more appealing to you or I because they spit out results that fit our personal preferences, but that's still individual and subjective. As Mac said, you have to determine the validity of the whole exercise for yourself on a personal level. The frustration is that this ranking, which many of you don't care for, is widely accepted by a broader population.

Greg Holland

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #161 on: December 30, 2012, 10:00:20 AM »
Quote from: Jim Colton on December 30, 2012, 09:53:36 AM
With respect to the head-to-head match-ups, the article accompanying the rankings said that 22 panelists had played both Pine Valley and Augusta, and 16 of the 22 scored Pine Valley higher. That was probably true over the last four years, too, when Augusta was #1.

Quote from: Tom_Doak on December 30, 2012, 09:56:59 AM
That's pretty crazy to me.  GOLF Magazine always gets banged on for having a much smaller panel than DIGEST, but there must be at least 50 people on their panel of 100 who have played both Augusta National and Pine Valley and can compare the two.

If I recall correctly, the article said about 140 raters had played PV, 50 or so had played ANGC, and only 22 had played both.

Kevin_D

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #162 on: December 30, 2012, 10:17:05 AM »
I thought I would chime in as a "user" of the rankings as opposed to a "contributor".  I'm not an architect, developer, or rater. I haven't even played any top 100 courses yet - but have several in the works for 2013!

As a newbie, I hope this post doesn't get my head taken off!  Here goes:

I wonder: why doesn't everyone take these rankings - and in fact all rankings (other golf rankings, colleges, etc.) - with an enormous grain of salt? Any ranking is ultimately trying to quantify something that is inherently subjective.  I'm sure the GD rating system can be drastically improved, but I doubt more statistical manipulation is the way to do it (I say this as someone whose undergraduate and graduate schooling, as well as career, are all mathematics-related).  I would argue that a far better way would be to have FEWER raters - say 5-10 - who actually know what they are talking about!  There would be no need to replay every course every year, unless substantial changes had occurred.  They could solicit potential new entrants for consideration each year, and try to play the best candidates and decide how they rank against the established greats.  Ultimately, the best rating system is probably the Doak scale, in which you essentially have a lot of ties and just tiers of courses, and not much changes in any given year. But that doesn't sell magazines...
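A tiny sketch of that "tiers with lots of ties" idea: collapse fine-grained averages into coarse bands so near-equal courses land in the same tier. The ten equal-width bands are my assumption, loosely echoing the 0-10 Doak scale rather than reproducing it:

[code]
def to_tiers(scores, n_tiers=10):
    # Map course -> average score into course -> tier (1..n_tiers).
    # Equal-width bands produce lots of ties by design.
    lo, hi = min(scores.values()), max(scores.values())
    width = (hi - lo) / n_tiers or 1.0   # guard against all-equal scores
    return {c: min(n_tiers, int((s - lo) / width) + 1)
            for c, s in scores.items()}

# {"A": 9.10, "B": 9.05, "C": 7.20} -> A and B tie in the top tier
print(to_tiers({"A": 9.10, "B": 9.05, "C": 7.20}))
[/code]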

I do understand the argument that the GD rankings influence new course construction negatively.  If that is really the case, it's just sad.  Unfortunately, outside rankings/evaluations by parties that have no idea what they are doing seem to influence a lot of professions (I know they do mine).  I guess golf is worse in a way, since you end up with courses built that last forever and can negatively influence the game (leading to courses that are too hard, too tricked out, too expensive, and that encourage taking a cart instead of walking).

Anyway, I guess given the current GD rankings, I would be most interested in hearing from the true experts: which courses should be in the top 100 that have been excluded, and which are on there that shouldn't be?  Which in the top 100 are grossly over- or under-rated?

Kevin
« Last Edit: December 30, 2012, 10:38:57 AM by KVDreyer »

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #163 on: December 30, 2012, 10:21:44 AM »
I think I would have fewer concerns with the GD list if they just did a classic list a la Golfweek. They are still missing Shoreacres, Yeamans Hall, Piping Rock, and Camargo, but I think the list becomes much less flawed that way. The modern list would be another story.

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #164 on: December 30, 2012, 10:30:07 AM »
One interesting thing I noticed about the 2011-12 rankings: Baltusrol Lower was ranked 32, but Baltusrol Upper did not receive enough ballots to make the list, even though the state rankings place it between the 76th and 91st courses. Now it makes the list in the new edition, but essentially you had raters going and playing the Upper course without playing the Lower course, so it dropped off the list. I'm not sure whether a renovation was responsible for this, but if not, I find that absurd.  ???

Tom_Doak

  • Karma: +2/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #165 on: December 30, 2012, 10:35:06 AM »
Quote from: Andy Troeger on December 30, 2012, 09:57:54 AM
Quote from: Tom_Doak
But, the first problem is the very idea that there is a "truly accurate rating" of such a subjective venture.  What makes the best courses the best is a truly subjective thing.  Everyone's got their own opinion, and the only reason one person's opinion is more valuable than another's is if you personally agree with their viewpoint.

Tom,
Do you really believe the above given your "flawed" comment (and many others like it)?

There are no "better" rankings IMO--they are all computed averages of select populations. Some are more appealing to you or me because they spit out results that fit our personal preferences, but that's still individual and subjective. As Mac said, you have to determine the validity of the whole exercise for yourself on a personal level. The frustration is that this ranking, which many of you don't care for, is widely accepted by a broader population.

Andy:

Absolutely.  There is no "right" way to do the rankings, but there are many "wrong" ways.  Writing a definition of a great course that most people find fault with, and using it as the model for your rankings, is a pretty bad first step.  That's my problem with the GOLF DIGEST ranking.  If they would just admit it's subjective -- as every other ranking does -- no problem.  But they always use the word "objective" in their accompanying article, and attach superiority to their flawed definition of greatness.

Do you really believe that the GOLF DIGEST definition of a great course is correct?

Does anyone?

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #166 on: December 30, 2012, 10:42:32 AM »
I've never played Oakland Hills or Oak Hill, but are they really better golf courses than Pinehurst #2? I just find that hard to believe based on what I know of the first two courses.

David Davis

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #167 on: December 30, 2012, 10:54:47 AM »
Ok, it's all subjective; I'm happy to accept that. May I just ask the question I haven't seen asked yet, and hope not to cause too much controversy: how much does it cost to move from, say, 55 in the US rankings to the top 20? What about from the top 20 to the top 10?

And before several raters come back and say that rankings don't allow that, I will point out that just because you are a rater who does a great job following the system you are given to work with doesn't mean you know what's done with the results, or that what I'm describing is not happening. One thing is for certain: I guarantee it's a good investment for some of these courses. I will also go as far as saying I'm certain this is common practice, although I don't know anything about GD. I do know about others, and while this off-the-record practice is ethically questionable (although perhaps only in my personal opinion), it's still a business, and advertising is how it's funded.

For argument's sake, humor me a little, please. If you were a resort with an excellent course that perhaps deserved to be top 50 on the list according to the subjective rating system, and you had a chance to buy 10 or 20 places on the list, and you knew 1 million golfers were reading it and as a result putting your club on their bucket lists (since most would not have the opportunity to get on the private clubs), what would that be worth to you in terms of green fees, and how much would you be willing to pay for it from your ad budget?

Sharing the greatest experiences in golf.

IG: @top100golftraveler
www.lockharttravelclub.com

Tom_Doak

  • Karma: +2/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #168 on: December 30, 2012, 11:04:24 AM »

Quote from: David Davis on December 30, 2012, 10:54:47 AM
For argument's sake, humor me a little, please. If you were a resort with an excellent course that perhaps deserved to be top 50 on the list according to the subjective rating system, and you had a chance to buy 10 or 20 places on the list, and you knew 1 million golfers were reading it and as a result putting your club on their bucket lists (since most would not have the opportunity to get on the private clubs), what would that be worth to you in terms of green fees, and how much would you be willing to pay for it from your ad budget?


David:

I do not think that GOLF DIGEST sells upgrades to their list, so this is probably not the best place to have the hypothetical discussion you have suggested.  I do fear that a few spots on some of the other lists are bought and paid for, but my knowledge of that is strictly second-hand.  I can say that nobody has ever offered money to improve their Doak scale rating ... although perhaps that's because I put out my ratings before they knew what was coming!

But, really, I don't know that it helps a lot to be ranked higher.  People on GCA pay attention to whether Pacific Dunes has moved down four spots on the list, but I don't, because I know most others don't either. 

Maybe being in the top 50 is better than being in the top 100, and being in the top 10 is better than the top 50 ... but I don't believe anyone has ever bought their way into the top 10 in ANY ranking discussed here.  Being in the top ten might be worth another $20 on the green fee times however many rounds you are hosting for the two years the course is ranked that high ... and even more if it attracted people to a course that wasn't currently playing near to capacity.
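Putting rough numbers on that back-of-the-envelope (the rounds figure below is invented; only the $20 premium and the two-year cycle come from the post):

[code]
premium_per_round = 20      # extra green fee a top-ten spot might support
rounds_per_year = 30_000    # hypothetical volume at a busy resort course
years_ranked = 2            # rankings run on a two-year cycle
print(f"${premium_per_round * rounds_per_year * years_ranked:,}")  # $1,200,000
[/code]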

But, you couldn't afford to buy your way onto the top ten of ALL the lists, so buying your way onto one would be viewed a bit suspiciously. 

BCrosby

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #169 on: December 30, 2012, 11:06:07 AM »
Some thoughts on rankings:

- A history of rankings would be fascinating (and revealing). The earliest one I know of was conducted by John Low in Nisbet's Golf Year Book in 1905 or so. Low polled the top amateur and professional players of the era. TOC came out on top, as might be guessed. What was interesting was how low many of the pros ranked TOC.

- Then there was Joshua Crane's ranking system in the mid-1920s. A topic for another time.

- All rankings, including Low's, are commissioned by periodicals to sell stuff. However you define 'stuff'. GD is in the business of distributing adverts to as wide a public as possible. They do that by way of a monthly magazine that people will buy for its swing tips, golfer gossip and course rankings, among other things. Which means that the ultimate test of a ranking, not unlike the value of a swing tip, is not whether it is ultimately true or accurately maps the real world. It's whether it attracts reader eyeballs.    

- The rat in the woodpile in rankings is the issue of access. Let me come at this indirectly. Why does no one care about rankings of great composers? Bach vs. Mozart vs. Brahms, etc.? Or artists? Why don't you read annual Rolling Stone pieces on where The Beatles rank vs. Jerry Lee Lewis? You don't see those articles because no one cares. No one cares because I can listen to Bach or Mozart or Jerry Lee Lewis and come up with my own preferences. I don't need a magazine to guide me in setting my preferences.

None of that is possible with golf courses. Few people will have played more than a handful of the best courses in the world, whether because of limited access or because they don't have the resources to visit courses spread across the world. Course rankings allow us to experience vicariously courses we'll never play by seeing how those unplayed courses rank against courses with which we are more familiar. The appeal of rankings is that they make readers feel like they are learning something (albeit not much) about courses they will never see or play.

Which means that the real value of rankings to the magazines that carry them is not whether they correspond to reality, or whether they match up with criteria for good architecture, or whether they reflect a consensus of the experts. No one (other than us GCA nutballs) really cares about any of that. What matters is that readers get a second-hand feel for courses with reputations but which they will likely never experience. (Which helps explain why it is virtually impossible for a course without a reasonably wide reputation to crack the rankings. It also explains why the big splash of new courses gets them into the rankings and why, when the splash fades, they disappear from the rankings.)

- Golf course rankings are not really about ranking golf courses for the simple reason that no ranking has a good answer to the "compared to what?" question. What rankings are really about is ranking the prevailing reputations of golf courses at a point in time.  

Bob      

  
« Last Edit: December 30, 2012, 11:22:59 AM by BCrosby »

C. Squier

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #170 on: December 30, 2012, 11:10:34 AM »
The main problem, IMO, is that they use a set of criteria that very few of us would come up with ourselves if we had to say, "What things make courses great?"

I may actually try to rate courses I've played using the GD criteria.....who knows, my list may look just like theirs. The criteria just don't sync with my "great course" ideals.

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #171 on: December 30, 2012, 11:32:05 AM »

Quote from: BCrosby on December 30, 2012, 11:06:07 AM
Which means that the real value of rankings to the magazines that carry them is not whether they correspond to reality, or whether they match up with criteria for good architecture, or whether they reflect a consensus of the experts. No one (other than us GCA nutballs) really cares about any of that. What matters is that readers get a second-hand feel for courses with reputations but which they will likely never experience. (Which helps explain why it is virtually impossible for a course without a reasonably wide reputation to crack the rankings. It also explains why the big splash of new courses gets them into the rankings and why, when the splash fades, they disappear from the rankings.)

- Golf course rankings are not really about ranking golf courses for the simple reason that no ranking has a good answer to the "compared to what?" question. What rankings are really about is ranking the prevailing reputations of golf courses at a point in time.

Bob

I think this is exactly how they should be taken. Rankings have been a starting point towards greater knowledge for me.


Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #173 on: December 30, 2012, 11:39:52 AM »
I don't think it is possible to "buy" your way into the top ten in GD. You might be able to spend money to create resistance to scoring and aesthetics, but in reality the top 12-13 courses have changed very little since I have been subscribing to GD, and they look remarkably similar to those of GOLF, LINKS, and Golfweek (at least their classic courses). GD does a decent job at the top, but that just goes back to an earlier point I made: once you weed out the truly great courses, there are probably 200+ courses that are really pretty close.  I think the fact that GD tries to come across as scientific is what gets everybody so riled up about it.

Ronald Montesano

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #174 on: December 30, 2012, 04:38:28 PM »
If only they measured midi-chlorians, we'd know for certain...
Coming in 2024
~Elmira Country Club
~Soaring Eagles
~Bonavista
~Indian Hills
~Maybe some more!!
