
Kevin Lynch

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #125 on: December 29, 2012, 05:42:15 PM »

Quote
But, the first problem is the very idea that there is a "truly accurate rating" of such a subjective venture. What makes the best courses the best is a truly subjective thing. Everyone's got their own opinion, and the only reason one person's opinion is more valuable than another's is if you personally agree with their viewpoint. Having hundreds of anonymous raters write down a bunch of numbers does not make the process objective. Garbage in, garbage out.

Sorry, poor choice of words.  I meant "statistically" accurate, as an attempt to compensate for potential data bias or improper application of a "standardized" scale. 

However, to your larger point, I'm not attempting to give these quantitative rankings any more value than a neat theoretical exercise to be taken with many grains of salt.  As you indicated, ratings mean little without qualitative feedback, so I can determine what the raters value (and if it aligns with my priorities).

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #126 on: December 29, 2012, 05:44:38 PM »
Quote from: Andy Troeger
On the topic of statistical analysis, the Golf Digest criteria indicate that "outlier" scores are thrown out before the list is published. It's also interesting that Dean Knuth does their analysis, as he's often credited with the creation of the slope/rating system.

Quote from: Kevin Lynch
Thanks for the information. I may try to contact Dean through his "Pope of Slope" website.

I doubt he would release data, but perhaps he'd be interested in a similar analysis (I can dream).

Kevin,
Unless I misunderstand the process, the information is not Dean's to release. He processes it for the various internal procedures that I won't discuss here, but the magazine is very specific with panelists that our ballots are secret. I have no access to any data other than my own ratings and what is released in the magazine issue itself--I can't imagine they would release it to anyone else.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #127 on: December 29, 2012, 05:55:01 PM »
Resistance to scoring is rated by panelists instead of using rating/slope because there is supposed to be a "fairness" component and not just pure difficulty. There are courses like Pine Valley and Oakmont that are fun challenges, and there are others that are penal slogs, and we do have the ability to distinguish between the two--obviously that's not the technical concept, but hopefully the idea is there. Rich Harvest has one hole that becomes difficult because of a tree in the middle of the fairway blocking most of the route to the hole--that's not a positive for resistance to scoring in my eyes, even if it raises the difficulty of the hole.

Personally, I wish playability was still considered, because I think it balances resistance to scoring. But I don't get a vote.

Kevin Lynch

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #128 on: December 29, 2012, 06:17:59 PM »
Quote from: Kevin Lynch
Thanks for the information. I may try to contact Dean through his "Pope of Slope" website.

I doubt he would release data, but perhaps he'd be interested in a similar analysis (I can dream).

Quote from: Andy Troeger on December 29, 2012, 05:44:38 PM
Kevin,
Unless I misunderstand the process, the information is not Dean's to release. He processes it for the various internal procedures that I won't discuss here, but the magazine is very specific with panelists that our ballots are secret. I have no access to any data other than my own ratings and what is released in the magazine issue itself--I can't imagine they would release it to anyone else.

I assumed that the data would not be released by Dean (or anyone else). In a best-case scenario, I thought Dean might be enough of a numbers freak to be interested in performing his own kind of volatility analysis to see which courses bring out the widest range of opinions (or he may have done it already). It might even make an interesting sidebar or listing for the magazine.

But I realize it's the longest of longshots.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #129 on: December 29, 2012, 06:22:15 PM »
I expect Dean knows which courses have the greatest volatility--he calculates the standard deviation for each course at minimum to determine the outliers. Whether he could or would talk about the results, I don't know.
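For the curious, here is a minimal sketch of what that screening step might look like. It assumes, purely hypothetically, that a ballot is discarded when it falls more than two standard deviations from a course's mean score; Golf Digest's actual threshold and procedure are not public, and the ballots below are fabricated.

```python
import statistics

def volatility_and_trim(scores, k=2.0):
    """Return a course's score volatility (sample std dev) and the
    ballots that survive a k-standard-deviation outlier cut."""
    mu = statistics.mean(scores)
    sigma = statistics.stdev(scores)
    kept = [s for s in scores if abs(s - mu) <= k * sigma]
    return sigma, kept

ballots = [8.1, 7.9, 8.4, 8.0, 3.5, 8.2]   # fabricated ballots for one course
sigma, kept = volatility_and_trim(ballots)
print(round(sigma, 2), kept)                # the 3.5 ballot is cut as an outlier
```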

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #130 on: December 29, 2012, 06:44:08 PM »
Quote from: Andy Troeger on December 29, 2012, 06:22:15 PM
Whether he could or would talk about the results, I don't know.

Oh, he'll talk alright.  Kevin is VERY persuasive.

 ;)
Sportsman/Adventure loving golfer.

Mark Steffey

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #131 on: December 29, 2012, 06:46:22 PM »
1) Using existing course slope/ratings would be subjective as well. These ratings are conducted by the individual state associations, and while they're supposed to follow guidelines, the results are questionable. Just using myself as an example: I moved from a course that had marked hazards on every hole and many forced carries to a course that has only one pond to carry and a couple of creeks that are for drainage only and are usually dry and playable.

The old club, with all the hazards and such, was 6,667 yards from the tips and rated 0.8 over par with a slope of 136.
My new club, with one hazard and the ability to play from two fairways over, was rated 3.5 over par with a 135 slope from only 6,588 yards at the tips.

We had it re-rated a couple of years ago after some O.B. was removed, and it is now 2.9 over par.

There's really no explanation for how this could be (and my handicap did fall from a range of 9-11 to 4-6 :( ) -- see the differential arithmetic after this post.

2) The "scratch" golfer used for resistance to scoring is not a professional, either. In my example above, if I shoot 3 over at home, I am scratch. I recall one of the magazines giving an example of what Tiger's GHIN would be, and it was something like +8.3 for the months they covered.

3) There is some static in the numbers involved, too. I don't think a new rater is going to call up ANGC and get a tee time, so the numbers that Pine Valley and ANGC have are not changing, except for limited additions as established individuals who already have access to these places get to see changes and adjust their ratings forthwith. Given this, a new rater will of course base his call on a new course he plays on his own experience, but he'll also realize that if he calls Alotian a "10," he may think, "Can I justify this as being better than PV?" "Is this really possible?" "Maybe I'll just call it an 8." So peer/behavioral aspects come into play too, and will keep some things the same or trending in the same direction.

4) Some courses do in fact "not care" about their ratings. I played a Rhode Island course late this year; my guest and I were the last to leave and spent some time with the GM. I asked him point blank, "How do you stay under the radar?" He laughed and asked me to expand on my question. They are not on the state's top 10 list. He said that he'll get calls from raters and magazines wanting to see the course and do stories; he asked the committee members if they have any interest and was told "no." He shared, though, that it did get them on a list of "snobbiest clubs" :)
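On the handicap drop in point 1, the standard USGA differential formula offers one plausible arithmetic account: the same gross score produces a much lower differential at the new club because its course rating is so much higher. A minimal sketch, assuming par 72 at both clubs and a hypothetical gross score of 82:

```python
def differential(score, rating, slope):
    """USGA handicap differential: (score - rating) * 113 / slope."""
    return (score - rating) * 113 / slope

par, score = 72, 82                                   # hypothetical round
print(round(differential(score, par + 0.8, 136), 1))  # old club: ~7.6
print(round(differential(score, par + 3.5, 135), 1))  # new club: ~5.4
```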

Terry Lavin

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #132 on: December 29, 2012, 07:53:18 PM »
Quote from: Terry Lavin
Golfweek's list is pretty much the poster child for herd mentality.

Quote
Could you expand on that? I think I understand what you are saying, but I'm not 100% sure.

Brad Klein does a good job of selecting and grooming raters who wind up essentially parroting his likes and dislikes in course architecture. I've heard many anecdotal reports of interventions to keep the rankings fairly consistent. In speaking with raters, the groupthink is as obvious as it can be. Combine the above with rater trips and personal access to great courses if you maintain your status as a GW rater and you have a neat formula to make sure that everybody winds up having a similar walk in the park. And lest one think this is shrill criticism of Brad or GW, let me assure you that I think GW has the list that resonates best with my tastes and I consider Brad a good friend. I just don't want to be one of his raters.
« Last Edit: December 29, 2012, 08:12:03 PM by Terry Lavin »
Nobody ever went broke underestimating the intelligence of the American people.  H.L. Mencken

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #133 on: December 29, 2012, 08:02:43 PM »
Quote
Can anyone explain the disconnect between the State lists and the Top 200 rankings?

Quote from: Jim Colton
If you use the state rankings as a guide, there are at least 345 courses that have a higher score than Seven Canyons, the 200th ranked course on their list. There are a bunch of private clubs that would be in the second 100 if they had the 45 votes. There are 18 or 19 public courses in the first 100, and around 59 in the second 100. That just doesn't pass the sniff test.

Jim, can you expound on this? Do you think, as Mark suggests, that the numbers are fudged?


I am wondering if requiring ballots from 45 raters for each course is too many. Thoughts?
Sportsman/Adventure loving golfer.

Jim Colton

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #134 on: December 29, 2012, 08:22:03 PM »
Not fudged. Just look at the state-by-state rankings and use some logic. By the old transitive property, you can isolate about 145 courses that must have higher scores than the 200th ranked course: they appear in the state-by-state rankings (10-ballot minimum) but not in the Top 200 (45-ballot minimum). The actual number is probably much larger, but you can only deduce 345 for certain (the 200 published courses plus these 145).

Andy and I were speculating that the population of courses on the ballot with at least 45 votes is probably somewhere between 325 and 400, so they are simply publishing the top 200 of that subset.
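For anyone who wants to replicate the deduction, here is a minimal sketch of the logic, assuming both the state lists and the national list are ordered by the same underlying score. The courses and list contents below are hypothetical placeholders, not the actual rankings:

```python
def hidden_high_scorers(state_lists, national_200):
    """Courses absent from the national Top 200 (too few ballots) that must
    still outscore the 200th-ranked course, because they sit above some
    nationally ranked course on their own state list."""
    national = set(national_200)
    hidden = set()
    for ranked in state_lists.values():          # each list is best-to-worst
        for i, course in enumerate(ranked):
            if course in national:
                continue
            # A nationally ranked course below this one on the state list
            # has a lower score, yet still cleared the Top 200 cut-off.
            if any(c in national for c in ranked[i + 1:]):
                hidden.add(course)
    return hidden

state_lists = {"Ohio": ["Camargo", "Muirfield Village", "Inverness"]}  # hypothetical order
national_200 = ["Muirfield Village", "Inverness"]
print(hidden_high_scorers(state_lists, national_200))  # {'Camargo'}
```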

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #135 on: December 29, 2012, 08:29:20 PM »
Quote from: Jim Colton on December 29, 2012, 08:22:03 PM

If you were the GD czar would you lower that 45 ballot requirement?
Sportsman/Adventure loving golfer.

Tom_Doak

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #136 on: December 29, 2012, 08:31:18 PM »
Quote from: Andy Troeger on December 29, 2012, 05:55:01 PM
Personally, I wish playability was still considered, because I think it balances resistance to scoring. But I don't get a vote.

Playability has NEVER been one of the criteria for the GOLF DIGEST 100 Greatest list.

They did take ratings on it and use them in judging the 100 Greatest Public Courses, but never for the main ranking.

Mark Steffey

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #137 on: December 29, 2012, 08:42:09 PM »
Quote from: Mac Plumart on December 29, 2012, 08:29:20 PM
If you were the GD czar would you lower that 45 ballot requirement?

Isn't that based upon the total number of raters? Be it 450 (10%) or some other number chosen to get breadth?

Tom_Doak

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #138 on: December 29, 2012, 08:45:29 PM »
Quote from: Mac Plumart on December 29, 2012, 08:29:20 PM
If you were the GD czar would you lower that 45 ballot requirement?

I guarantee you they are looking at the results with a smaller ballot requirement, and deciding that while some worthy courses would be included, there are too many other courses that would crash the party and cause the list to be questioned even harder than it is now.  That's what editors do.

I am sure that Jim C. could produce a list of some of these easily, based on where courses were listed on the state rankings vs. the top 100 list vs. the top 200 list; the only thing he wouldn't have are the raw numbers to place the courses exactly within the top 100.
« Last Edit: December 29, 2012, 08:48:42 PM by Tom_Doak »

Jim Colton

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #139 on: December 29, 2012, 08:46:54 PM »
Quote from: Mac Plumart on December 29, 2012, 08:29:20 PM
If you were the GD czar would you lower that 45 ballot requirement?

Mac,

 If I were the GD czar, changing the 45 cut-off would be well down the list of priorities. Golf Magazine and Links do okay with probably far fewer visits. Other than Camargo, Yeamans and a couple of others, the list gets most of the worthy courses. Calling your list "America's 100 Greatest" and not having them on it kinda undermines the list, though maybe not as much as some of the courses that are on it! If they continue publishing the second 100, hopefully they will press their panelists to get out to the second-tier courses so the list is as robust as it can be.

 The article talks about Augusta having 52 ballots, which I guess could put it at risk of falling below 45 as some older ballots roll off. I doubt they would allow it to just disappear like Camargo -- I imagine there would be some editorializing if Augusta ever fell below the minimum.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #140 on: December 29, 2012, 08:53:04 PM »
Quote from: Tom_Doak on December 29, 2012, 08:31:18 PM
Playability has NEVER been one of the criteria for the GOLF DIGEST 100 Greatest list.

I didn't realize that, but the data is still being collected for playability. So there's still a chance!

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #141 on: December 29, 2012, 08:53:35 PM »
Jim...would you share a few things that would be at the top of your change priority list?
Sportsman/Adventure loving golfer.

Jim Colton

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #142 on: December 29, 2012, 09:07:56 PM »
Mac,

 I've been more than happy to share my thoughts with Ron and the powers that be.

 The Golf Digest list is dramatically different from the other publications'. One thing that has never been clear to me is how much of that is driven by the system (the formula) vs. the relative make-up of the panel. It'd be fine if the list were a fair representation of the collective view of the panel, but I think the formula is too rigid to account for that.

 The numbers cruncher in me thinks it would be relatively easy to figure out what the "right" weights should be for each category: just ask the panelists to submit an overall score along with the category submissions, run a regression, and see what shakes out. Golfweek could do this too with their mountain of data (it surprises me that they never do much with this information).
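A minimal sketch of that regression, with fabricated ballots and category names loosely modeled on the published criteria; the fitted coefficients play the role of the panel's implicit weights:

```python
import numpy as np

categories = ["Shot Values", "Resistance to Scoring", "Design Variety",
              "Memorability", "Aesthetics", "Conditioning", "Ambience"]

rng = np.random.default_rng(0)
ballots = rng.uniform(5, 10, size=(500, len(categories)))       # category scores
true_w = np.array([0.30, 0.10, 0.15, 0.15, 0.10, 0.10, 0.10])   # pretend weights
overall = ballots @ true_w + rng.normal(0, 0.3, size=500)       # panelists' overall scores

# A least-squares fit of overall ~ category scores recovers the weights.
w, *_ = np.linalg.lstsq(ballots, overall, rcond=None)
for name, weight in zip(categories, w):
    print(f"{name:22s} {weight:+.2f}")
```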

Tom, Andy sent me the list of the 145 (he had 141). I'll let him post it if he wants. I have the scores but didn't isolate the ones that didn't show up. Plus, only the top 20 has been publicly available, so I'd rather not go "Satterfield" yet :)

« Last Edit: December 29, 2012, 09:10:56 PM by Jim Colton »

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #143 on: December 29, 2012, 09:10:28 PM »
Thanks, Jim.
Sportsman/Adventure loving golfer.

Andy Troeger

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #144 on: December 29, 2012, 09:18:41 PM »

Quote from: Jim Colton on December 29, 2012, 09:07:56 PM
Tom, Andy sent me the list of the 145 (he had 141). I'll let him post it if he wants. I have the scores but didn't isolate the ones that didn't show up. Plus, only the top 20 has been publicly available, so I'd rather not go "Satterfield" yet :)

I'm not going to post it, but following the exercise is easy enough for anyone who has the entire list and cares enough to compare the second 100 to the state lists. It was not a scientific exercise by any means.

Mark Bourgeois

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #145 on: December 29, 2012, 10:14:29 PM »
Quote from: Jim Colton on December 29, 2012, 09:07:56 PM



It might be more interesting to turn it around. "Force" the raters to select the higher of two courses in a long series of one-to-one comparisons. After that exercise have them associate attributes with specific courses, qualitatively not quantitatively. These should be a mix of architectural and non-architectural attributes: the latter should include everything from clubhouse food to their playing companions. (They don't need to be given a choice from every attribute on each course.) Run them through the head-to-heads and attribute linking quickly; don't give them enough time to think about any of this too much.

After collecting the results, next infer the "ideal" attributes.

Instead of asking people to score on the attributes and the overall -- i.e., what raters are supposed to rate -- this approach should get at what raters really rate. The raters would rate the courses in simple fashion, then afterwards ascribe attributes; the analysis would show which attributes matter and how much.

The results likely would be quite illuminating. As a bonus, non-architectural attributes at last could be screened from the final result. I haven't thought it through but I think this approach would not be troubled, at least not nearly to the same degree as the current approach, by small sample sizes. The analysis of attributes is not constrained by a need for a minimum number of scores across the courses.
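A minimal sketch of the two-step idea: fit a Bradley-Terry-style strength to each course from the head-to-head picks, then regress those strengths on the qualitative attribute tags to see which attributes actually drive the picks. Courses, picks, and attributes here are all hypothetical.

```python
import numpy as np

courses = ["A", "B", "C"]
idx = {c: i for i, c in enumerate(courses)}
picks = [("A", "B"), ("B", "C"), ("A", "C"),   # (winner, loser) head-to-heads
         ("C", "A"), ("A", "B"), ("B", "C")]

# Step 1: Bradley-Terry log-strengths via gradient ascent on the likelihood.
s = np.zeros(len(courses))
for _ in range(500):
    grad = np.zeros_like(s)
    for win, lose in picks:
        p_win = 1.0 / (1.0 + np.exp(s[idx[lose]] - s[idx[win]]))
        grad[idx[win]] += 1.0 - p_win
        grad[idx[lose]] -= 1.0 - p_win
    s += 0.1 * grad
s -= s.mean()                                  # fix the arbitrary zero point

# Step 2: regress strengths on 0/1 attribute tags ascribed after the picks.
attrs = np.array([[1, 1, 0],                   # A: width, short par 4s
                  [1, 0, 0],                   # B: width
                  [0, 0, 1]])                  # C: views
w_attr, *_ = np.linalg.lstsq(attrs, s, rcond=None)
print(dict(zip(["width", "short par 4s", "views"], w_attr.round(2))))
```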

Speaking of which, and riffing on your post, another thought: small sample sizes bedevil these rankings. GD tries to solve this problem by using stringent criteria and by "grooming" their pool of raters. That way they don't need every rater to see every course; their hope is that raters rate homogeneously, like interchangeable automatons.

As already discussed, that's not actually how things happen. A workaround for some people on this board is to single out an individual whose opinion they agree with / respect and use their ratings / recommendations. But GD has this huge pool of raters: surely some of them, if freed from the narrow constraints of formulae, would share the individual reader's preferences.

So why not present the views of rater subsets that match the views of the individual reader? The data and analysis discussed above could feed into a collaborative filtering mechanism to do just that.

This would produce the world's first truly bespoke rankings system. It could even be customized to the situation; e.g., "greatest" courses to play alone.
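And a minimal sketch of the bespoke, collaborative-filtering step: find the panelists whose past scores best match an individual reader's, then rank courses using only those like-minded panelists. The matrix is dense and tiny for illustration; a real panel's matrix would be sparse, and cosine similarity is just one of many possible choices.

```python
import numpy as np

rater_scores = np.array([      # rows: panelists, cols: courses, 0 = not played
    [9.0, 7.0, 7.0, 8.0],
    [6.0, 9.0, 8.0, 0.0],
    [9.0, 6.0, 0.0, 9.0],
])
reader = np.array([10.0, 6.0, 0.0, 9.0])       # the reader's own scores

def cosine(a, b, mask):
    """Cosine similarity over the courses both parties have scored."""
    a, b = a[mask], b[mask]
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

sims = np.array([cosine(r, reader, (r > 0) & (reader > 0))
                 for r in rater_scores])
top = sims.argsort()[::-1][:2]                 # the two most like-minded raters

bespoke = rater_scores[top].copy()
bespoke[bespoke == 0] = np.nan                 # ignore courses they haven't played
print(np.nanmean(bespoke, axis=0))             # the reader's personalized scores
```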

Okay, ramble over.
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

Mac Plumart

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #146 on: December 29, 2012, 10:32:58 PM »
Quote from: Mark Bourgeois on December 29, 2012, 10:14:29 PM

Uh, yeah...I think I'll hire Jim to run the Top 100 list. 

 8)
Sportsman/Adventure loving golfer.

Mark Bourgeois

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #147 on: December 29, 2012, 10:35:30 PM »
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

Kevin Lynch

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #148 on: December 29, 2012, 11:30:50 PM »

Quote from: Mark Bourgeois on December 29, 2012, 10:14:29 PM
It might be more interesting to turn it around. "Force" the raters to select the higher of two courses in a long series of one-to-one comparisons. After that exercise have them associate attributes with specific courses, qualitatively not quantitatively. These should be a mix of architectural and non-architectural attributes: the latter should include everything from clubhouse food to their playing companions. (They don't need to be given a choice from every attribute on each course.) Run them through the head-to-heads and attribute linking quickly; don't give them enough time to think about any of this too much.


Mark,

The "head-to-head" concept you mentioned reminded me of a methodology that Anthony Fowler proposed (and Jim Colton ran with) from about 2 years ago.

http://www.golfclubatlas.com/forum/index.php/topic,46888.0.html

I thought it was an improvement over the traditional methods.  I'd be interested to hear your thoughts on their method.

However, yours sounds a bit more complicated than anything I could have imagined.

Perhaps you & Jim can work together to solve course rankings once & for all.  I'd help, but I'll be busy solving the Israel / Palestine conflict next week. :)

Ronald Montesano

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #149 on: December 30, 2012, 06:35:42 AM »
Problems (perhaps ultimately unavoidable) along the way (quoted from the thread Kevin cites above):

"if the panel were smaller and better traveled, having seen the breadth of courses to which they can compare a Black Rock to Seminole, a LACC North to Yeamans Hall, a Shinnecock to a Eugene CC, etc." Giving up power is a hobbit thing.

"When two courses have a similar score in my mind, then you break the tie via personal preferences." Has it come to this, that the ties of life must be avoided in a ranking? Not having ties sounds (in most cases) like certain professional sports. In many cases, courses that would tie are disparate, raising a worthwhile philosophical debate/meditation.

"Frankly, too many raters are simply homers for their own 'neck of the woods.'" I suspect that there are people who have the time, money and access to move frequently beyond their neck of the woods to do this type of thing. Ultimately, it should be a diversion/distraction and not the stuff of life for us, our equivalent of the Swimsuit Issue. In the end, all the suits are expensive and attractive and the models are hot. The problem arises when courses themselves influence and skew the ratings via unscrupulous means. I don't know how frequent such transgressions are, nor how effective.

My therapist suggests that I'm simply jealous of wealthy people with time on their hands and that I'm lashing out like a frustrated child. I don't disagree.
Coming in 2024
~Elmira Country Club
~Soaring Eagles
~Bonavista
~Indian Hills
~Maybe some more!!
