
Jonathan Cummings

  • Total Karma: -4
Re: Golf Digest Best New
« Reply #50 on: January 26, 2025, 09:15:42 AM »
For whatever reason, the owner of White Oak (which gets next to no play) wanted to expose his course to the rating panels. So he instructed the manager to determine the minimum number of votes required to qualify for the various lists, give that many raters access, and then re-shut the door. Can't speak to Digest, but my understanding is GW was given a month to allow 8 foursomes (no more than one a day!) to play White Oak. Have no idea about the Magazine panel access.

Derek_Duncan

  • Total Karma: -25
Re: Golf Digest Best New
« Reply #51 on: January 26, 2025, 11:34:26 AM »
JC Jones,


What is your deal? The only arrangement that any golf course or club has with Golf Digest, and I presume other publications, is they allow panelists to play the course. Many of the clubs charge their standard guest fee to do so. The panelists then file their score, and at the end of the process, whether it's for Best New or Best in State or 100 Greatest, our statistician tabulates the numbers and gives us back the results. That's it. Why is that so difficult for you to understand?
www.feedtheball.com -- a podcast about golf architecture and design
@feedtheball

JC Jones

  • Total Karma: 15
Re: Golf Digest Best New
« Reply #52 on: January 27, 2025, 11:25:26 AM »
JC Jones,


What is your deal? The only arrangement that any golf course or club has with Golf Digest, and I presume other publications, is they allow panelists to play the course. Many of the clubs charge their standard guest fee to do so. The panelists then file their score, and at the end of the process, whether it's for Best New or Best in State or 100 Greatest, our statistician tabulates the numbers and gives us back the results. That's it. Why is that so difficult for you to understand?



"The only arrangement"?  So, what do you call the 48 person non-arrangement at Cabot Citrus Farms in March where you are speaking to panelists? Did Golf Digest book all of the tee times on the website without "arranging" anything with the resort?  Why continue to pretend these things don't exist?


I understand Golf Digest's methodology entirely. I am well aware of the system and process that your "statistician," the Pope of the Slope, has implemented over the last 10+ years. The pivot in ~2013 created a system that perpetuates groupthink vis-à-vis the Scorecard; the ballooning of the panel to 2,000 people to generate $2mm in "entry fees" and $600k+ per year in revenue has turned the rankings into a joke; and the "awards" are easily gamed by clubs that know they only need to find 15 panelists out of 2,000 who are enamored with exclusive access, free rounds and other perks to win "Best New." Moreover, the flood of these panelists who have no idea what they are doing, other than paying their way into an annual "free golf" punch card, has caused most of these clubs to just shut down access altogether because they don't want to participate in the Golf Digest charade.


The Golf Digest rankings are broken.  Which is sad for me, because I always viewed them as the gold standard for US100, Best Public and Best in State rankings.
I get it, you are mad at the world because you are an adult caddie and few people take you seriously.

Excellent spellers usually lack any vision or common sense.

I know plenty of courses that are in the red, and they are killing it.

Jake McCarty

  • Total Karma: 2
Re: Golf Digest Best New
« Reply #53 on: January 27, 2025, 02:48:05 PM »
JC Jones,


What is your deal? The only arrangement that any golf course or club has with Golf Digest, and I presume other publications, is they allow panelists to play the course. Many of the clubs charge their standard guest fee to do so. The panelists then file their score, and at the end of the process, whether it's for Best New or Best in State or 100 Greatest, our statistician tabulates the numbers and gives us back the results. That's it. Why is that so difficult for you to understand?



"The only arrangement"?  So, what do you call the 48 person non-arrangement at Cabot Citrus Farms in March where you are speaking to panelists? Did Golf Digest book all of the tee times on the website without "arranging" anything with the resort?  Why continue to pretend these things don't exist?


I understand Golf Digest's methodology entirely. I am well aware of the system and process that your "statistician," the Pope of the Slope, has implemented over the last 10+ years. The pivot in ~2013 created a system that perpetuates groupthink vis-à-vis the Scorecard; the ballooning of the panel to 2,000 people to generate $2mm in "entry fees" and $600k+ per year in revenue has turned the rankings into a joke; and the "awards" are easily gamed by clubs that know they only need to find 15 panelists out of 2,000 who are enamored with exclusive access, free rounds and other perks to win "Best New." Moreover, the flood of these panelists who have no idea what they are doing, other than paying their way into an annual "free golf" punch card, has caused most of these clubs to just shut down access altogether because they don't want to participate in the Golf Digest charade.


The Golf Digest rankings are broken.  Which is sad for me, because I always viewed them as the gold standard for US100, Best Public and Best in State rankings.

I know more than a few Golf Digest raters. Some are well-traveled and others haven't seen three top-100 courses, and they readily admit that they have no idea what they are doing but are in it for "free golf and boozy trips where they can bring friends" and "that they just give out random scores to match existing scores."

Over drinks, a friend who's a Golf Digest rater even told me that he (and several others he knew) frequently made up scores for courses he never visited. He also told me that he just gives similar scores "to fly below the radar" on everything.


He also said that he "knew of" more than a few Golf Digest raters "who knew absolutely nothing" and weren't "aware of the names Raynor and Fazio," and that "lots of randos" have been "turned onto" a "discount golf card" for "free golf."


Perhaps these stories are outliers? One can suspect why the head of the Golf Digest revenue program would disagree with the confessions of his panelists.
« Last Edit: January 27, 2025, 04:30:41 PM by Jake McCarty »

JC Jones

  • Total Karma: 15
Re: Golf Digest Best New
« Reply #54 on: January 27, 2025, 04:01:45 PM »
Hi Jake,


Exactly. This is what I was alluding to above. It is a product of both the "Scorecard" and the expansion of the panel to generate revenue for the magazine.


Edit: This is my thread from 5 years ago addressing these issues:


https://www.golfclubatlas.com/forum/index.php/topic,67772.0.html
« Last Edit: January 27, 2025, 04:17:36 PM by JC Jones »
I get it, you are mad at the world because you are an adult caddie and few people take you seriously.

Excellent spellers usually lack any vision or common sense.

I know plenty of courses that are in the red, and they are killing it.

Derek_Duncan

  • Total Karma: -25
Re: Golf Digest Best New
« Reply #55 on: January 27, 2025, 10:52:48 PM »
JC Jones,



You used to be a Digest panelist. I don't know the circumstances of your departure but it seems that you still carry emotional baggage. Thou doth protest too much.


Everyone is entitled to their opinion of the rankings--I'm not here to argue methodology or the way things are run. You have issues? Good for you. What I do care about is people spreading misinformation and saying things that aren't true.


You have claimed that White Oak and The Covey hosted large groups of panelists. Neither is true. You also insinuated that Golf Digest was working with The Covey in some kind of arrangement. Also unequivocally false. Here's what actually happened: a bunch of people visited the course. The sum of their scores was higher than the other courses'. End of story.


You've accused me of having the architect on my podcast, which I've had since 2017 and run independently of Golf Digest, as being part of the, I don't know, buyout. Tell me more about how that works, please. I'd love to know, because the money doesn't seem to be getting back to me.


You keep bringing up a panelist summit that hasn't happened yet to bolster some claim of alleged past collusion. That's some strange logic. If we have events in the future, it doesn't change the fact that we didn't have them in the past.


Again, I don't care what your beef is with GD. You want to explicate all the ways GD gets it wrong? Knock yourself out. But there are ways to be critical without advancing and perpetuating tired falsehoods.




Jake McCarty,


I don't suppose you would DM me the names of these panelists, would you? Your comments sound specious to me but I'd like the opportunity to investigate now that you've hung out the dirty laundry for everyone to read.
www.feedtheball.com -- a podcast about golf architecture and design
@feedtheball

Alex_Hunter

  • Total Karma: 3
Re: Golf Digest Best New
« Reply #56 on: January 28, 2025, 05:10:53 PM »
15 of 2,000 panellists seeing a course and having it rated Best New does not add up to statistical significance. Even if it were "over 25," that's garbage data collection. I'd argue it's a bit of a sham, in fact. If a course won't host at least 100 panelists out of 2,000, then don't count it OR make your panel smaller (this is really what GD should do). But I get it: it's a cash cow for the magazine, which is a very peculiar thing to me.
@agolfhunter

Jake McCarty

  • Total Karma: 2
Re: Golf Digest Best New
« Reply #57 on: January 28, 2025, 07:13:31 PM »
I know that a couple of my courses have banned Golf Digest raters because they perceived that the leadership was dishonest and didn't vet the panelists.
« Last Edit: January 28, 2025, 07:18:51 PM by Jake McCarty »

Sean_A

  • Total Karma: 2
Re: Golf Digest Best New
« Reply #58 on: Today at 02:52:07 AM »
15 of 2,000 panellists seeing a course and having it rated Best New does not add up to statistical significance. Even if it were "over 25," that's garbage data collection. I'd argue it's a bit of a sham, in fact. If a course won't host at least 100 panelists out of 2,000, then don't count it OR make your panel smaller (this is really what GD should do). But I get it: it's a cash cow for the magazine, which is a very peculiar thing to me.

If 15 visits is the minimum even for a panel of 15, it’s still 15 visits. The same number of possible data sets is gathered in either case. I don’t believe that 15 visits from the 15-member panel is inherently any better than from a 2000-member panel. All aspects of course ranking are subjective, so trying to use math to legitimise the subjectivity doesn’t somehow make the collective data objective. None of the data is reliable in any sense of trying to determine the best course. Somebody made that shit up.

Ciao
New plays planned for 2025: Machrihanish Dunes, Dunaverty and Carradale

Tom_Doak

  • Total Karma: 13
Re: Golf Digest Best New
« Reply #59 on: Today at 07:59:02 AM »

If 15 visits is the minimum even for a panel of 15, it’s still 15 visits. The same number of possible data sets is gathered in either case. I don’t believe that 15 visits from the 15-member panel is inherently any better than from a 2000-member panel. All aspects of course ranking are subjective, so trying to use math to legitimise the subjectivity doesn’t somehow make the collective data objective. None of the data is reliable in any sense of trying to determine the best course. Somebody made that shit up.



Sean:


Not positive I am understanding you correctly, but there is a big difference in my mind if the panel is larger and there's no overlap in the voters.  If you've got 15 people from TX rating one course and 15 people from NY rating another, it's very possible that their idea of what's a 7 or a 10 is different, so how can you compare those votes?  If you have 5 or 6 of the 15 people who overlap and saw both courses, then you have at least a bit of basis for comparison.


I suspect this is what GOLF DIGEST is trying to do, and why Derek and/or some other panelists would go and check out a surprise contender toward the end of the voting period.  That does corrupt the "blind sample" nature of the voting, but realistically none of the other rankings are a blind sample anymore, either -- everyone knows what the pecking order is supposed to be based on prior rankings.  The Best New voting was supposed to avoid that, but it really can't -- the panelists talk to each other, and inevitably some guys go to TX or SC to see a new course with the expectation to vote high on it, because their buddy on the panel told them it was great.
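
To put that calibration problem in toy form (a made-up simulation with invented numbers, not panel data; the 0.8-point regional offset and the five overlapping raters are assumptions for illustration only):

```python
# Two equally good courses, each seen by a disjoint group of raters whose personal
# scoring scales differ. Raw averages make them look far apart; a few raters who
# saw both courses give a basis for an apples-to-apples comparison.
import numpy as np

rng = np.random.default_rng(1)
TRUE_QUALITY = 7.5                 # both courses identical by construction
TX_BIAS, NY_BIAS = +0.8, -0.8      # assumed regional "generosity" offsets

tx_scores = TRUE_QUALITY + TX_BIAS + rng.normal(0, 0.3, size=15)   # course 1, TX raters only
ny_scores = TRUE_QUALITY + NY_BIAS + rng.normal(0, 0.3, size=15)   # course 2, NY raters only
print("raw averages:", round(tx_scores.mean(), 2), "vs", round(ny_scores.mean(), 2))
# -> roughly 8.3 vs 6.7 for two courses that are, by construction, the same.

# Five hypothetical raters who played both courses can estimate the real gap.
both_1 = TRUE_QUALITY + rng.normal(0, 0.3, size=5)
both_2 = TRUE_QUALITY + rng.normal(0, 0.3, size=5)
print("overlap raters' gap:", round(both_1.mean() - both_2.mean(), 2))   # ~0
```

The overlapping raters' own generosity cancels out when you look at the difference between their two scores, which is why even a small overlap anchors the comparison in a way two disjoint groups of 15 cannot.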

Ben Hollerbach

  • Total Karma: 1
Re: Golf Digest Best New
« Reply #60 on: Today at 09:45:40 AM »
Wouldn't the solution to this problem just be a moratorium on brand-new courses being evaluated for these types of ratings?

I know we're talking about "best new," but for a product that is intended to last decades to centuries, wouldn't it still be new after 3-5 years?

Similar to sports hall-of-fame criteria that make a player eligible only a set number of years after their career ends, shouldn't a course be open for some time before being eligible for rating, allowing a fuller evaluation of its qualities rather than a narrower day-one perception?

Jimmy Muratt

  • Total Karma: 0
Re: Golf Digest Best New
« Reply #61 on: Today at 12:04:43 PM »
Wouldn't the solution to this problem just be a moratorium on brand-new courses being evaluated for these types of ratings?

I know we're talking about "best new," but for a product that is intended to last decades to centuries, wouldn't it still be new after 3-5 years?

Similar to sports hall-of-fame criteria that make a player eligible only a set number of years after their career ends, shouldn't a course be open for some time before being eligible for rating, allowing a fuller evaluation of its qualities rather than a narrower day-one perception?


A moratorium would certainly help if the true goal were to objectively compare and rank new courses. Unfortunately, the true motives behind most of the "rankings" or "best new" lists tend to be more business-related than architecture-driven.


Tom makes a good point in his recent post. The only way for a ranking to have any credibility is if the raters have seen many of the same courses. I live in Virginia, and the quality of our golf courses is nowhere near that of states like New York, Pennsylvania, Massachusetts, California, etc. My idea of a course that I scored an 8 or 9 would be seriously skewed if my golf were limited to mostly Virginia.

Blake Conant

  • Total Karma: 3
Re: Golf Digest Best New
« Reply #62 on: Today at 12:12:01 PM »
Derek, did any of the 25 people who saw The Covey see any of the other runners-up?

Paul Jones

  • Total Karma: 7
Re: Golf Digest Best New
« Reply #63 on: Today at 12:19:29 PM »

If 15 visits is the minimum even for a panel of 15, it’s still 15 visits. The same number of possible data sets is gathered in either case. I don’t believe that 15 visits from the 15-member panel is inherently any better than from a 2000-member panel. All aspects of course ranking are subjective, so trying to use math to legitimise the subjectivity doesn’t somehow make the collective data objective. None of the data is reliable in any sense of trying to determine the best course. Somebody made that shit up.



Sean:


Not positive I am understanding you correctly, but there is a big difference in my mind if the panel is larger and there's no overlap in the voters.  If you've got 15 people from TX rating one course and 15 people from NY rating another, it's very possible that their idea of what's a 7 or a 10 is different, so how can you compare those votes?  If you have 5 or 6 of the 15 people who overlap and saw both courses, then you have at least a bit of basis for comparison.


I suspect this is what GOLF DIGEST is trying to do, and why Derek and/or some other panelists would go and check out a surprise contender toward the end of the voting period.  That does corrupt the "blind sample" nature of the voting, but realistically none of the other rankings are a blind sample anymore, either -- everyone knows what the pecking order is supposed to be based on prior rankings.  The Best New voting was supposed to avoid that, but it really can't -- the panelists talk to each other, and inevitably some guys go to TX or SC to see a new course with the expectation to vote high on it, because their buddy on the panel told them it was great.


In a "better" world, the same 15 people would play all courses up for Best New. That would set a baseline, but it's pretty hard to accomplish.
Paul Jones
pauljones@live.com

Sean_A

  • Total Karma: 2
Re: Golf Digest Best New
« Reply #64 on: Today at 12:36:14 PM »

If 15 visits is the minimum even for a panel of 15, it’s still 15 visits. The same number of possible data sets is gathered in either case. I don’t believe that 15 visits from the 15-member panel is inherently any better than from a 2000-member panel. All aspects of course ranking are subjective, so trying to use math to legitimise the subjectivity doesn’t somehow make the collective data objective. None of the data is reliable in any sense of trying to determine the best course. Somebody made that shit up.



Sean:


Not positive I am understanding you correctly, but there is a big difference in my mind if the panel is larger and there's no overlap in the voters.  If you've got 15 people from TX rating one course and 15 people from NY rating another, it's very possible that their idea of what's a 7 or a 10 is different, so how can you compare those votes?  If you have 5 or 6 of the 15 people who overlap and saw both courses, then you have at least a bit of basis for comparison.


I suspect this is what GOLF DIGEST is trying to do, and why Derek and/or some other panelists would go and check out a surprise contender toward the end of the voting period.  That does corrupt the "blind sample" nature of the voting, but realistically none of the other rankings are a blind sample anymore, either -- everyone knows what the pecking order is supposed to be based on prior rankings.  The Best New voting was supposed to avoid that, but it really can't -- the panelists talk to each other, and inevitably some guys go to TX or SC to see a new course with the expectation to vote high on it, because their buddy on the panel told them it was great.


Of course, if a panel is packed with too many folks of any certain persuasion, origin, quality of play, etc., then sure, it can make a big difference to the results. What I am saying is, assuming no shenanigans or stupidity, 15 panelists is 15 sets of data no matter the size of the panel. 15 of 50 or 15 of 2000 yields the same amount of data. So it isn’t reasonable to suggest that the Golf Digest 15-panelist minimum is inherently any less valid than the same minimum from another magazine. Bottom line, a lot of statistical analysis concerning rankings is hogwash because the criteria are subjective, the interpretation of the criteria is subjective, and the execution of a panelist vote is subjective. I take any magazine ranking with a shovel full of salt. So far as I can tell, each magazine seems to think their system is the best. I am not convinced there is a best or worst. Why? Because all I would be doing is tagging the magazine ranking which I agree with most... which is meaningless so far as I am concerned. I have been saying it for years... let’s talk about favourite courses and why. Let’s just admit that there is no proper way to measure best courses.
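
For what it's worth, a quick toy check of that point (invented numbers, and assuming the 15 visitors are a random draw from the panel, which is of course the part people dispute):

```python
# The spread of an average of 15 scores depends on the 15 scores themselves,
# not on whether they were drawn from a pool of 50 raters or 2,000.
import numpy as np

rng = np.random.default_rng(2)

def spread_of_average(panel_size, visits=15, trials=5000):
    """Std. dev. of the mean of `visits` scores drawn at random from a panel."""
    panel = 7.0 + rng.normal(0, 1.0, size=panel_size)   # each rater's personal tendency
    means = [rng.choice(panel, size=visits, replace=False).mean() for _ in range(trials)]
    return float(np.std(means))

print("panel of 50:  ", round(spread_of_average(50), 3))
print("panel of 2000:", round(spread_of_average(2000), 3))
# Both land in the same ballpark (~0.2-0.26, roughly 1/sqrt(15)):
# the noise is governed by the 15 visits, not by the size of the panel.
```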


Ciao
New plays planned for 2025: Machrihanish Dunes, Dunaverty and Carradale

Steve Abt

  • Total Karma: 0
Re: Golf Digest Best New
« Reply #65 on: Today at 03:32:18 PM »
The various ratings aren’t and cannot actually be naming the best courses. Instead, they (purport to) name the best courses according to that publication, as assessed by its raters and criteria. The problem with using 15 out of 2,000 raters is that it’s not even close to a representative sample of GD raters, so they can’t even plausibly claim the resulting ratings have meaning, even when defined solely as GD ratings.


If you have 15 raters on a panel and they all rate every course, the results would accurately reflect that publication’s ratings. With 15 out of 2,000, they do not.

Matt Schoolfield

  • Total Karma: -29
Re: Golf Digest Best New
« Reply #66 on: Today at 03:51:07 PM »
The various ratings aren’t and cannot actually be naming the best courses. Instead, they (purport to) name the best courses according to that publication, as assessed by its raters and criteria. The problem with using 15 out of 2,000 raters is that it’s not even close to a representative sample of GD raters, so they can’t even plausibly claim the resulting ratings have meaning, even when defined solely as GD ratings.


If you have 15 raters on a panel and they all rate every course, the results would accurately reflect that publication’s ratings. With 15 out of 2,000, they do not.

If the organization behind the rating has a system based on aggregated opinion (which most pretend to), then, theoretically, you could write a collaborative-filtering algorithm to account for this issue. Assuming there is some objective goodness and that raters rate along that scale of goodness pretty consistently (both huge assumptions, and philosophically problematic), it should work very well, especially if you could use all of their past ratings.
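
A minimal sketch of what that could look like (purely illustrative, assuming only a sparse table of (rater, course, score) observations; this is not anything Golf Digest or any other publication is known to run):

```python
# Bias-plus-latent-factor model (Funk-SVD style): learn a global mean, a per-rater
# generosity term, a per-course quality term, and small taste/style factors, so a
# course isn't rewarded merely because its handful of visitors happened to score high.
import numpy as np

def fit_ratings(observations, n_raters, n_courses, n_factors=2,
                lr=0.01, reg=0.1, epochs=300, seed=0):
    """observations: iterable of (rater_id, course_id, score)."""
    rng = np.random.default_rng(seed)
    obs = list(observations)
    mu = np.mean([s for _, _, s in obs])                     # global average score
    b_rater = np.zeros(n_raters)                             # generous vs. harsh raters
    b_course = np.zeros(n_courses)                           # bias-adjusted course quality
    P = rng.normal(scale=0.1, size=(n_raters, n_factors))    # rater taste factors
    Q = rng.normal(scale=0.1, size=(n_courses, n_factors))   # course style factors

    for _ in range(epochs):
        for r, c, s in obs:
            err = s - (mu + b_rater[r] + b_course[c] + P[r] @ Q[c])
            # stochastic gradient steps with L2 regularisation
            b_rater[r] += lr * (err - reg * b_rater[r])
            b_course[c] += lr * (err - reg * b_course[c])
            p_old = P[r].copy()
            P[r] += lr * (err * Q[c] - reg * P[r])
            Q[c] += lr * (err * p_old - reg * Q[c])
    return mu, b_rater, b_course, P, Q

# Hypothetical toy data: rater 0 scores everything low, rater 2 only saw course 1.
toy = [(0, 0, 6.0), (0, 1, 6.5), (1, 0, 8.0), (1, 1, 8.5), (2, 1, 9.5)]
mu, b_r, b_c, P, Q = fit_ratings(toy, n_raters=3, n_courses=2)
print("bias-adjusted course order:", list(np.argsort(-b_c)))
```

With the per-rater terms, a single enthusiastic score from a rater with no other history gets partly absorbed by that rater's own bias term and shrunk by the regularisation, rather than moving a course's number on its own, which is exactly the kind of correction a raw average cannot make.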

I think we can assume that nothing like this is happening or ever would happen, though. Given the mutually beneficial relationship between tastemakers and the institutions, there is little reason for anyone involved to take this stuff too seriously.
« Last Edit: Today at 03:53:47 PM by Matt Schoolfield »

Sean_A

  • Total Karma: 2
Re: Golf Digest Best New
« Reply #67 on: Today at 04:06:19 PM »
The various ratings aren’t and cannot actually be naming the best courses. Instead, they (purport to) name the best courses according to that publication, as assessed by its raters and criteria. The problem with using 15 out of 2,000 raters is that it’s not even close to a representative sample of GD raters, so they can’t even plausibly claim the resulting ratings have meaning, even when defined solely as GD ratings.


If you have 15 raters on a panel and they all rate every course, the results would accurately reflect that publication’s ratings. With 15 out of 2,000, they do not.

A representative sample of what? A load of subjective criteria which can mean anything to each panellist?  I take the wider view on this issue. The math doesn’t mean much if anything given the circumstances. Why argue about the small beer when the fake math obfuscates the issue more than anything? It’s a system built on soggy cardboard.

Ciao
« Last Edit: Today at 04:10:39 PM by Sean_A »
New plays planned for 2025: Machrihanish Dunes, Dunaverty and Carradale

PCCraig

  • Total Karma: -13
Re: Golf Digest Best New
« Reply #68 on: Today at 04:51:57 PM »
It's a shame what has happened to the Golf Digest rankings. We can all squabble with the position of certain courses here & there on any list, but GD has gone off the rails.


When they turned the lists into a profit center, their panel turned into a bunch of guys more interested in free golf & access on a budget than the integrity of the overall list.


I think Derek is trying his best to educate, but setting thousands of uneducated raters out into the world and then expecting a great result is a big leap. When it comes to data, crap in = crap out. I'd rather have 50 really educated, well-traveled panelists than 3,000 randoms.


It probably won't happen, but the whole program needs a major overhaul.
H.P.S.

Sam Morrow

  • Total Karma: 0
Re: Golf Digest Best New
« Reply #69 on: Today at 05:13:59 PM »
I still wonder if there'd be this controversy if Tree Farm or Old Barnwell had won.


I think it's a shame this has devolved into another Rater bitch fest (the same one that's been going on for years) instead of being able to talk about and celebrate an awesome crop of new courses.

Steve Abt

  • Total Karma: 0
Re: Golf Digest Best New
« Reply #70 on: Today at 05:42:57 PM »
The various ratings aren’t and cannot actually be naming the best courses. Instead, they (purport to) name the best courses according to that publication, as assessed by its raters and criteria. The problem with using 15 out of 2,000 raters is that it’s not even close to a representative sample of GD raters, so they can’t even plausibly claim the resulting ratings have meaning, even when defined solely as GD ratings.


If you have 15 raters on a panel and they all rate every course, the results would accurately reflect that publication’s ratings. With 15 out of 2,000, they do not.

A representative sample of what? A load of subjective criteria which can mean anything to each panellist?  I take the wider view on this issue. The math doesn’t mean much if anything given the circumstances. Why argue about the small beer when the fake math obfuscates the issue more than anything? It’s a system built on soggy cardboard.

Ciao


Yeah, that’s totally fair. My point is that a group of 15 who rate every course can fairly be said to accurately reflect that panel. A not-quite random and unknown group of 15 of 2,000 doesn’t even fairly reflect its own panel.

I’m not saying either list is more “correct,” but I can read the Links Magazine list that polled a smaller number of architects and know that it reflects those architects’ opinions. With GD, I cannot read the list and determine whose opinions it reflects.

Mark_Fine

  • Total Karma: -16
Re: Golf Digest Best New
« Reply #71 on: Today at 05:57:35 PM »
Can someone here state what the overlap in courses is among GD, GM and GW on their respective Top 100/200 lists? I think you will find it is significant. And forget the order the courses are in, as there is no correct (or “better”) list; so many courses could be flip-flopped with ease.


If people whose opinions we respect, like Ran and Tom Doak, cared to list their 100 best courses, they would NOT be the same and the order would be quite different. They would also very much overlap with the three major lists. I know because I have seen both, at least what they thought years ago  ;D


Derek, I am surprised you continue to post, but I wish it were more your architecture-based thoughts vs. defending yourself. This site sadly won’t allow that  :-[

Matt Schoolfield

  • Total Karma: -29
Re: Golf Digest Best New
« Reply #72 on: Today at 06:08:25 PM »
I think it's a shame this has devolved into another Rater bitch fest (the same one that's been going on for years) instead of being able to talk about and celebrate an awesome crop of new courses.
You realize this is a non-trivial part of why there are rankings in the first place instead of just having profiles, right? There is no compelling reason why anyone should be ranking (or even rating) these courses in the first place if the point is to celebrate them. Even if we insist on maintaining the culture of elitism that assumes some single vector of goodness, there is no reason not to use a Michelin-star or Fried Egg-style egg system if what we care about is communicating excellence.

The complaining, and the complaining about the complaining, are all part of us directing attention toward the product, and their metrics for success are fully based on attention.
« Last Edit: Today at 06:18:57 PM by Matt Schoolfield »