The raters who play the most exclusionary courses are very probably not a random sample of the rater pool (much less of golf enthusiasts generally). Suppose 15 or 20 raters come play PV, and it ends up rated #1. We don't then expect PV to invite 20 new raters through the next year, and the year after, until all 100, 1,000, or 10,000 raters (depending on the org) have played, right? I'm pretty skeptical that that would happen, but I honestly don't know.
If we have a limited number of raters come through, we should suspect that those ratings will be sticky. Combine that with the fact that folks who didn't love the course probably won't go to the effort of getting back, whereas someone who doesn't like Pebble might still swing by if they're in Monterey, because why not? Now we've got a system that creates lopsided outcomes when access is restricted. Add in the fact that most rating orgs either "toss" or "investigate" statistical outliers, and you've got a system designed to create outcomes that look like groupthink.
Nothing nefarious need be happening here; a chilling effect on criticism can show up as much in selection bias as in folks genuinely holding their tongues to maintain good relationships. Folks who make the effort to return to an exclusive course probably already love it, and few will make that same effort just to shit on it.
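To put some toy numbers on that, here's a quick Python sketch of the mechanism. Every number in it is invented, and the sampling scheme and trimming threshold are my own assumptions for illustration, not anything any rating org actually does:

```python
# Toy simulation of the selection-bias + outlier-trimming argument above.
# All numbers are invented; this illustrates the mechanism, nothing more.
import random
import statistics

random.seed(1)

def trim_outliers(scores, z=2.0):
    """Mimic a rating org that 'tosses' statistical outliers."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [s for s in scores if abs(s - mu) <= z * sd]

# True underlying opinion of the full rater pool: mean 8.0, sd 1.0 (made up).
pool = [random.gauss(8.0, 1.0) for _ in range(10_000)]

# Open-access course: any 500 raters can swing by ("because why not?").
open_sample = random.sample(pool, 500)

# Exclusive course: only raters predisposed to love it make the effort to
# get back, so its sample comes mostly from the top of the opinion pool.
fans = sorted(pool)[-2_000:]                 # raters who already love it
exclusive_sample = random.sample(fans, 20)   # the 15-20 who get through the gates

for name, sample in [("open access", open_sample),
                     ("exclusive", exclusive_sample)]:
    trimmed = trim_outliers(sample)
    print(f"{name:>12}: mean {statistics.mean(trimmed):.2f}, "
          f"sd {statistics.stdev(trimmed):.2f}, n={len(trimmed)}")
```

The exclusive course's trimmed average lands well above the pool's true average, with almost no spread, which from the outside is indistinguishable from consensus.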
Matt:
I agree with everything you've said here. Indeed, many newer courses try to manage this and sort out which panelists they will invite to see their course to gain a more favorable rating. That was happening on the GOLF Magazine panel for a while, between my tenure and Ran's . . . one of the panelists was apparently being paid $$$$ by courses to help them manage the results, which they did by limiting access to a few panelists whose votes were reliably favorable. [No idea whether the other panelists were cut in on the $$$ or totally oblivious to the situation.]
That's why the GOLF DIGEST Best New results are under the microscope for some of us: because it would be a very easy process to manipulate, and many clients are willing to try.
And, no, Pine Valley doesn't make sure that every GOLF DIGEST panelist gets to go there. They are not worried at all about being supplanted as #1 anytime soon, and they handle it as all clubs should -- by insisting that all players are guests of a member, with no consideration for whether they are a rating panelist or not.
Quite different from the newish course I visited last year that wanted me to sign a non-disclosure agreement saying I wouldn't talk about it . . . but had allowed a certain number of GOLF DIGEST panelists to play so it could be considered for the Best New Courses award.
Tom,
Quite a lot of what you've written about the Golf Digest rankings and how the publication works is either misinformed or based on assumptions that might have been true 15 or 20 years ago but have long since stopped being so. Up until a few weeks ago you were still insisting that Resistance to Scoring was one of the categories panelists use, but that was changed about 8 years ago. Small difference, you may say, but facts are important.
Clubs up for Best New Course, or Renovation or Transformation, do not reach out and hand-pick panelists. That's absurd. In fact, for any panelist to play and review any of the nominees, they have to be approved by me. I assign each and every panelist who reviews the nominees. I screen them based on tenure (new panelists and those with thin resumes are not eligible) and on whether I think they're capable of providing an honest assessment. No club is gaming the system.
I'm aware of a handful of clubs around the country (not Best New candidates) that attempt to contact panelists and bring in people they think will deliver favorable evaluations. That is against our policies, and if it comes to my attention we discard the score and dismiss the panelist. Whether it happened to a greater degree in the old days, I can't say, but it's zero tolerance now.
Most of what you write about Golf Digest is based on some obviously deep-seated belief that we don't conduct our business credibly, or that we can be manipulated. The topic of groupthink vs. consensus is a very interesting one and I don't disagree with many of the views and observations that have been posted here. But let's keep that the topic and stop suggesting that we're being played by clubs. It's false, and the logic doesn't add up.
I find it informative that this all stems from a course that was on few people's radar winning Best New. Wouldn't that be an argument against groupthink? And why isn't the same microscope turned on how Interlachen, Medinah or Pinehurst 10 won their categories?
I perceive it as a conflict of interest for me to be posting here, so I'm keeping it to a minimum. If anyone has serious questions about how things work at Golf Digest, I'm happy to respond in DMs, on Twitter (@feedtheball), where I created a thread about The Covey, or by email, derek.duncan@wbd.com. If anyone has questions about The Covey, such as whether the streams there are real or created (the answer is both), you should listen to my talk with Chet Williams on my podcast, as well as my long discussions with the nearly 100 other architects and golf figures I've spoken with over the years.