
Tom_Doak

Re: Ratings: Where do the differences emerge?
« Reply #25 on: January 02, 2011, 10:20:10 PM »
Jim:

GOLF DIGEST's Best in State lists have always put any new course with fewer than the required number of votes behind all of the courses that were ranked in the top 100, no matter its actual score.  Knowing this, it used to be fairly easy to read between the lines as to which new courses were likely to crack the top 100 next time around.

However, if Friars Head is ranked behind a non-top-100 course on the state list, then either its score is not high enough to get it into the list (whether it has enough votes or not), OR, they screwed up the state list because Piping Rock was in the top 100 last time and they forgot it isn't anymore.

Matt_Ward

Re: Ratings: Where do the differences emerge?
« Reply #26 on: January 03, 2011, 05:29:32 PM »
The thing about the state ratings is that more emphasis should go to raters who actually live in the respective state(s). This way the one-time visitor from parts unknown won't be able to elevate or lower a particular state's list. For whatever reason -- the state ratings should be better than an aggregate national listing. In GD's case -- there seems to be an issue with not just NY but other states as well.

Tom_Doak

Re: Ratings: Where do the differences emerge?
« Reply #27 on: January 03, 2011, 05:39:37 PM »
Matt,

I don't agree with your reasoning above at all.  For starters, I don't agree because every time I have seen a list of the best courses in one state, it is miles off base.

More importantly, I don't think doing the ratings by state captures the big picture well.  Panelists at the national level understand that you can't put every Seth Raynor course into the top 100 ... but raters in each state think their one Raynor course is special.  Likewise, PGA West was special if you had never seen anything like it, but there already WAS something like it in Ponte Vedra.   And if you want to talk about group-think, remember that the pecking order of elite clubs in Chicago or New York is generally not something that can be questioned from the inside.
« Last Edit: January 03, 2011, 05:41:08 PM by Tom_Doak »

Matt_Ward

Re: Ratings: Where do the differences emerge?
« Reply #28 on: January 03, 2011, 09:43:34 PM »
Tom:

Help me out here -- did you read what I posted?

I stated the state ratings are flawed because they don't provide added weight to those raters who live in the immediate area. In plenty of cases you get spiked numbers from the occasional visitor.

That's one of the main reasons the state ratings are "miles off base."

I never linked state ratings to a national perspective. The national perspective can only come from people who are truly nationwide in their wherewithal to play all of the key candidates on a fairly steady and consistent basis. Such people would be able to discern the nature of a TPC Sawgrass from the likes of a PGA West or the Seth Raynor portfolio.

Tom, let me say plain and simple -- those who are competent to truly be "national raters" are likely less than 15% of the existing people -- and even that may be a stretch. The assumption from the magazines is that all raters are equal. They're not. That kind of thinking applies to "one man / one vote" in electoral politics, but it doesn't hold water for ratings of the kind we are discussing. GD has long erroneously believed that having more panelists provides more coverage. It doesn't.

I also don't see your point about the pecking order of courses in Chicago or New York not being open to question from the "inside." If anything, those on the "inside" have a better appreciation of the limitations and faults of the courses in question because of the sheer number of times they have played them and the many varied ways in which they have seen them played. They have long since abandoned the dream-like haze that so many outsiders often have of such courses.

Jim Nugent

Re: Ratings: Where do the differences emerge?
« Reply #29 on: January 04, 2011, 02:35:46 AM »
Jim Nugent,
There may be 16k courses in the US, but I don't know how the conclusions of the raters can be seen as drinking the same Koolaid, there just aren't that many exceptional places to choose from in the first place.
    

I agree with you that there are not that many exceptional places to choose from.  But how do we know that?  In a subjective area -- golf course ranking -- how is it that almost all of us agree that a few hundred courses stand head and shoulders above the rest?  That is what I meant about drinking the same Koolaid.

I'm having trouble expressing this, but it seems to me an epistemological question.  I guess it's the same way we know great art or music or literature. 

Btw, my main point is that while we focus on the differences, the similarities are far greater, especially near the top, among all the major ranking services.     

Sean_A

Re: Ratings: Where do the differences emerge?
« Reply #30 on: January 04, 2011, 04:41:19 AM »
Jim

For a long time I have had my doubts about an elite 250 courses or whatever.  I can accept that there is an elite, but it's a far smaller number, though I couldn't say exactly how many.  I surmise that once we depart from the truly great courses in the US, there are any number of challengers for the scraps.  Crucially, I have no desire to put my theory to the test, as I generally dislike traveling or even driving, plus I don't see the point of chasing down courses just to determine their worth.  Hype, exclusivity, history and reputation are at least as important as what is in the ground.

Ciao
« Last Edit: January 04, 2011, 04:50:40 AM by Sean Arble »

Jim_Kennedy

Re: Ratings: Where do the differences emerge?
« Reply #31 on: January 04, 2011, 04:26:38 PM »
Jim Nugent,
The question of what is or isn’t exceptional, or at least ‘the best’, is as old as the game itself and what qualifies as such has been debated over time by a great number of interested and, for the most part, knowledgeable sources. 
I think a small group of exceptional courses, with a few at the very pinnacle, is as it should be, and it follows that the closer to the top a course gets, the fewer dissimilarities it should have with its brethren, i.e., it should at least match their near-perfect presentation. If not, the whole process would be questionable at best.

Matt_Ward

Re: Ratings: Where do the differences emerge?
« Reply #32 on: January 04, 2011, 11:49:30 PM »
Jim:

There are likely about 12-15 courses that one can put near the apex of the elite of elites here in the USA.

You then have roughly 25-30 courses that are likely just a slight drop-off with others slightly more distant.

The real issue for raters is not in saying that Oakmont or Merion is great -- wow -- that's hardly going out on a limb -- but in the next 50 or so courses that round out a top 100. In my mind, GD spends too much energy on the sheer demanding courses, while an assessment from GW has plenty of the older-tier and quirky layouts that their esteemed panel believes are good stuff.

I see the pendulum somewhere in the middle -- places like Black Mesa and the Kingsley Club are great examples of courses that are rock solid in my mind but too often missed.

Too much of the ratings process moves like a herd of long, lumbering buffalo -- pure herd mentality.