David_Madison

Re: How do the
« Reply #25 on: May 23, 2010, 04:58:30 PM »
Patrick,

Golf Digest's categories:

Shot Values (counts 2x)
Resistance to Scoring
Design Variety
Memorability
Esthetics
Conditioning
Ambiance

The May 2009 issue has the most recent listing of the Top 100, along with how each course fared in the various categories. So now we can argue not just the overall rankings, but also the sub-rankings by category!
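As an editor's aside: "Shot Values (counts 2x)" implies a simple weighted total. Here is a minimal sketch of that arithmetic; the scores, the 1-10 scale, and the exact weighting are invented for illustration and are not Golf Digest's actual published formula.

```python
# Hypothetical sketch of a Golf Digest-style weighted total.
# Category scores (on an assumed 1-10 scale) are invented for illustration.
scores = {
    "Shot Values": 8.5,
    "Resistance to Scoring": 7.0,
    "Design Variety": 8.0,
    "Memorability": 7.5,
    "Esthetics": 8.0,
    "Conditioning": 7.0,
    "Ambiance": 7.5,
}

# "Shot Values (counts 2x)" means that one category is weighted double.
weights = {category: 2 if category == "Shot Values" else 1 for category in scores}

total = sum(scores[c] * weights[c] for c in scores)
print(total)  # 8.5*2 + 7.0 + 8.0 + 7.5 + 8.0 + 7.0 + 7.5 = 62.0
```

Because Shot Values is doubled, a one-point swing there moves the total twice as far as the same swing in any other category, which is why that single category draws so much argument.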

Michael Whitaker

Re: How do the
« Reply #26 on: May 23, 2010, 06:00:14 PM »
"Solving the paradox of proportionality is the heart of golf architecture."  - Tom Doak (11/20/05)

Tom Yost

Re: How do the
« Reply #27 on: May 23, 2010, 09:43:04 PM »
I seem to remember reading that one of the magazines (GolfWeek maybe?) recently eliminated "resistance to scoring" as a rating criterion?

Here are the GolfWeek rating guidelines:
http://www.golfweek.com/news/2009/oct/08/guidelines-used-golfweeks-course-raters/
« Last Edit: May 23, 2010, 09:51:41 PM by Tom Yost »

Patrick_Mucci

Re: How do the
« Reply #28 on: May 24, 2010, 11:38:43 AM »
Are there any of the 100 GOLF MAGAZINE raters on this site ?

Who rates for Golfweek ?

Who rates for Golf Digest ?

Jed Peters

Re: How do the
« Reply #29 on: May 24, 2010, 12:04:30 PM »
Are there any of the 100 GOLF MAGAZINE raters on this site ?

Yes, but you'd perhaps only hear from them in PM. They keep it quiet--I'm sure you, being a well-read man, have actually read the ratings issue of GOLF where it details this.

Who rates for Golfweek ?

Many have already commented.

Who rates for Golf Digest ?

Many have already commented.

An aside:

Why the (perceived) vitriol? Your commentary always is pointed to achieve an outcome. What is yours here?

Patrick_Mucci

Re: How do the
« Reply #30 on: May 24, 2010, 12:19:49 PM »
Are there any of the 100 GOLF MAGAZINE raters on this site ?

Yes, but you'd perhaps only hear from them in PM. They keep it quiet--I'm sure you, being a well-read man, have actually read the ratings issue of GOLF where it details this.

Who rates for Golfweek ?

Many have already commented.

I'm aware of those who have already commented.
I'm interested in knowing those who are raters who have yet to comment.


Who rates for Golf Digest ?

Many have already commented.

I'm aware of those who have already commented.
I'm interested in knowing those who are raters who have yet to comment.


An aside:

Why the (perceived) vitriol?

That perception is yours and yours alone.


Your commentary always is pointed to achieve an outcome.

Not always


What is yours here?

I haven't decided yet



Michael Whitaker

Re: How do the
« Reply #31 on: May 24, 2010, 02:54:01 PM »
Here is a list of the Golf Magazine panelists: http://www.golf.com/golf/courses_travel/article/0,28136,1919123,00.html

GolfWeek and Golf Digest do not publish a list of their panelists. Why do you want to call them out on this website?
"Solving the paradox of proportionality is the heart of golf architecture."  - Tom Doak (11/20/05)

Mac Plumart

Re: How do the
« Reply #32 on: May 24, 2010, 03:03:12 PM »
Maybe I am missing something due to my low IQ, but what is this vitriol or "calling out" that you guys are talking about? I get the sense that Patrick is looking to perform an analysis I've been doing for about two years: analyzing the rankings, looking for discrepancies, and investigating those discrepancies.

What I've found good about doing this exercise is uncovering the golf rating entity that most closely matches my ideals of a good golf course. Looking at the list I put up there...if you like Canyata, The Quarry at La Quinta, or Rich Harvest, perhaps Golf Digest lines up with your taste. Golfweek: Kingsley, Wannamoisett, Dunes. Golf Mag: Torrey Pines, Mauna Kea, etc.

Why isn't this a good thing to do?

Conversely, you can avoid the opinions of the entities you don't like.  No harm done...no disrespect...simply different tastes.

I find it especially useful when planning a trip and looking for some new/fun courses to play.  These lists have proven to be good resources for me.
Sportsman/Adventure loving golfer.

Terry Lavin

Re: How do the
« Reply #33 on: May 24, 2010, 03:15:12 PM »
There's no question that each magazine has its own built-in set of biases, if only because of the scoring method that each utilizes.  There is also an inevitable "group think" that tends to attach itself if these raters descend in a pack upon a golf course.  It's simply human nature.  Finally, there's no doubt that there are leaders at each organization who have an effect upon the way that the bulk of the raters view the merits and demerits of the work of certain architects.  I tend to favor the Golfweek ratings because they seem to do a better job of rewarding the fun factor with its "walk in the park" analysis and I tend to not favor the Golf Digest system because it seems to reward difficulty over fairness and/or fun.  At the end of the proverbial day, we're talking about distinctions without too much difference, with only a handful of courses that are relatively highly rated by one magazine and completely dissed by another.
Nobody ever went broke underestimating the intelligence of the American people.  H.L. Mencken

Andy Troeger

Re: How do the
« Reply #34 on: May 24, 2010, 03:33:53 PM »
Mac,
My only comment would be that the more courses I play, the less I find I agree with any one list. My opinions are my own and vary from any of the three major publications, and in some cases from any kind of consensus one might gather from this website, such as the GCA list we did a while back. It is important, especially as a panelist, to avoid rating a course based on what others have rated it (if GolfWeek likes it, then it must be good!) as opposed to developing your own opinion (not accusing you or anyone of that, just making the point that it happens). Admittedly, most of my thoughts fall in line with the lists as a whole, but I can't say I track with any one list more than the others. I can understand your logic, but please don't evaluate the Digest list solely on the placement of Rich Harvest!  ;)


What I've found good about doing this exercise is uncovering the golf rating entity that most closely matches my ideals of a good golf course. Looking at the list I put up there...if you like Canyata, The Quarry at La Quinta, or Rich Harvest, perhaps Golf Digest lines up with your taste. Golfweek: Kingsley, Wannamoisett, Dunes. Golf Mag: Torrey Pines, Mauna Kea, etc.

Why isn't this a good thing to do?


Mac Plumart

Re: How do the
« Reply #35 on: May 24, 2010, 04:00:50 PM »
Terry and Andy...

Agreed 100%.

Patrick_Mucci

Re: How do the
« Reply #36 on: May 24, 2010, 04:54:55 PM »
Here is a list of the Golf Magazine panelists: http://www.golf.com/golf/courses_travel/article/0,28136,1919123,00.html

GolfWeek and Golf Digest do not publish a list of their panelists.

Why do you want to call them out on this website?

Michael,

That's one of the most asinine conclusions anyone could make.

Please tell us: why would I want to call them out?

It's clear that you don't see the obvious and relevant issues associated with this thread.

But thanks for listing the Golf Magazine panelists.

ONE of my questions to the raters/panelists at all three magazines is the following.

IF you used the criteria offered by the other magazines, how different would your evaluations be ?

Another question would be: IF you consolidated the criteria, would it have a substantive impact on the/your ratings?

I have other questions but will wait for you to catch up on these two.

« Last Edit: May 24, 2010, 04:56:34 PM by Patrick_Mucci »

Mac Plumart

Re: How do the
« Reply #37 on: May 24, 2010, 05:18:48 PM »
Pat...if the ratings involve large numbers of raters and the criteria are switched (Golfweek uses Golf Digest's criteria and vice versa), I am thinking the ratings would be very similar, as the large number of votes would severely reduce individual bias.

If you consolidate the criteria, I think you might get a list that looks like the one I posted with the composite list of averaged and aggregated scores from Golf Mag, Golfweek, and Golf Digest. Also, these guys do a very similar thing, but for the entire world, not just US courses:  http://www.top100golfcourses.co.uk/htmlsite/topcourses.asp

Thoughts?
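As an editor's aside, Mac's composite-list idea can be sketched as averaging each course's position across the magazines' lists. The course names and rank numbers below are invented for illustration; the real composite he posted may have used a different aggregation.

```python
# Hypothetical sketch of consolidating several magazine rankings by
# averaging each course's position. Names and ranks are invented.
rankings = {
    "Golf Digest":   {"Course A": 1, "Course B": 3, "Course C": 2},
    "Golfweek":      {"Course A": 2, "Course B": 1, "Course C": 3},
    "Golf Magazine": {"Course A": 1, "Course B": 2, "Course C": 3},
}

courses = {c for lst in rankings.values() for c in lst}
average_rank = {
    c: sum(lst[c] for lst in rankings.values()) / len(rankings)
    for c in courses
}

# Sort ascending: the lowest average rank is the consensus leader.
composite = sorted(average_rank, key=average_rank.get)
print(composite)  # ['Course A', 'Course B', 'Course C']
```

One design note: averaging rank positions dampens any single magazine's bias, which is exactly why a course rated wildly differently by one publication (a Canyata or a Torrey Pines) stands out against the composite.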

Keith OHalloran

Re: How do the
« Reply #38 on: May 24, 2010, 05:41:13 PM »
What is the procedure for actually going to these courses to rate them, and in what time frame must you see them? If there are 3 magazines and one of them has 100 panelists, does that mean all those people see Pine Valley within the evaluation period? And how is this set up? Does the club always know when a panelist is there?

Andy Troeger

Re: How do the
« Reply #39 on: May 24, 2010, 05:46:36 PM »
Pat,
As to your first question, I think there are two types of panelists out there. One group rates based on specific categories and the total is somewhat of an accident after the fact. The other group rates to come up with a total score, and if they are required to use categories then those are simply a means to an end.  I suppose some use a combination of the two. For group A, if you change the categories you certainly could have a strong impact on the way courses are rated. For group B, since they are only interested in getting to their preconceived total, the categories mean very little and those folks would manipulate whatever criteria they are given to make sure that the totals are “correct.”

My goal as a Golf Digest panelist is to rate courses based on their criteria (per group A above). So if you changed those criteria, it would change my scores. For some courses, it might not make much of an impact, but for others it could be significant given that a point or two (on average) makes a huge difference in where a course is ranked.

I don’t really know how to answer your second question—other than conditioning the categories are totally different and really come from different viewpoints so if you used them all I think you would just have a mess of about 15 categories.

Andy Troeger

Re: How do the
« Reply #40 on: May 24, 2010, 06:02:06 PM »
What is the procedure for actually going to these courses to rate them, and in what time frame must you see them? If there are 3 magazines and one of them has 100 panelists, does that mean all those people see Pine Valley within the evaluation period? And how is this set up? Does the club always know when a panelist is there?

Keith,
All of the mags have their own criteria for a minimum number of raters that must see courses within their evaluation periods (which again can vary). Golf Digest ranks the 100 Greatest in the USA but also the best courses in each state, so there are probably well over 1,000 courses just in the USA that need evaluations so none of us see anything close to all of them. Panelists may contact the course to set up play, but are not required to do so. I have paid to play at public courses without contacting them and also played private clubs as the guest of a member without contacting the pro shop.

Keith OHalloran

Re: How do the
« Reply #41 on: May 24, 2010, 06:05:34 PM »
Thanks Andy. Is there a minimum number of reviews that need to be submitted for a course to be considered for the list?

Andy Troeger

Re: How do the
« Reply #42 on: May 24, 2010, 06:44:15 PM »
Thanks Andy. Is there a minimum number of reviews that need to be submitted for a course to be considered for the list?

Yes, although those vary depending on publication as well--the publications have tried to raise them in recent years to increase statistical validity.

David_Madison

Re: How do the
« Reply #43 on: May 24, 2010, 06:59:52 PM »
Pat,

I agree with Andy in that there are likely two approaches to our ratings: one that follows the criteria, and the other that starts from the end result and works back. I tend to use the first and then the second as a check. I start by scoring each of the criteria and totaling up the scores. Then I do some checking to see if the total score makes sense. I'll look back through my personal database of past ratings to find other courses that I think are similar stylistically, or identify courses with scores similar to what I came up with for the course I'm rating. Then I evaluate the subject course against the comparables, determine whether it should be rated higher or lower, and make the appropriate adjustments so that my rating of the new course makes sense within my overall ratings universe.

My guess is that if we worked with GW's criteria, I might come up with slightly different results. It seems that as a GW panelist I could more heavily weight some of the factors I identified earlier as somewhat lacking in GD's system. On the other hand, I do make an effort to incorporate the elements of fun, quirk, and so on into criteria such as "shot values," "memorability," and "ambiance." While I've never had to evaluate GW's "walk in the park" criterion, I suspect that I'm covering it within GD's "ambiance" category.

Michael Whitaker

Re: How do the
« Reply #44 on: May 24, 2010, 07:12:16 PM »
Pat - I wasn't making a conclusion... I was simply curious as to why you were asking the GolfWeek and Golf Digest panelists to identify themselves without posing any specific query. I'm not trying to be confrontational in any way.

Glad I could be a little help with the Golf Magazine list.

Have a nice thread.
"Solving the paradox of proportionality is the heart of golf architecture."  - Tom Doak (11/20/05)

Patrick_Mucci

Re: How do the
« Reply #45 on: May 24, 2010, 08:15:38 PM »
Michael,

If you weren't trying to be confrontational, why would you accuse me of wanting to "call them (panelists) out on this web site"?

Thanks again for the list of Golf Magazine raters/panelists.

Andy, David, et al.,

You may recall Ty Webb's response to Judge Smails when Judge Smails asked Ty how he compared himself to other golfers, since he didn't keep score.

Ty's reply was, "by their height"

Hopefully the analogy won't be lost.

If the rating criteria differs, is it not possible that the conclusions could differ ?

OR, are "THOSE" golf courses essentially "FIXED" in terms of their respective positions, irrespective of the rating system employed ?

A tangential question is:

How is it that a course, let's call it course XXX, can be ranked in the top 20, 30 or 40, and yet, decades later not appear on the list ?

Before you answer, consider, in theory, that courses ranked higher than course XXX are still in the top 100.

So, something had to change that was unique to that course.

Take for example, Pine Tree.
Once ranked # 27 by "Golf Digest"
How could it fall off the top 100 list while many courses ranked higher than #27 remained on it?

What was it about PT that caused it to vanish from the list while courses ranked higher remained ?

Andy Troeger

Re: How do the
« Reply #46 on: May 24, 2010, 08:30:26 PM »
Patrick,
Changes in the list over time are interesting, but I don't think you'll find a simple answer. Courses are dynamic things that constantly change and over time ratings can change too. Classic courses like Pine Tree, Bellerive, Colonial, Point O'Woods, and others have fallen out, but other classics like Crystal Downs and Fishers Island are somehow "discovered" and added to the list years after they were built. Just as styles of architecture change with time, I think some of that can be attributed to golfers' tastes in their courses over time. Some of it might be due to changes in the courses themselves due to renovations, restorations, or neglect.

I believe that if you change the methodology then you do change the results. The varying methods can come to the same conclusion, but they don't necessarily have to.
« Last Edit: May 24, 2010, 08:34:20 PM by Andy Troeger »

Mac Plumart

Re: How do the
« Reply #47 on: May 24, 2010, 08:31:34 PM »
Pat...

I think there is no doubt that the conclusions differ, sometimes significantly, sometimes just a bit...reference some of the data posted on this thread. Kingsley, Canyata, and Torrey Pines are examples of significant differences, while Augusta National's range, from the #1 course in the US all the way down to 11th, is a minor difference.

On courses that have stood the test of time, while others have faded...I love the Tom Macwood "In my Opinion" piece...http://golfclubatlas.com/in-my-opinion/tom-macwood-the-worlds-finest-courses

I think it shows a very interesting list of courses and gives some good food for thought.

On Pine Tree, I can't answer...I've never played it, seen it, or studied it. But in general this could happen if technology makes some of the course's features irrelevant or less significant, if maintenance issues affect the course, or if better courses (Sand Hills, Pacific Dunes, Kiawah Ocean) come along and bump some older courses off the list. I am probably missing a few reasons why a course could fall off the list, but this was off the top of my head.

Patrick_Mucci

Re: How do the
« Reply #48 on: May 24, 2010, 08:49:22 PM »
Mac,

If better courses came along subsequently, they'd have to displace all of the courses ranked higher than PT before they displaced PT, no?

At 7,200 yards and still a stern test, I don't think PT has been obsoleted to the degree that many landlocked courses would be by modern equipment/technology. And PT still has the luxury of good winds. Likewise, I don't believe maintenance is an issue either.

As a "golf" club, PT "gets" fast and firm as does its Superintendent, Anthony Nysse, who's done a terrific job in achieving those conditions despite a poor winter.

I don't want to single out PT; rather, I'd prefer to focus on its relative position at #27 and how it was displaced while courses ranked higher than #27 weren't. That seems almost impossible in terms of a numerical exercise.

P.S.  Next time you're heading to Florida let me know and I'll arrange for you to play PT.
        Likewise an old DR course in NJ

Is that a flaw in the rating system, an anomaly, or changing tastes ?

David_Madison

Re: How do the
« Reply #49 on: May 24, 2010, 09:09:42 PM »
Pat,

Changing tastes have to be the culprit in the case of a Pine Tree that is otherwise as good as or better than ever.

If I remember correctly, the GD rankings first started out as a compilation of the toughest courses in America. Pine Tree surely belonged, because at 7,200 yards 30+ years ago it was a monster. Even when the rankings shifted away from a pure test of difficulty, I suspect that the difficulty bias still remained.

But now, aesthetic sensibilities have changed. Courses like The Ocean Course, Sebonack, and so on have so much more going on visually that panelists are naturally going to be drawn to them, while Pine Tree could be perceived as a bit of a yawn. I don't believe it is, but I can understand the perception and the resultant ratings drop.