
Mike_Cirba

Course Ratings Have Come a LONG way
« on: April 08, 2003, 10:59:06 AM »
No, they're hardly perfect, and yes, they have their downsides that we are all able to point out easily.

But what rarely gets acknowledged is that all of the course ratings have shown marked improvement over time, particularly the oldest, Golf Digest's.

I say this looking back at the results of their "Greatest 100" courses from, say, 1982.  I have the book at home, and perhaps by tomorrow I'll post the quite interesting results from that time period.

I can tell you that NGLA didn't even make the Top 100, but Innisbrook (Copperhead), Wilmington South, Hershey West, and New Seabury (Blue) did.

I know that within Golfweek, we're always trying to refine the process in the hope of getting the best possible results.  I sense that happens with the other publications, such as Golf Digest, as well.  

So, before we get outraged about this or that rating of any particular course (and I don't mean to play apologist for various criteria that are dubious at best), we should look at it for a minute from a historical perspective and recognize that things are getting better all the time.
 
Oops...except when they listed Pebble as #1....even progress doesn't always work in a linear fashion. ;)

THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #1 on: April 08, 2003, 11:00:52 AM »
Some sage counsel there without a doubt, Mike.  And yes, I LOVED that last line!   ;D

TH

JohnV

Re: Course Ratings Have Come a LONG way
« Reply #2 on: April 08, 2003, 11:08:08 AM »
Mike and Tom, I couldn't agree more.  I suppose some may take these lists more seriously, but in general we have a good mix of GW and GD raters around here who love to poke fun at each other and the way we do things just like many "competing" groups do.  A few weeks ago GW got its hits and now GD is getting its turn in the oven.  Of course, all those who felt that the GW list was bad were WRONG and all those who think that the GD list is right are just as WRONG!  ;)

THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #3 on: April 08, 2003, 11:15:11 AM »
Right on, bruthah John.  I did enjoy it immensely when the GW list was on the "hot seat"... it has been very fun taking the "heat" as well.

The fun thing is that all the raters I know seem to take it in this light... it's others who take it way too seriously... even the Emperor, I trust, is smiling at all this... I just don't know him well enough to say for sure.

TH

Jonathan Cummings

Re: Course Ratings Have Come a LONG way
« Reply #4 on: April 08, 2003, 01:04:26 PM »
As someone who assists in compiling rating numbers and considers himself a student of the golf course rating process, I am completely baffled with a few differences in the recent lists.

For example, could someone please explain to me how one panel (GW) can vote the Pete Dye Club as the 5th best modern course in the US while another panel (GD) effectively says PDC is not worthy of combined top 100 status!  

I have no love nor hatred for the PDC, but I am unsettled by how two esteemed panels, each armed with lots of statistics, could disagree so completely on this example.

JC

THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #5 on: April 08, 2003, 01:13:54 PM »
Jonathan:  different strokes for different folks... it's also very possible that only a small number of GD raters saw PDC, period, with a few of those giving it poor ratings for whatever reason... there are many possibilities, none of which would surprise me.

TH

Mike_Cirba

Re: Course Ratings Have Come a LONG way
« Reply #6 on: April 08, 2003, 01:18:35 PM »
Well, I do know that each publication has a minimum number of votes that a course has to get before it's considered.  

I wonder if it makes sense to remove high and low outliers, if possible.  
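
Something like this back-of-the-envelope sketch is what I have in mind -- the scores are invented, purely to show the idea of tossing the extremes before averaging:

```python
# Hypothetical panelist scores for one course (invented numbers).
scores = [9.0, 8.5, 8.5, 8.0, 8.0, 7.5, 7.5, 7.0, 3.0, 10.0]

def trimmed_mean(values, drop=1):
    """Average after discarding the `drop` highest and `drop` lowest scores."""
    trimmed = sorted(values)[drop:-drop]
    return sum(trimmed) / len(trimmed)

print(sum(scores) / len(scores))   # 7.70 -- dragged around by the 3.0 and the 10.0
print(trimmed_mean(scores))        # 8.00 -- the two extremes no longer count
```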

Also, between GD & Golfweek, there are clear differences in what each is looking for.  GD has always favored the long, tight, tough "test of golf", as evidenced by their "US Open bounce", while a fun, fair, varied, 6,500-yard course like Fenway (#50 on GW's Classic list) doesn't have a snowball's chance with GD (currently #24 in the state of NY!!).

Sheesh.

The classic example of the differences in rating methodology is Stanwich listed as the top course in Connecticut, over Yale.  In PA, it's Laurel Valley over Huntingdon Valley.  

THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #7 on: April 08, 2003, 01:29:53 PM »
Mike, I have been told the minimum # is 15 for GD, though I don't know that to be absolutely true.  Assuming it is, if only 15 raters see a course, it isn't tough to see how a few dissenters could skew the result mightily from how others see it.

This is all just conjecture, in any case.

TH

Carlyle Rood

New Stance
« Reply #8 on: April 08, 2003, 01:45:48 PM »
I was never genuinely passionate about golf course rankings; but, after all the whining these last few days, I'm convinced I'm now anti-ANTI-rankings.  Which is to say that I'm now passionately opposed to opposing rankings!  ;D

Mike Vegis @ Kiawah

Re: Course Ratings Have Come a LONG way
« Reply #9 on: April 08, 2003, 02:24:47 PM »
Originally, the Golf Digest rankings were for America's "toughest" courses.  I'm not sure when they melded into the "greatest."

Jonathan Cummings

Re: Course Ratings Have Come a LONG way
« Reply #10 on: April 08, 2003, 03:50:25 PM »
Huck - GD requires a minimum of 30 raters.  GW likes a minimum of 10 ratings, but for top courses it gets many more.  It is not different strokes, but a troublesome difference in measurements...

Here is a history of the Digest Ranking system I wrote a number of years ago...

JC

In 1962, a map-making firm asked Golf Digest to compile a list of top courses in America for a charting project the firm was preparing.  At first turned off, but later intrigued, by the concept of relative comparisons of the then-popular US courses, the editors of Digest, under the direction of Bill Davis, set about preparing a list.  The resultant list, called ‘America’s 200 Toughest Courses’, was based solely on course lengths and USGA rating values, and was first published in 1966.  The list was revised in 1967 and has since been published by the magazine every odd year.

The editors soon realized the enormity and potential complexity of evaluating and ranking a golf course.  It could not be an undertaking performed by a library search based on just a few parameters; such a basis would miss many of the qualities of ‘greatness’.  It was clear that the opinion of experienced players was needed.  So a panel was formed in 1968, composed of both professionals and top amateurs, to assist Golf Digest in determining The List.  This panel initially comprised about 100 regional selectors (mostly friends of Bill Davis) who provided course recommendations to a national selector board made up of 20 of the then (and some still!) Who’s Who of golf - Snead, Beman, Boatwright, Demaret, Wind, Nelson, to name a few.

Panelists evaluated courses for this initial list based mostly on fame, tradition, and whether or not a course had a history of hosting tournaments.  Architectural features were also assessed, but only subjectively.  This national panel, in concert with the Golf Digest editors, then determined the final rankings by listing the top 50 courses in groups of ten - first ten, second ten, etc. - and listing the ten courses within a group alphabetically.  The courses ranked 51-100 were lumped together in a single alphabetical list called ‘the second fifty’.  The first product of the newly formed panel was published in 1969.

By 1975, the panel had begun to put more value on design features, like esthetics and quality of play, and to slightly de-emphasize a course’s tournament history.  This precipitated the change in name from ‘America’s 200 Toughest Courses’ to ‘America’s 100 Greatest Tests of Golf’, and a corresponding reshuffling of The List.  Courses like Seminole and Pine Valley, both tremendous designs but lacking in tournament history, shot up to near the top of The List.  The panel further modified their evaluations by downplaying course length and favoring shorter designs that promoted shotmaking and finesse.  Golf as the thinking man’s game was being encouraged.

Much stayed the same until 1983, when several important changes impacting The List occurred.  The regional (state) panel, which before 1983 had fluctuated between 118 and 144 people, was increased to 208, an increase of almost 50%.  The national panel had crept up from 20 in 1975 to 23 in 1983.  Additional raters were required to sample and rate the new golf courses that were springing up in America at an ever-increasing rate.  More raters also ensured a statistically appropriate sampling ratio, much as exit polls during elections require a certain number of people interviewed (samples) to be deemed accurate and used to predict winners.  The second and more profound change was the introduction of a point system based on specific categories.  This system was used for ranking the top 20, and to ‘break ties’ in instances of extremely similar course ratings.  The seven categories were:

(1) Shot Value
(2) Resistance to Scoring
(3) Design Balance
(4) Memorability
(5) Esthetics
(6) Conditioning
(7) Tradition

These categories were not without controversy, mostly regarding course conditioning.  It was argued that a well-designed course retains its design features during transitory times of drought, frost, disease, and other natural events affecting living things.  Greenskeepers come and go, and while their job is unquestionably important - even critical - their talents should play only a small role in determining the ‘greatness’ of a course.  As one rater so aptly put it, “are we trying to find the most beautiful woman, or the best hairdresser?” (GD, Nov 1983 p64).  The board argued that conditioning, either good or poor, could alter a course design’s intent, potentially devaluing architectural strength, so conditioning was retained as a category.

Several more important changes came with the 1985 list.  First, a course was required to be at least three years old to be eligible for the top 100.  Second, raters were instructed to apply the seven categories to every course they rated and to generate a numeric value for each category.  Values of 1 to 10 (1 poor, 10 perfect) were assigned for all categories except shot value, for which the 1-to-10 value was doubled, and tradition, which ranged from 1 to 5.  A perfect course score was 75.  The national panel averaged the panelists’ results from the many courses rated and acted, along with the Digest editors, as arbitrators.  Third, a true 1-to-100 ranking was employed, based on composite average values; the alphabetical listing within groups was abandoned.  Now there was a best course in the US, which turned out to be Pine Valley, and it has never left the #1 position since 1985.  The 1985 list also saw Ron Whitten elevated from rater and contributing editor to, in effect, ‘CEO’ of The List.
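
(To make the arithmetic concrete: the weights below come from the description above, while the individual category scores are invented, so this is only a sketch of how a single panelist’s 75-point composite would be tallied.)

```python
# Sketch of the 1985 scoring arithmetic: shot value is doubled (max 20),
# five categories run 1-10, and tradition runs 1-5, for a perfect score of 75.
# The category weights come from the text; the scores themselves are invented.
panelist_scores = {
    "shot_value":            8,   # 1-10, doubled below
    "resistance_to_scoring": 7,   # 1-10
    "design_balance":        8,   # 1-10
    "memorability":          9,   # 1-10
    "esthetics":             8,   # 1-10
    "conditioning":          7,   # 1-10
    "tradition":             4,   # 1-5
}

def composite_1985(s):
    """Composite on the 75-point scale described above."""
    return (2 * s["shot_value"]
            + s["resistance_to_scoring"] + s["design_balance"]
            + s["memorability"] + s["esthetics"] + s["conditioning"]
            + s["tradition"])

print(composite_1985(panelist_scores))   # 59 out of a possible 75
```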

Whitten endeavored to broaden the pool of panelists to include women, minorities, and people who traveled and could sample a wide range of courses.  He still looked for lower-handicap amateurs with demonstrated experience in the world of golf.  Gone from the panel were the professionals and USGA types.  Architects were never allowed on Digest's panel, for it was felt that they would have too much of a vested interest and, more to the point, a forum to promote their own courses.

By 1987, the panel had grown to 287 regional raters, with 24 golf dignitaries on the national board.  Making The List now required a course to be at least three years old and to have been played by at least 10% of the panelists.

In 1989, Resistance to Scoring and Tradition were dropped from the regional panelists’ scoring system.  The course-ranking editorial panel (formerly the national board) felt that the USGA slope rating was a more detailed assessment of relative course difficulty, so up to four points were added to a course’s overall average, using an equation based on course slope.  Other changes included the devaluation of tradition from 5 points to 2; tradition points were now added by the panel, based on their assessment of the impact the candidate course had had on the history and lore of the game.  A ‘perfect’ score now became 66.

 
1991 saw a quick reversal of the 1989 decision to drop Resistance to Scoring as a rating category.  Digest realized the fallacy of basing course difficulty on slope for two reasons: (1) slope was a measure of the average (bogey) golfer’s experience, whereas raters derived their values from the tournament (scratch) tees, and (2) slope was a static figure, measured and re-rated by the USGA all too infrequently, while raters could gauge difficulty (resistance to scoring) as many times a year as the candidate course was visited.  The highest score a course could receive was now 72.  Ties were broken by prioritizing the highest shot value averages.  This was also the last year in which the magazine published a roster of the names and states of the individual panelists.

Controversy again struck The List in 1993 with the assignment of two ‘bonus points’ to courses that allowed walking (one point if walking was restricted to certain times).  A grass-roots movement was afoot in golf to get back to the ‘traditional’ round: links courses were in vogue, caddie programs were strengthened, and walking was encouraged.  Major golf magazines, including Golf Digest, championed the walking cause and likely put political pressure on the rating committee to reflect this policy in the course ratings.  Thus the added two points, which made the perfect score 74.  The regional group of panelists had swelled to 430.

This past year, 1995, saw further changes in The List.  In an effort to correct the trend of new, expensive, highly advertised courses replacing some of the quiet old established ones on The List, the tradition category was increased from 2 back to 10 points.  The Digest Rating Board administered these points based on historical research about a candidate course; the raters, who now numbered 535, did not rate tradition.  A course was now required to be at least five years old and to have been visited by at least 24 panelists (roughly 5%) before being considered for The List.  All rating values except conditioning are currently maintained for 10 years, or until a panelist revisits a course and revises his values; conditioning values are considered too transitory and are purged from the averaging every two years.  It is interesting to note that the resulting 1995 List contains just under half of the courses that appeared on the original 1966 List.
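
(As a quick sanity check on the ‘perfect score’ figures cited above -- 75, 66, 72, and 74 -- here is the arithmetic, using only the point values mentioned in the text:)

```python
# Reconstructing the maximum possible score at each stage, using only the
# point values cited in the history above.
max_1985 = 2 * 10 + 5 * 10 + 5        # doubled shot value, five 10-pt categories, tradition 1-5
max_1989 = max_1985 - 10 - 5 + 4 + 2  # drop resistance (10) and tradition (5); add slope bonus (4), 2-pt tradition
max_1991 = max_1989 + 10 - 4          # resistance to scoring restored, slope bonus dropped
max_1993 = max_1991 + 2               # two walking 'bonus points'

print(max_1985, max_1989, max_1991, max_1993)   # 75 66 72 74
```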


Matt_Ward

Re: Course Ratings Have Come a LONG way
« Reply #11 on: April 08, 2003, 04:58:17 PM »
Jonathan:

Congrats on the long history report!

Just have to offer this:

There is a mistaken belief that having MORE people on a panel will somehow (I guess by magic!) produce a more accurate and definitive assessment of the greatest courses in the USA. That is poppycock!

What you need are people who thoroughly understand what it is they are looking at and who don't fall for the false notion of chasing courses simply because of "name" designers or because a facility has served as the site for a particular event.

It's also important to point out that you have people "voting" who are truly regional and not national in scope -- many simply don't have the time or wherewithal to travel that extensively to keep up with what is happening. There are also "votes" cast by one person for one particular course while a completely different person casts numbers for another course. How does one know that the numbers are being cast with some sort of consistency? How do you provide for cross comparison when no such cross comparison exists? At some point you have to "trust" what the panel arrives at and simply "add" up the totals. With this strict reliance on numbers you get the Zagat's guide approach.

However, GD, given the storm clouds that formed over Shadow Creek cracking the top ten, decided to "fudge" the system with the inclusion of other non-course criteria (tradition & walking). In essence -- the panel has a role, but GD reserved the right to add "bonus" points to quell any thought that other "new" facilities could crash the party.

The issue boils down to the quality of your panel and whether or not the people you have are capable of making the kind of sweeping national cross comparisons necessary for any review. There is no perfect way to do this (subjectivity obviously still plays a role); however, throwing in non-course categories to bring some sort of "harmony" to the final results only shows a desire to manipulate the final product that emerges.

Clearly, it's possible to do this in-house if a publication desired to go in that direction and I believe Dan Kelly even suggested such a thing a short time ago on a different but related thread.

Joel_Stewart

Re: Course Ratings Have Come a LONG way
« Reply #12 on: April 08, 2003, 07:11:13 PM »
Jonathan:
That's a very interesting history report -- did it come from GD?

Matt:
I agree with most of what you said, especially about training or educating panelists.  I have very mixed emotions when I discuss a course with another panelist and we have completely opposite reactions to it.  Naturally I feel they are idiots, but then I wonder if GD is seeking different reports from golfers of all different abilities.

As I expressed on another thread, many, many courses are not receiving enough panelist play.  Take Alpine as an example; I wonder how many panelists have played it in the last two years.  We get between 50 and 100 per year at Olympic, which is more than we need; the magazine would be better served if half of those people played other courses in the area.  I think it's going to be a wise move for GD to require panelists to play assigned older courses, which should give better coverage to all the courses that want it in the state rankings.

Mike_Cirba

Re: Course Ratings Have Come a LONG way
« Reply #13 on: April 08, 2003, 08:06:56 PM »
OK, it's a little late to be typing this, but it's better than sedatives.

Here's the Golf Digest "Top 100 Courses", circa 1982.  At the time, the list was broken into the "First ten", "Second ten", etc., through 50, and then a "Second Fifty".

First Ten

Augusta
Cypress Point
Merion
Oakmont
Olympic (Lake)
Pebble Beach
Pinehurst 2
Pine Valley
Seminole
Winged Foot (West)

Not many surprises... yet

Second Ten

Baltusrol Lower
Colonial
Harbour Town
Los Angeles
Medinah 3
Muirfield Village
Oakland Hills South
Riviera
Shinnecock Hills
Southern Hills

Starting to get a little iffier...

Third Ten

Cascades Upper
Champions Cypress Creek
Firestone South
Oak Hill East
Pine Tree
Point O'Woods
Prairie Dunes
Quaker Ridge
San Francisco
The Golf Club

Ummm...

Fourth Ten

Bay Hill
Butler National
Canterbury
Cherry Hills
Inverness
Jupiter Hills
Laurel Valley
Peachtree
Saucon Valley Grace
Scioto

It gets worse...much worse

Fifth Ten

Baltimore Five Farms
Bob O'Link
Cog Hill 4
Concord Monster
CC of North Carolina
Crooked Stick
Doral Blue
Lancaster
Oak Tree
Shoal Creek

Ok...here's the "Second Fifty"....pretty wild bunch!

Aronimink
Atlanta CC Highlands
Atlanta CC
Bellerive
Boyne Highlands
Cedar Ridge
Champions Jackrabbit
Chicago
Coldstream
Congressional
CC of Birmingham West
CC of Detroit
CC of New Seabury Blue
Desert Forest
Dunes
Eugene
Garden City
Goodyear Gold
Greenville Chanticleer
Greenlefe West
Hazeltine
Hershey West
Innisbrook Copperhead
Interlachen
JDM East
Kemper Lakes
Kittansett
La Costa
Long Cove
Maidstone
Mayacoo Lakes
Meadow Brook
Moselem Springs
NCR South
Old Warson
Olympia Fields North
Pauma Valley
Plainfield
Princeville
Sahalee
Salem
Sawgrass
Sea Island
Spyglass Hill
The Country Club
TPC Sawgrass
Wannamoisett
Wild Dunes
Wilmington South

Remember, folks, this was only 20 years ago.  And... it wasn't part of the "Most Difficult", but GD's first real attempt to list the "Best".

THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #14 on: April 09, 2003, 06:09:26 AM »
Jonathan:  many thanks for the history report, although I have to ask the same question Joel did:  from where did you get this information?  Where also did you get the definitive answer that the minimum is 30?

I ask only because I heard it was 15.  30 would make more sense... but then I'd have to guess a LOT of courses don't get visited by that many.

Remember I am just a peon rater, one of 800, but unfortunately one who actually likes the GD methodology and is paying a painful price for this preference here.  ;)  Sounds like you are way more tuned into this process than I am.  And if that is the case - congratulations!  The zillions of questions I have been getting about it both here and by email will now be forwarded to you.  ;)

TH

Jonathan Cummings

Re: Course Ratings Have Come a LONG way
« Reply #15 on: April 09, 2003, 10:01:19 AM »
I wrote the above history of the GD panel in the mid-to-late 90s after interviewing Topsy Sideroff, Ron Whitten and others.  I even interviewed Ron's wife!  Don Olmstead (now deceased) also provided a wealth of historical information.  I also wrote a history of the Golf Magazine (after interviewing TD), Golfweek (Brad and Kris Krebs) and Golf World (London publishers) rating panels.  I could post those too, providing GCA has collectively and beforehand drunk a tanker truck of coffee.

Huck - in the boring world of statistics there is a convention used by statisticians for small populations.  Without going into details (I’m not a statistician), averages, standard deviations, etc., computed from a sample of fewer than 30 numbers require that you employ a correction (“fudge factor”); with more than 30 samples you can avoid it.  I bet GD was aware of this and decided that 30 was as good a lower limit on visits for a rating as any, so they made it a requirement.
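
If anyone wants to see how big that fudge factor actually is, here's a rough sketch.  It assumes the correction I'm talking about is the usual Student's t adjustment for small samples, and it leans on scipy only to pull the critical values:

```python
# Rough sketch of the small-sample "fudge factor": the Student's t critical
# value that replaces the familiar 1.96 when estimating a mean from n samples.
# (Reading this as GD's motivation for the 30-visit minimum is my own guess.)
from scipy import stats

for n in (10, 15, 30, 100):
    t_crit = stats.t.ppf(0.975, df=n - 1)   # 95% two-sided critical value
    print(f"n = {n:3d}: t = {t_crit:.2f}  (vs. 1.96 for a 'large' sample)")

# n =  10: t = 2.26
# n =  15: t = 2.14
# n =  30: t = 2.05   <- by ~30 the correction is already small, hence the rule of thumb
# n = 100: t = 1.98
```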

JC

Matt_Ward

Re: Course Ratings Have Come a LONG way
« Reply #16 on: April 09, 2003, 10:01:48 AM »
Joel S:

For what it's worth, the bulk of many panels is simply there to play golf at the big-name courses, IMHO. If you check the actual reports many panelists make, very few actually travel to the places that need some sort of review -- hence the desire to include more people, under the mistaken belief that this will provide more coverage and more pertinent analysis.

Too many panelists are busy hobnobbing at the "name" clubs so that they can add those particular clubs to their trophy collections.

Joel, you're absolutely right -- how many reviews are really needed at Olympic? I mean, if something REALLY comes up (conditioning issues, new or removed bunkering, etc.) then you need to go back and re-assess.

The issue is getting panelists to include on their ballots the names of courses many don't see -- i.e. The Kingsley Club, to name just one glaring example! GD, to its credit, did try to "assign" courses for certain panelists to play so as to spread the numbers around and keep "homers" from dominating the process. I agree with that approach in concept, but the minimum number of courses that panelists must play should have been increased over the years. If someone complains it's a hardship, then drop them and get someone else. I don't think finding someone qualified should prove to be that difficult.

When I was a GD rater I routinely did 30-40 new courses per year for over 15 years and I made it a point as a media member to schedule plenty of courses that were "off the beaten track" and in states that often get little attention. I'm not suggesting that everyone needs to follow such a rigorous pattern, but ultimately when any publication suggests to its readership that "such and such" courses are the "best" then the screening process needs to be tightened.

I've never believed that adding panelists for the sake of adding people accomplishes anything. I like that GW provides some sort of orientation and re-education for its panelists, although I hope those types of gatherings don't instill concrete mindsets about what is indeed worthy of attention -- I don't think that will happen to a number of the GW raters I know. I also think a publication is well served if it rotates people off and moves new people on, to avoid long-term preferences / biases.

One last thing -- I also think there should be a split panel: one group that reviews on a regional basis and a clearly limited number of people who are truly "national" in scope and perspective. When you've done well at the regional level, you can be considered for a spot at the national level.

Mike C:

The comparisons over time do reflect a mindset change that has happened with the GD ratings. What's really amazing is that years ago Cypress Point was not in the top 50 and Crystal Downs wasn't even rated. Think about that!

redanman:

I hear you loud and clear about Cherry Hills and Saucon. I still chuckle that these old bears are still revered as much as they are.


THuckaby2

Re: Course Ratings Have Come a LONG way
« Reply #17 on: April 09, 2003, 10:07:56 AM »
Fantastic, JC - thanks.

I wonder what Matt's going to say about the statistical necessity of a fudge factor for 30 or fewer responses? I shall sit back and listen to the back-pedaling.   ;D

Statistics are not my bag, obviously.  But it is correct, isn't it, that if 30 responses are given, 5-6 outrageously low ones skew the total average tremendously?  Thus 5-6 who didn't like a course drag the total down?
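
Roughly what I mean, with made-up numbers on a 10-point scale:

```python
# Made-up illustration: 30 ratings, with 6 low dissenters among them.
agree   = [9.0] * 24          # 24 raters who love the course
dissent = [4.0] * 6           # 6 who emphatically do not
all_30  = agree + dissent

print(sum(agree) / len(agree))     # 9.0 if everyone agreed
print(sum(all_30) / len(all_30))   # 8.0 -- a full point lower, on just 6 votes out of 30
```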

This seems to me to be a very logical reason some courses don't come out rated as highly as people think they should.  Damn, trying to get 30 people to agree on Cypress Point has to be difficult - I know at least one who pooh-poohed it - not naming any contrarian names, mind you...  ;)

TH
