Matt_Ward

The Failure in State Ratings
« on: April 13, 2007, 08:56:07 AM »
There have been a few threads / posts on the topic of state ratings. Generally, most of the "top" courses are selected, but there has been a clear lack of due diligence in a number of states (e.g. PA, NY, AZ, NJ, CT, NV, MD, etc., etc., etc.).

My thinking on this is simple.

Too many panelists concentrate only on a select few layouts (the star-course gazers) and either ignore or fail to dig down a little to find what else is taking place. The listing of no fewer than half a dozen courses in the Jersey state ratings clearly demonstrates, to me at least, how limited the knowledge base of so many raters is.

Too often, panelists from outside a given area have little knowledge of recent happenings and continue to reinforce the same old courses, constantly positioning them as among the truly best.

In addition, it would be of immense help if the powers-that-be at the various magazines created a system in which those living within a given state are given more weight when assessing state courses than the infrequent outsiders who simply cherry-pick particular courses.
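(Purely for illustration, here is a rough sketch of how such a weighted tally might work -- the state_score function, the 2:1 weighting, and the sample scores below are all hypothetical, not anything any magazine actually uses:)

# Hypothetical sketch only: give in-state ballots extra weight when computing
# a course's score for the state list. The 2:1 weighting is an assumption.
def state_score(ballots, in_state_weight=2.0, out_of_state_weight=1.0):
    """ballots is a list of (score, is_in_state) pairs; returns a weighted average."""
    weighted_sum = 0.0
    total_weight = 0.0
    for score, is_in_state in ballots:
        w = in_state_weight if is_in_state else out_of_state_weight
        weighted_sum += score * w
        total_weight += w
    return weighted_sum / total_weight if total_weight else 0.0

# Example: three in-state raters score a course 8, 7, 8; one visitor scores it 5.
print(state_score([(8, True), (7, True), (8, True), (5, False)]))  # about 7.29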

Does any of this matter?

For those who think ratings are a joke, the simple answer is no.

For the rest who do monitor such listings, it would add a bit more credibility to what is clearly a subjective exercise.

There will never be 100% correct answers -- but when one loses a big chunk of credibility in so many locations, it pays to update the process so that the final results have more meaning and are not dismissed outright.

A.G._Crockett

  • Total Karma: -1
Re:The Failure in State Ratings
« Reply #1 on: April 13, 2007, 09:20:16 AM »
Matt,
Would it be correct to assume that neither Golfweek nor Golf Digest weights local ratings more heavily than the ratings of "out-of-staters"?  

Honestly, and with no offense to anyone on this board, the GD state rankings are such an incredible mess that I don't see why they bother, and your theory might be as good an explanation of why they are such a mess as any I've come across.  

The GW state ratings, on the other hand (at least for the limited number of states with which I have reasonable familiarity), are much, much more reasonable and seemingly thoughtful.  What accounts for the vast difference between the two?
"Golf...is usually played with the outward appearance of great dignity.  It is, nevertheless, a game of considerable passion, either of the explosive type, or that which burns inwardly and sears the soul."      Bobby Jones

Jim Franklin

  • Total Karma: 0
Re:The Failure in State Ratings
« Reply #2 on: April 13, 2007, 09:24:28 AM »
Matt -

You may be onto something with giving more weight to in-state panelists than to the ones that come in, see it once, and are gone. I know I try to play as many different places as I can, but when visiting a place infrequently, I want to hit the big names first, and some of the lesser-known gems go unseen.

I don't know the answer.
Mr Hurricane

John Kavanaugh

Re:The Failure in State Ratings
« Reply #3 on: April 13, 2007, 09:31:23 AM »
Golfweek only rates public courses by state.  You simply cannot allow raters to rate courses where they are members and then double the value.

Tom_Doak

  • Total Karma: 20
Re:The Failure in State Ratings
« Reply #4 on: April 13, 2007, 09:32:43 AM »
Matt:

I think you are right that in-state panelists who have played courses multiple times should have greater weight in the voting process.  But the magazines don't do that because the in-state results for NY might look different than the ranking of the ten NY courses within the top 100 list, which they are so busy pretending is the be-all and end-all of rankings.  (And you can't apply the same procedure nationally, because you would be giving Oregon people more weight on pulling up Pacific Dunes, vs. New Jersey people pulling up Plainfield.)

In the end, the reason the in-state rankings seem so skewed is that you are comparing a bunch of courses which are fairly similar (sixes on the Doak scale) where personal preference is the difference between sixth place and sixteenth.  They are all exceptionally close together and just a couple of people disliking a course sends it way down the list.

John Kirk

  • Total Karma: 4
Re:The Failure in State Ratings
« Reply #5 on: April 13, 2007, 09:40:40 AM »
I have a problem with your proposal.  I think people here tend to rate courses close to home higher.  I am basing my perception on studying GolfClubAtlas for four years.

So I think it's important for out-of-town raters to come in and give an unbiased assessment.

Otherwise, I think the state lists are different than what we desire for other reasons.

Kirk Gill

  • Total Karma: 0
Re:The Failure in State Ratings
« Reply #6 on: April 13, 2007, 09:53:16 AM »
Quote from John Kirk: "So I think it's important for out-of-town raters to come in and give an unbiased assessment."

Is there such a thing? I think that's why they want so many ratings for each course, to balance out the biases and try to obtain something like a consensus.
"After all, we're not communists."
                             -Don Barzini

Steve_ Shaffer

  • Total Karma: -1
Re:The Failure in State Ratings
« Reply #7 on: April 13, 2007, 09:58:23 AM »
I think there may be a bias in favor of local courses among raters of the same state. For example, in PA, just as there is a hometown bias in politics, there may be a hometown bias in course ratings. I've never played any course of note in the Pittsburgh area except Nevillewood. I'd like to play Olde Stonewall and Mystic Rock to see how they compare, in the public arena, to Glen Mills, for example. I think our local golf writer will be doing that soon, if the weather ever breaks.
"Some of us worship in churches, some in synagogues, some on golf courses ... "  Adlai Stevenson
Hyman Roth to Michael Corleone: "We're bigger than US Steel."
Ben Hogan “The most important shot in golf is the next one”

John Kirk

  • Total Karma: 4
Re:The Failure in State Ratings
« Reply #8 on: April 13, 2007, 10:06:31 AM »
Quote from John Kirk: "So I think it's important for out-of-town raters to come in and give an unbiased assessment."

Quote from Kirk Gill: "Is there such a thing? I think that's why they want so many ratings for each course, to balance out the biases and try to obtain something like a consensus."

Perhaps it would be more accurate to say the out-of-town guy comes in and makes an assessment free from any hometown bias.

I'd be interested to see how homers and others rate courses, to see if I'm right about this.

Andy Troeger

Re:The Failure in State Ratings
« Reply #9 on: April 13, 2007, 10:16:54 AM »
The lists are what they are. One would hope that all panelists put a good-faith effort into their ratings, and then the results are published. You can change the methodology all you want, and it will change the results some as well. You cannot scientifically rate golf courses; the best you can really do is generate good discussion (which the lists obviously do very well).

Each list is reflective of the panel it represents. I personally think it is better to allow the out-of-town group to add their take, because they at least provide a balance between the "hometown bias" and "national bias" arguments. The thing is, how many out-of-towners really came and rated the private clubs in NY? I'm sure there were some and they made some difference, but probably not enough to completely change the ordering of the lists.

The state rankings in the two states I've lived in (Indiana and New Mexico) are fairly representative. There are things I would change, by all means. I prefer Wolf Run to Sycamore Hills, think Otter Creek is too high, and think Rock Hollow is too low in Indiana. In New Mexico, I've yet to play Las Campanas, and UNM-Championship is too low and should be in the 5-7 range, but other than that the ordering of the ones I've seen is pretty close. However, I've put in my ratings on those courses, and fellow raters did not agree. So be it.

Ron Farris

  • Total Karma: 0
Re:The Failure in State Ratings
« Reply #10 on: April 13, 2007, 10:34:51 AM »
Matt,

The state ratings are what they are -- based upon the opinion of a few.  Here in SD we often see a great disparity in east vs. west flavor.  The west has quirky, hilly courses while the east has more parkland-style, flatter, more walkable courses.  It would seem that in most cases the raters favor the east by virtue of ethnocentricity.  

It is always interesting to see the ratings each year and try to imagine what the heck they were thinking.  

Kalen Braley

  • Total Karma: -9
Re:The Failure in State Ratings
« Reply #11 on: April 13, 2007, 11:04:46 AM »
While I'm not sure what the best system should be as far as assigning points, I think more weight should go to in-state raters ranking in-state courses... but only for the state list.

I'm not sure if others feel the same, but being able to play all of your local courses several times and see them in different conditions at different times of the year enables one to get a better feel for a course and learn its subtleties, as opposed to someone from out of town just stopping in for a weekend visit.


John Kavanaugh

Re:The Failure in State Ratings
« Reply #12 on: April 13, 2007, 11:11:52 AM »
People don't use rankings to decide which local courses to play.  A more accurate method would be to not allow raters to rank any course within 100 miles of their home or within 100 miles of any course where they are a member.  This would paint a more accurate picture for the traveling golfer.  It wouldn't change the top 50 courses all that much, but it would have a huge effect on the bottom 50, as it is almost always the local push that gets marginal courses onto the bottom of a list.
« Last Edit: April 13, 2007, 11:22:30 AM by John Kavanaugh »

TEPaul

Re:The Failure in State Ratings
« Reply #13 on: April 13, 2007, 11:19:22 AM »
Hey, Bill Vostinak, I don't pretend to understand the nuances and distinctions between Doak's 7s and 8s and 9s or even 10s, but it is sort of based on whether one is willing to drive something like a hundred miles to play a course, isn't it?

Well, if so, did Tom Doak take into consideration whether that hundred miles is connected by something like I-95 or not?

If so, perhaps the hundred miles a golfer would be willing to travel over little country roads should be considered a 15 on the Doak Scale.  ;)

Phil_the_Author

Re:The Failure in State Ratings
« Reply #14 on: April 13, 2007, 11:22:36 AM »
Maybe it is time for the different ratings agencies (how's that for a title) to consider creating a standard, different from the current one, that those who represent them must meet.

For example, how does a rater go about choosing a course to rate? How many are they required to visit, and does the magazine require them to visit certain courses in order for their ratings of the ones they visited by choice to be accepted?

For example: two years ago I was asked by the management of Bethpage State Park to see if I might be able to aid them in getting raters to play and rate the Red course. They mentioned that whenever a rater asked them for course access, they would mention the fine work recently done on the Red, and they would almost inevitably be told that the rater only had one day, wanted to play the Black, and would be unable to play the Red.

I contacted individual raters and others such as Brad, and mentioned that the Red deserved a good look for ratings purposes and that the Park wanted to do whatever it could to get raters to play the course. The response was minimal, despite my even arranging a day set aside for raters to play.

If raters refuse to play a good course that is on the same site as a great one, doesn't this reflect upon the veracity of the magazines' ratings?

I believe that since the magazines created this monster they call course ratings, they should be held accountable for making every effort to ensure that courses that want to be looked at are, and that individual raters should accept an ethic of playing any and all courses and rating them fairly, not viewing their rater's privilege as an all-access badge.
« Last Edit: April 13, 2007, 11:25:40 AM by Philip Young »

Steve_ Shaffer

  • Total Karma: -1
Re:The Failure in State Ratings
« Reply #15 on: April 13, 2007, 11:31:53 AM »
Phil,

That's called the "notch on the belt" method of rating courses. Only the big names get rated over and over again. I don't know how GD gets its raters to play Best New Affordable, for example. Do they just send out a list to their raters in the area and say to try to play this or that course? My understanding is they need 10 raters to play a course before it gets ranked.
"Some of us worship in churches, some in synagogues, some on golf courses ... "  Adlai Stevenson
Hyman Roth to Michael Corleone: "We're bigger than US Steel."
Ben Hogan “The most important shot in golf is the next one”

Tom_Doak

  • Total Karma: 20
Re:The Failure in State Ratings
« Reply #16 on: April 13, 2007, 11:55:15 AM »
Phil and Steve:

For a while I think GOLF DIGEST was actually assigning its raters to play some of the lesser lights and some of the new courses that might not get enough votes to be ranked in the Best New.  I think they insisted the raters play 3-5 "assigned" courses if they were going to keep using their card as a free pass to the famous ones; and I think that's one of the best things they've tried to do.

There are way too many people who set out to "play the Top 100" and travel thousands of miles to get all 100 belt notches without bothering to play the very good course next door to any of them.  That's entirely unproductive from the ratings standpoint -- the magazines already know those courses are pretty good, and an additional vote from some new "expert" who has played them all doesn't mean much.  They need votes on the challengers.

I was very surprised to see Lost Dunes suddenly make it onto the GOLF DIGEST list this year ... after looking at the numbers it appears possible that it might have had good enough votes all along, but just didn't have enough votes to qualify until seven years after it opened for play.  I'm sure there are a few other courses in the same position today.

Matt_Ward

Re:The Failure in State Ratings
« Reply #17 on: April 13, 2007, 12:12:41 PM »
Tom Doak:

I've had a bit of experience serving on different panels, and frankly it comes down to having people who have the wherewithal (time and $$) to do the research and playing that's involved.

I can tell you this -- there are marked differences, and it doesn't -- check that -- should not take a mega effort to realize certain courses are marked higher simply because of long-standing "celebrity status." My classic example on that front is Maidstone. Wonderful layout indeed, with the dunes holes, but it gets plenty of extra consideration because of the tony address it finds itself at.

There is no perfect system because people aren't perfect nor is the subject matter at hand 100% quantifiable.

I salute Digest for mandating, a number of years back, that certain panelists must play specific courses -- this was done to ensure that courses got the minimum tallies -- and now, if memory serves, Golfweek has been doing something akin to this as well.

This was also instituted because certain architects were quite clever about touting their latest works to raters they knew and making sure to have them stop by and play the course. The net effect? Pumping up the numbers.

No doubt the "homer" situation is the reverse of what I mentioned when looking at things at the state level. You will have certain "homers" who will protect their turf -- sometimes with inflated numbers in order to keep certain courses high within their state -- and that's the only way I can explain how Baltusrol / Lower, and to a lesser extent the Upper, maintains its high position in NJ.

Tom, the real deficiency also shows up when a top 100 is presented. In that situation you have a process tallying votes from people who play only locally or regionally for nearly all the golf they play in a given year. Their votes are then tabulated on an "equal" basis with those of raters who are truly "national" in scope.

When I was on the Digest panel I mentioned to them that, out of the umpteen panelists, they would surely be able to ID those who have the time and means to be much more than regional in scope.

There is a place for regional / state raters -- there is also a need to segregate those who are able to rate the larger forest (national).

My suggestion was reviewed and tossed aside.

Panels need some sort of two-dimensional capability. You can't give the one-time player the same voting weight as those who see and play the area / regional courses on a much more frequent basis.

On the flip side -- when a national assessment is done you need people fully capable of providing the essential CROSS COMPARISON analysis. Otherwise, you have numbers from one person who has only played courses on the West Coast and from another who has only played courses on the East Coast. You are then left trusting the allocation of specific course rating numbers from two different people.

What's even more inane is the idea that adding panelists somehow makes the process more consensus-oriented. Watering down results from limited people doesn't highlight greatness, IMHO.

Bill V:

The reason Joe Sixpack and his companion Mary Wineglass see "6s" as 8, 9 and 10 is that their CUMULATIVE SAMPLE SIZE is that small. When one has played only a limited number of courses, the likelihood is (unless that small sample includes the likes of Cypress, Shinnecock, Southern Hills, Riviera, et al.) that you will end up with that sort of mindset.

Phil:

I love the concept of "accountability," but the only real response people can take is not to purchase, or not to continue to subscribe to, such magazines.

Clearly, the magazines will defend themselves -- see the response Tarde gave to Trump's assertion that advertising in Digest was central to whether or not a facility gets selected.


John Shimp

  • Total Karma: 0
Re:The Failure in State Ratings
« Reply #18 on: April 13, 2007, 02:19:20 PM »
Matt,
What is the explanation for a state loaded with strong classic courses like Mass. getting only two top-100 courses? I agree with you that there are clearly states that hog the rankings and others that are lightly represented. Some, like NY, deserve their high share. Why, though, do all these new Fazios and Enghs in the heartland (e.g. Flint Nat'l, Sand Ridge, etc.) make the list while courses like Myopia Hunt, Essex County, etc. don't? I think several courses in Mass. are nearly as unique as Fishers Island but still don't make the cut.

John Kavanaugh

Re:The Failure in State Ratings
« Reply #19 on: April 13, 2007, 02:23:46 PM »
Quote from John Shimp: "Matt,
What is the explanation for a state loaded with strong classic courses like Mass. getting only two top-100 courses? I agree with you that there are clearly states that hog the rankings and others that are lightly represented. Some, like NY, deserve their high share. Why, though, do all these new Fazios and Enghs in the heartland (e.g. Flint Nat'l, Sand Ridge, etc.) make the list while courses like Myopia Hunt, Essex County, etc. don't? I think several courses in Mass. are nearly as unique as Fishers Island but still don't make the cut."

It is good for golf to have new courses crack the lists...What is good for golf is good for the magazines.
« Last Edit: April 13, 2007, 02:24:22 PM by John Kavanaugh »

Tom_Doak

  • Total Karma: 20
Re:The Failure in State Ratings
« Reply #20 on: April 13, 2007, 02:32:49 PM »
Matt:

I agree with nearly everything you said to me above -- except I happen to have a lot of respect for Maidstone, and I think it's great to have a couple of courses without massive maintenance budgets on the list.  

Perhaps you could argue that Myopia or Eastward Ho or Newport is more deserving, but I'd much rather see Maidstone on there than Atlantic or The Bridge or any of the other new-money architectural wonders.  For myself at least, the tony membership of Maidstone has zero to do with it.  It's partly about the setting, and partly about having their priorities in order as so few American courses do.

I played Prestwick a couple of weeks ago and I would put it in the same category.

Andy Troeger

Re:The Failure in State Ratings
« Reply #21 on: April 13, 2007, 02:41:33 PM »
Quote from John Shimp: "Matt,
What is the explanation for a state loaded with strong classic courses like Mass. getting only two top-100 courses? I agree with you that there are clearly states that hog the rankings and others that are lightly represented. Some, like NY, deserve their high share. Why, though, do all these new Fazios and Enghs in the heartland (e.g. Flint Nat'l, Sand Ridge, etc.) make the list while courses like Myopia Hunt, Essex County, etc. don't? I think several courses in Mass. are nearly as unique as Fishers Island but still don't make the cut."

John,
Your comment reminds me that sometimes I think the problem is that there are about 300 "Top 100" courses for 100 spots. GW makes it a little easier by including 200, but that only partially negates the issue. Someone from the midwest or southwest might reply "with all of these great courses being built in our area how are these old classic courses hanging on to the lists?" Both arguments would have validity...there's a lot of good stuff out there.

Craig Van Egmond

  • Total Karma: 0
Re:The Failure in State Ratings
« Reply #22 on: April 13, 2007, 04:17:19 PM »

I thought Golf Digest did a good job with the Oklahoma rankings. Of course, there are only 3 top courses and then the rest. After Southern Hills, Oak Tree Country Club and Karsten Creek the dropoff is significant; there are some solid courses in there, but nothing above a 6 on the Doak Scale.

Andy Troeger

Re:The Failure in State Ratings
« Reply #23 on: April 13, 2007, 04:27:35 PM »
Bill,
Same idea...except you took the time to word it significantly better than I did :)

Bet we couldn't all agree on the last few of those 30-50 though either ;)

Matt_Ward

Re:The Failure in State Ratings
« Reply #24 on: April 13, 2007, 04:47:16 PM »
Craig E:

Just curious -- do you see the courses that made the OK listing after the top three as being the clear success stories? Were there any glaring omissions?

John S:

I personally am a big fan of the Bay State. The private layouts you have mentioned are listed by Golfweek, but Digest simply ignores them, and with that comes a mega loss of credibility in the minds of many -- myself included.

John, I can say the same for a number of Jersey private layouts that were ignored by Digest (e.g. Essex County, Forsgate, to name just two).

However ...

Keep this in mind -- Tom Fazio and Jim Engh may not be the cup of tea for the narrowly focused, classics-preferring type of person who inhabits this site. I have to ask you -- have you played a representative sampling of the courses these gents have designed?

I can certainly say I have, and candidly some of their finest work is not even listed among the best of their efforts -- see TF's work at Glenwild in Park City, UT as just one example. On the Engh side I really enjoy Pradera in Parker, CO; even though it has a few bowl-shaped greens that are a bit redundant, there are plenty of solid holes too.

Too often people see things in the relatively narrow context of their own backyard. Quality golf design is not the home of just one area alone -- e.g., the I-95 corridor from DC through Boston. With that said, I have to say that too many times people who come to the Northeast only concentrate on the elites -- Winged Foot, Shinnecock, PV, Merion -- and fail to comprehend how solid the next tier of courses truly is.

What you may not realize is that the best public layouts now being designed in some of the less populated areas are very good indeed (see Greg Norman's Red Sky Ranch in Wolcott, CO as just one example), and a number of them can seriously contend for such an elite placement.

Tom D:

I don't disagree with you that Atlantic and The Bridge (which has never been rated among the top 100) are rated too high to start with. To be clear -- my issue with Maidstone doesn't have an ounce of argument with the maintenance element you outlined. I thoroughly enjoy the dunes holes, but the sheer totality and consistency needed for such a high rating is lacking, in my book. One other thing -- I do enjoy courses that aren't long or don't favor high slopes and CRs, and I would like to see others included -- Golfweek clearly does this better than Digest, and that's a big plus for those seeking a much broader recognition of such layouts.

Frankly, if you put any number of top tier layouts in other states in close proximity to Shinnecock, NGLA and Sebonack the resulting spillover could very well be just as positive.

John K:

I see no issue with new courses cracking the listings -- unfortunately there are people who think golf cannot be serious if it's located in the Mountain time zone, to name just one spot that comes quickly to mind. I will put Black Mesa, just outside of Santa Fe, against any number of other courses that have opened in the last 5-6 years.

Unfortunately, there are a number of people who pay more homage to where they live than to doing justice to their passion for quality golf -- no matter where it is located. I can fully understand how folks from the western areas may believe there is a deep-seated bias coming from the Northeast area of the USA. Again, plenty of these folks -- on both sides of the discussion -- sometimes fail to play more than just the very elite few layouts.